Welcome to the Virtual Reality & Immersive Visualization Group
at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services, and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific and technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of, or associated with, several institutes and facilities.

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 sqm visualization chamber, makes it possible to interactively explore virtual worlds and is open for use by any RWTH Aachen research group.

VARECo @ MuC 2018

Our colleagues Benjamin Weyers and Daniel Zielasko teamed up with Thiess Pfeiffer from Bielefeld University and Markus Funk from TU Darmstadt to organize a workshop in conjunction with the German GI conference Mensch und Computer (MuC), held in Dresden in September 2018. The one-day workshop, entitled “Workshop on VR and AR in Everyday Context” (VARECo), focuses on the challenges raised by the increasing prevalence of consumer VR hardware, e.g., long-term use of HMDs or simple context creation, as well as on potential solutions. The call for papers is out, with a submission deadline of June 6, 2018.

May 16, 2018

IEEE VR: Conference Committee and Presentations

The Virtual Reality and Immersive Visualization Group is contributing substantially to this year's IEEE Virtual Reality conference, starting on Sunday, March 18th, in Tübingen, Germany. Prof. Torsten W. Kuhlen is serving as General Chair, Dr. Benjamin Weyers as 3DUI Contest Chair and Tutorial Chair, and Dr. Tom Vierjahn as Workshop Chair. In addition, the group presents a total of six papers.

March 17, 2018

Andrea Bönsch on CASA2018 International Program Committee

Andrea Bönsch is serving on the International Program Committee for CASA2018, which will take place in Beijing, China. CASA is the world's oldest international conference on computer animation and social agents.

Jan. 1, 2018

Dominik Rausch receives doctoral degree from RWTH Aachen University

Today, our colleague Dominik Rausch successfully passed his Ph.D. defense and received a doctoral degree from RWTH Aachen University for his thesis on "Modal Sound Synthesis for Interactive Virtual Environments". Congratulations!

Dec. 14, 2017

ICT Young Researcher Award 2017 for Andrea Bönsch

By means of a yearly award, the profile area "Information and Communication Technology" at RWTH Aachen University honors young researchers who contribute significantly to ICT-related research and show the potential to further improve the international visibility of ICT research at RWTH Aachen University. Andrea Bönsch was selected by the Steering Committee of the profile area ICT as one of the recipients of the 2017 ICT Young Researcher Award. The award is endowed with 3,000 Euro, supporting her research career.

Nov. 16, 2017

Torsten Kuhlen gave an invited talk at the ART Days 2017 in Ottobrunn, Munich.

June 2, 2017

Recent Publications

Social VR: How Personal Space is Affected by Virtual Agents’ Emotions

Proceedings of the IEEE Virtual Reality Conference, 2018

Personal space (PS), the flexible protective zone maintained around oneself, is a key element of everyday social interactions. It affects, e.g., people's interpersonal distance and is thus largely involved when navigating through social environments. However, PS is regulated dynamically: its size depends on numerous social and personal characteristics, and its violation evokes different levels of discomfort and physiological arousal. Thus, gaining more insight into this phenomenon is important. We contribute to the investigation of PS by presenting the results of a controlled experiment in a CAVE, focusing on German males aged 18 to 30 years. The PS preferences of 27 participants were sampled while they were approached by either a single embodied, computer-controlled virtual agent (VA) or by a group of three VAs. In order to investigate the influence of a VA's emotions, we altered their facial expression between angry and happy. Our results indicate that both the emotion and the number of approaching VAs influence the PS: larger distances are kept from angry VAs than from happy ones, and single VAs are allowed to come closer than the group. Thus, our study is a foundation for social and behavioral studies investigating PS preferences.


You Spin my Head Right Round: Threshold of Limited Immersion for Rotation Gains in Redirected Walking

IEEE Transactions on Visualization and Computer Graphics

In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user’s virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation—or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user’s immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain is detectable. However, for gains less than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.
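The rotation gains studied in this paper scale the user's real head rotation when it is mapped to the virtual camera; the accumulated difference is what redirects the walking path. As a minimal sketch of the underlying mechanism (not the paper's implementation; the function name is our own), a gain can be applied per frame like this:

```python
def apply_rotation_gain(real_yaw_delta_deg: float, gain: float) -> float:
    """Scale a per-frame real head rotation by a redirection gain.

    With gain > 1.0 the virtual camera turns faster than the head,
    with gain < 1.0 slower; the accumulated mismatch steers the user's
    physical path inside the tracked area without them noticing.
    """
    return real_yaw_delta_deg * gain

# A user physically turning 90 degrees with a gain of 1.2 experiences
# a 108-degree virtual turn.
virtual_turn = apply_rotation_gain(90.0, 1.2)
```

A gain of exactly 1.0 leaves the mapping untouched; the paper's detection and immersion thresholds bound how far the gain may deviate from 1.0 in either direction.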


Interactive Exploration Assistance for Immersive Virtual Environments Based on Object Visibility and Viewpoint Quality

Proceedings of IEEE Virtual Reality Conference 2018

During free exploration of an unknown virtual scene, users often miss important parts, leading to incorrect or incomplete environment knowledge and a potential negative impact on performance in later tasks. This is addressed by wayfinding aids such as compasses, maps, or trails, and automated exploration schemes such as guided tours. However, these approaches either do not actually ensure exploration success or take away control from the user. Therefore, we present an interactive assistance interface to support exploration that guides users to interesting and unvisited parts of the scene upon request, supplementing their own, free exploration. It is based on an automated analysis of object visibility and viewpoint quality and is therefore applicable to a wide range of scenes without human supervision or manual input. In a user study, we found that the approach improves users' knowledge of the environment, leads to a more complete exploration of the scene, and is also subjectively helpful and easy to use.
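The core idea, suggesting viewpoints based on object visibility, can be sketched in a few lines. The following is a strong simplification of the paper's analysis (names and data layout are hypothetical): the assistant recommends the candidate viewpoint that would reveal the most objects the user has not yet seen.

```python
def best_next_viewpoint(visibility: dict[str, set[int]], seen: set[int]) -> str:
    """Return the candidate viewpoint revealing the most unseen objects.

    `visibility` maps each candidate viewpoint to the set of object ids
    visible from it (e.g., precomputed by a visibility analysis of the
    scene); `seen` holds the ids the user has already observed.
    """
    return max(visibility, key=lambda vp: len(visibility[vp] - seen))

# Two candidates: "atrium" shows objects {1, 2}, "gallery" shows {2, 3, 4}.
# With object 2 already seen, "gallery" still reveals two new objects.
suggestion = best_next_viewpoint({"atrium": {1, 2}, "gallery": {2, 3, 4}}, {2})
```

The paper additionally weights candidates by viewpoint quality rather than a raw object count, but the on-request, user-in-control character of the assistance is captured by this pattern: the system only proposes, the user decides when to follow.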
