Welcome to the Virtual Reality & Immersive Visualization Group
at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the Group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services, and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of, or associated with, several institutes and facilities.

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 m² visualization chamber, makes it possible to interactively explore virtual worlds and is open for use by any RWTH Aachen research group.


29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023)

Together with Dr. Daniel Zielasko from the University of Trier, our colleague Dr. Tim Weißker presented his paper entitled "Stay Vigilant: The Threat of a Replication Crisis in VR Locomotion Research" at the 29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023). Their work received the Best Paper Award. Congratulations!

Oct. 12, 2023

23rd ACM International Conference on Intelligent Virtual Agents (IVA23)

Jonathan Ehret presented his paper entitled "Who's next? Integrating Non-Verbal Turn-Taking Cues for Embodied Conversational Agents" at the 23rd ACM International Conference on Intelligent Virtual Agents. Furthermore, Andrea Bönsch presented two posters in the realm of virtual agents supporting scene exploration, either as conversing groups or as a method for constrained navigation.

Sept. 19, 2023

The SPP AUDICTIVE conference took place, and we contributed two project presentations to the program.


June 30, 2023

Christian Nowke receives doctoral degree from University of Trier

Today, our colleague Christian Nowke successfully passed his Ph.D. defense and received a doctoral degree from the University of Trier for his thesis on "Semantic-Aware Coordinated Multiple Views for the Interactive Analysis of Neural Activity Data". Congratulations!

May 22, 2023

Industry Meets aixCAVE

On Friday, May 5th, about twenty delegates from renowned companies across Germany visited us to experience the aixCAVE. This event triggered many intriguing thoughts and stimulating discussions between our researchers and guests.


May 5, 2023

Cover on the German GI Informatik Spektrum

The cover of the current issue of Informatik Spektrum of the Gesellschaft für Informatik e.V. (GI) presents results of a joint project between the EON Energy Research Center and our group. The use of air filters in classrooms to fight the ongoing COVID-19 pandemic has been and continues to be a much-discussed topic. The cover shows a visualization in our aixCAVE that enables an analysis of the temporal and spatial dynamics of aerosol concentration for each person in the respective room. Virtual reality is proving to be an effective tool for scientists here: it demonstrates the potential risk of aerosol dispersion in enclosed spaces with many people, which can be intuitively experienced even by laypersons.

Additional information on this project is provided in the IT Center Annual Report 2020/2021, page 58 f. (German only).

Dec. 16, 2022

Recent Publications

Wayfinding in Immersive Virtual Environments as Social Activity Supported by Virtual Agents

Frontiers in Virtual Reality, Section Virtual Reality and Human Behaviour

Effective navigation and interaction within immersive virtual environments rely on thorough scene exploration. Therefore, wayfinding is essential, assisting users in comprehending their surroundings, planning routes, and making informed decisions. Real-life observations show that wayfinding is not only a cognitive process but also a social activity profoundly influenced by the presence and behaviors of others. In virtual environments, these 'others' are virtual agents (VAs), defined as anthropomorphic computer-controlled characters, who enliven the environment and can serve as background characters or direct interaction partners. However, little research has been done to explore how to efficiently use VAs as social wayfinding support. In this paper, we aim to assess and contrast user experience, user comfort, and the acquisition of scene knowledge through a between-subjects study involving n = 60 participants across three distinct wayfinding conditions in one slightly populated urban environment: (i) unsupported wayfinding, (ii) strong social wayfinding using a virtual supporter who incorporates guiding and accompanying elements while directly impacting the participants' wayfinding decisions, and (iii) weak social wayfinding using flows of VAs that subtly influence the participants' wayfinding decisions through their locomotion behavior. Our work is the first to compare the impact of VAs' behavior in virtual reality on users' scene exploration, including spatial awareness, scene comprehension, and comfort. The results show the general utility of social wayfinding support, while underscoring the superiority of the strong type. Nevertheless, further exploration of weak social wayfinding as a promising technique is needed. Thus, our work contributes to the enhancement of VAs as advanced user interfaces, increasing user acceptance and usability.

IntenSelect+: Enhancing Score-Based Selection in Virtual Reality

2024 IEEE Transactions on Visualization and Computer Graphics

Object selection in virtual environments is one of the most common and recurring interaction tasks. Therefore, the technique used can critically influence a system's overall efficiency and usability. IntenSelect is a scoring-based selection-by-volume technique that was shown to offer improved selection performance over conventional raycasting in virtual reality. The benefits of this initial method, however, are most pronounced only for small spherical objects that converge to a point-like appearance; moreover, the technique is challenging to parameterize and has inherent limitations in terms of flexibility. We present an enhanced version of IntenSelect called IntenSelect+, designed to overcome multiple shortcomings of the original IntenSelect approach. In an empirical within-subjects user study with 42 participants, we compared IntenSelect+ to IntenSelect and conventional raycasting on various complex object configurations motivated by prior work. In addition to replicating the previously shown benefits of IntenSelect over raycasting, our results demonstrate significant advantages of IntenSelect+ over IntenSelect regarding selection performance, task load, and user experience. We, therefore, conclude that IntenSelect+ is a promising enhancement of the original approach that enables faster, more precise, and more comfortable object selection in immersive virtual environments.
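The core idea of such score-based selection can be sketched as follows. This is a minimal illustration under assumed parameters, not the published IntenSelect or IntenSelect+ formulation: the function names, the cone angle, and the "stickiness" decay constant are chosen for demonstration only. Each frame, objects inside a cone around the pointing ray gain score proportional to their angular proximity to the ray; scores decay over time, so the selection candidate remains stable under small hand jitter.

```python
def frame_score(angle_deg, cone_angle_deg=15.0):
    """Per-frame contribution: 1.0 at the ray centre, falling to 0.0 at the cone edge."""
    if angle_deg >= cone_angle_deg:
        return 0.0
    return 1.0 - angle_deg / cone_angle_deg

def update_scores(scores, angles, stickiness=0.9, cone_angle_deg=15.0):
    """Accumulate per-object scores over time with exponential decay ('stickiness').

    scores: dict mapping object id -> accumulated score (mutated in place)
    angles: dict mapping object id -> angle (degrees) between ray and object this frame
    Returns the id of the current selection candidate, or None if no scores exist.
    """
    for obj, angle in angles.items():
        prev = scores.get(obj, 0.0)
        scores[obj] = stickiness * prev + (1.0 - stickiness) * frame_score(angle, cone_angle_deg)
    return max(scores, key=scores.get) if scores else None
```

In a render loop, `update_scores` would be called once per frame with the current angular offsets; because old score decays rather than vanishing, an object briefly leaving the cone does not instantly lose its candidate status.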

Authentication in Immersive Virtual Environments through Gesture-Based Interaction with a Virtual Agent

2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

Authentication poses a significant challenge in VR applications, as conventional methods, such as text input for usernames and passwords, prove cumbersome and unnatural in immersive virtual environments. Alternatives such as password managers or two-factor authentication may necessitate users to disengage from the virtual experience by removing their headsets. Consequently, we present an innovative system that utilizes virtual agents (VAs) as interaction partners, enabling users to authenticate naturally through a set of ten gestures, such as high fives, fist bumps, or waving. By combining these gestures, users can create personalized authentications akin to PINs, potentially enhancing security without compromising the immersive experience. To gain first insights into the suitability of this authentication process, we conducted a formal expert review with five participants and compared our system to a virtual keypad authentication approach. While our results show that the effectiveness of a VA-mediated gesture-based authentication system is still limited, they motivate further research in this area.
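The PIN-like combination of gestures described above can be illustrated with a small sketch. The names, the gesture set subset, and the fixed demo salt are assumptions for demonstration, not the system's actual implementation: an ordered gesture sequence is hashed like a PIN, and authentication compares digests in constant time.

```python
import hashlib
import hmac

# Illustrative subset of the ten recognizable gestures
GESTURES = ["high_five", "fist_bump", "wave", "thumbs_up"]

def sequence_digest(gestures, salt=b"demo-salt"):
    """Hash an ordered gesture sequence, analogous to hashing a PIN.

    A real system would use a per-user random salt and a slow KDF.
    """
    payload = salt + "|".join(gestures).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def authenticate(attempt, stored_digest):
    """Compare the attempted sequence against the stored digest in constant time."""
    return hmac.compare_digest(sequence_digest(attempt), stored_digest)
```

The order of gestures matters, just as digit order matters in a PIN, so transposing two gestures yields a different digest and fails authentication.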
