
Welcome


Welcome to the Virtual Reality & Immersive Visualization Group at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services, and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of or associated with several institutes and facilities.

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 sqm visualization chamber that makes it possible to interactively explore virtual worlds, is open for use by any RWTH Aachen research group.

News

Open Software Developer Position

Join our team at the Virtual Reality and Immersive Visualization Group at RWTH Aachen University as a full-time Software Developer! If you are interested in using Unreal Engine 5 to power the latest virtual reality and visualization hardware for cutting-edge research, see the full position description for more information.

Oct. 22, 2024

24th ACM International Conference on Intelligent Virtual Agents (IVA'24)

Together with Willem-Paul Brinkmann from TU Delft, our colleague Dr. Andrea Bönsch presented her work on "German and Dutch Translations of the Artificial-Social-Agent Questionnaire Instrument for Evaluating Human-Agent Interactions" at IVA 2024.

Sept. 16, 2024

Andrea Bönsch receives doctoral degree from RWTH Aachen University

Today, our colleague Andrea Bönsch successfully passed her Ph.D. defense and received a doctoral degree from RWTH Aachen University for her thesis on "Social Wayfinding Strategies to Explore Immersive Virtual Environments". Congratulations!

June 26, 2024

Aachen Cathedral Demo: Exploring the Aachen Cathedral UNESCO World Heritage Site in Virtual Reality


June 20, 2024

Social VR Contributions at IEEE Virtual Reality 2024

Together with Anne-Hélène Olivier, Julien Pettré, and Katja Zibrek from INRIA Rennes, our colleague Andrea Bönsch organized and hosted the 8th Workshop on Virtual Humans and Crowds in Immersive Environments at IEEE Virtual Reality 2024. She also presented work of our research group, including a gesture authentication approach with interactive virtual agents, in collaboration with Daniel Rupp, and results of our investigation on audiovisual coherence, asking whether embodiment of background noise sources in populated environments is a necessity, in collaboration with Jonathan Ehret. Jonathan also presented our work titled "StudyFramework: Comfortably Setting up and Conducting Factorial-Design Studies Using the Unreal Engine" at the Open Access Tools (OAT) and Libraries for Virtual Reality Workshop.

March 17, 2024

29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023)

Together with Dr. Daniel Zielasko from the University of Trier, our colleague Dr. Tim Weißker presented his paper entitled "Stay Vigilant: The Threat of a Replication Crisis in VR Locomotion Research" at the 29th ACM Symposium on Virtual Reality Software and Technology (VRST 2023). Their work received the Best Paper Award. Congratulations!

Oct. 12, 2023

Recent Publications

Wayfinding in Immersive Virtual Environments as Social Activity Supported by Virtual Agents

Frontiers in Virtual Reality, Section Virtual Reality and Human Behaviour

Effective navigation and interaction within immersive virtual environments rely on thorough scene exploration. Therefore, wayfinding is essential, assisting users in comprehending their surroundings, planning routes, and making informed decisions. Based on real-life observations, wayfinding is not only a cognitive process but also a social activity profoundly influenced by the presence and behaviors of others. In virtual environments, these 'others' are virtual agents (VAs), defined as anthropomorphic computer-controlled characters, who enliven the environment and can serve as background characters or direct interaction partners. However, little research has been done to explore how to efficiently use VAs as social wayfinding support. In this paper, we aim to assess and contrast user experience, user comfort, and the acquisition of scene knowledge through a between-subjects study involving n = 60 participants across three distinct wayfinding conditions in one slightly populated urban environment: (i) unsupported wayfinding, (ii) strong social wayfinding using a virtual supporter who incorporates guiding and accompanying elements while directly impacting the participants' wayfinding decisions, and (iii) weak social wayfinding using flows of VAs that subtly influence the participants' wayfinding decisions by their locomotion behavior. Our work is the first to compare the impact of VAs' behavior in virtual reality on users' scene exploration, including spatial awareness, scene comprehension, and comfort. The results show the general utility of social wayfinding support, while underscoring the superiority of the strong type. Nevertheless, further exploration of weak social wayfinding as a promising technique is needed. Thus, our work contributes to the enhancement of VAs as advanced user interfaces, increasing user acceptance and usability.

Virtual Reality as a Tool for Monitoring Additive Manufacturing Processes via Digital Shadows

GI VR/AR Workshop 2024, Gesellschaft für Informatik e.V.

We present a data acquisition and visualization pipeline that allows experts to monitor additive manufacturing processes, in particular laser metal deposition with wire (LMD-w) processes, in immersive virtual reality. Our virtual environment consists of a digital shadow of the LMD-w production site enriched with additional measurement data shown on both static and handheld virtual displays. Users can explore the production site via enhanced teleportation capabilities that enable them to change their scale as well as their elevation above the ground plane. In an exploratory user study with 22 participants, we demonstrate that our system is generally suitable for the supervision of LMD-w processes while generating low task load and cybersickness. Therefore, it serves as a first promising step towards the successful application of virtual reality technology in the comparatively young field of additive manufacturing.

Semi-Automated Guided Teleportation through Immersive Virtual Environments

Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology

Immersive knowledge spaces like museums or cultural sites are often explored by traversing pre-defined paths that are curated to unfold a specific educational narrative. To support this type of guided exploration in VR, we present a semi-automated, handsfree path traversal technique based on teleportation that features a slow-paced interaction workflow targeted at fostering knowledge acquisition and maintaining spatial awareness. In an empirical user study with 34 participants, we evaluated two variations of our technique, differing in the presence or absence of intermediate teleportation points between the main points of interest along the route. While visiting additional intermediate points was objectively less efficient, our results indicate significant benefits of this approach regarding the user’s spatial awareness and perception of interface dependability. However, the user’s perception of flow, presence, attractiveness, perspicuity, and stimulation did not differ significantly. The overall positive reception of our approach encourages further research into semi-automated locomotion based on teleportation and provides initial insights into the design space of successful techniques in this domain.
