Welcome to the Virtual Reality and Immersive Visualization Group at RWTH Aachen University!
The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the Group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.
In a unique combination of research, teaching, services, and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.
In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.
To this end, we are members of / associated with the following institutes and facilities:
Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 sqm visualization chamber that makes it possible to interactively explore virtual worlds, is open for use by any RWTH Aachen research group.
News
• Oct. 22, 2024: Active Participation at the 2024 IEEE VIS Conference (VIS 2024). At this year's IEEE VIS Conference, several contributions from our visualization group were presented. Dr. Tim Gerrits chaired the 2024 SciVis Contest and presented two accepted papers: the short paper "DaVE - A Curated Database of Visualization Examples" by Jens Koenen, Marvin Petersen, Christoph Garth, and Dr. Tim Gerrits, as well as the contribution to the Workshop on Uncertainty Visualization, "Exploring Uncertainty Visualization for Degenerate Tensors in 3D Symmetric Second-Order Tensor Field Ensembles" by Tadea Schmitz and Dr. Tim Gerrits, which received the Best Paper Award. Congratulations!
• Oct. 22, 2024: Open Software Developer Position. Join our team at the Virtual Reality and Immersive Visualization Group at RWTH Aachen University as a full-time Software Developer! If you are interested in using Unreal Engine 5 to power the latest virtual reality and visualization hardware for cutting-edge research, visit the full position description for more information.
• Oct. 11, 2024: Honorable Mention. A Best Paper Honorable Mention Award at VRST 2024 was given to Sevinc Eroglu for her paper entitled "Choose Your Reference Frame Right: An Immersive Authoring Technique for Creating Reactive Behavior".
• Sept. 26, 2024: Tim Gerrits as Invited Keynote Speaker at the ParaView User Days in Lyon. ParaView, developed by Kitware, is one of the most widely used open-source visualization and analysis tools in research and industry. For the second edition of the ParaView User Days, Dr. Tim Gerrits was invited to share his insights on developing and providing visualization within the academic communities.
• Sept. 20, 2024: Invited Talk at Visual Computing for Biology and Medicine. This year's Eurographics Symposium on Visual Computing for Biology and Medicine (VCBM) in Magdeburg included a VCBM Fachgruppen Meeting with an invited presentation by Dr. Tim Gerrits on "Harnessing High Performance Infrastructure for Scientific Visualization of Medical Data".
• Sept. 16, 2024: 24th ACM International Conference on Intelligent Virtual Agents (IVA'24). Together with Willem-Paul Brinkman from TU Delft, our colleague Dr. Andrea Bönsch presented her work on "German and Dutch Translations of the Artificial-Social-Agent Questionnaire Instrument for Evaluating Human-Agent Interactions" at IVA 2024.
Recent Publications
Wayfinding in Immersive Virtual Environments as Social Activity Supported by Virtual Agents. Frontiers in Virtual Reality, Section Virtual Reality and Human Behaviour.
Effective navigation and interaction within immersive virtual environments rely on thorough scene exploration. Therefore, wayfinding is essential, assisting users in comprehending their surroundings, planning routes, and making informed decisions. As real-life observations show, wayfinding is not only a cognitive process but also a social activity profoundly influenced by the presence and behaviors of others. In virtual environments, these 'others' are virtual agents (VAs), defined as anthropomorphic computer-controlled characters, who enliven the environment and can serve as background characters or direct interaction partners. However, little research has been done to explore how to efficiently use VAs as social wayfinding support. In this paper, we aim to assess and contrast user experience, user comfort, and the acquisition of scene knowledge through a between-subjects study involving n = 60 participants across three distinct wayfinding conditions in one slightly populated urban environment: (i) unsupported wayfinding, (ii) strong social wayfinding using a virtual supporter who incorporates guiding and accompanying elements while directly impacting the participants' wayfinding decisions, and (iii) weak social wayfinding using flows of VAs that subtly influence the participants' wayfinding decisions by their locomotion behavior. Our work is the first to compare the impact of VAs' behavior in virtual reality on users' scene exploration, including spatial awareness, scene comprehension, and comfort. The results show the general utility of social wayfinding support, while underscoring the superiority of the strong type. Nevertheless, further exploration of weak social wayfinding as a promising technique is needed. Thus, our work contributes to the enhancement of VAs as advanced user interfaces, increasing user acceptance and usability.
Choose Your Reference Frame Right: An Immersive Authoring Technique for Creating Reactive Behavior. Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology.
Immersive authoring enables content creation for virtual environments without a break in immersion. To enable immersive authoring of reactive behavior for a broad audience, we present modulation mapping, a simplified visual programming technique. To evaluate the applicability of our technique, we investigate the role of reference frames in which the programming elements are positioned, as this can affect the user experience. Thus, we developed two interface layouts: "surround-referenced" and "object-referenced". The former positions the programming elements relative to the physical tracking space, and the latter relative to the virtual scene objects. We compared the layouts in an empirical user study (n = 34) and found the surround-referenced layout faster, lower in task load, less cluttered, easier to learn and use, and preferred by users. Qualitative feedback, however, revealed the object-referenced layout as more intuitive, engaging, and valuable for visual debugging. Based on the results, we propose initial design implications for immersive authoring of reactive behavior by visual programming. Overall, participants found modulation mapping to be an effective means of creating reactive behavior. Honorable Mention for Best Paper!
InsitUE - Enabling Hybrid In-situ Visualizations through Unreal Engine and Catalyst. Springer Lecture Notes (7th International Workshop on In Situ Visualization, ISC 2024).
In-situ, in-transit, and hybrid approaches have become well-established visualization methods over the last decades. Especially for large simulations, these paradigms enable visualization and additionally allow for early insights. While there has been a lot of research on combining these approaches with classical visualization software, only a few works have combined in-situ/in-transit approaches with modern game engines. In this paper, we present and demonstrate InsitUE, a Catalyst2-compatible hybrid workflow that enables interactive real-time visualization of simulation results using Unreal Engine.
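For readers unfamiliar with the interface InsitUE builds on, the sketch below shows in heavily simplified form how a simulation loop can hand its data to ParaView Catalyst 2 via Conduit nodes. It is a generic example modeled on Kitware's public Catalyst 2 API, not code from InsitUE itself; the script path, channel name, grid dimensions, and field are illustrative placeholders.

```cpp
// Minimal, generic Catalyst 2 adaptor sketch (not InsitUE code).
// Assumes the C++ wrappers shipped with Kitware's catalyst library.
#include <catalyst.hpp>
#include <vector>

// Initialize Catalyst with an analysis script (placeholder path).
void adaptor_initialize()
{
  conduit_cpp::Node node;
  node["catalyst/scripts/script0/filename"].set_string("pipeline.py"); // hypothetical script
  catalyst_initialize(conduit_cpp::c_node(&node));
}

// Called every simulation step: describe the current state and mesh, then execute.
void adaptor_execute(int cycle, double time, std::vector<double>& pressure)
{
  conduit_cpp::Node exec;
  exec["catalyst/state/timestep"].set(cycle);
  exec["catalyst/state/time"].set(time);

  // One channel named "grid" carrying a Conduit Mesh Blueprint description.
  exec["catalyst/channels/grid/type"].set_string("mesh");
  auto mesh = exec["catalyst/channels/grid/data"];
  mesh["coordsets/coords/type"].set_string("uniform");
  mesh["coordsets/coords/dims/i"].set(10);
  mesh["coordsets/coords/dims/j"].set(10);
  mesh["topologies/topo/type"].set_string("uniform");
  mesh["topologies/topo/coordset"].set_string("coords");

  // Hand the field over zero-copy so the consumer (e.g. a renderer on the
  // in-transit side) can read it without duplicating simulation memory.
  mesh["fields/pressure/association"].set_string("vertex");
  mesh["fields/pressure/topology"].set_string("topo");
  mesh["fields/pressure/values"].set_external(pressure.data(), pressure.size());

  catalyst_execute(conduit_cpp::c_node(&exec));
}

void adaptor_finalize()
{
  conduit_cpp::Node node;
  catalyst_finalize(conduit_cpp::c_node(&node));
}
```

The simulation would call adaptor_initialize once, adaptor_execute every time step, and adaptor_finalize at shutdown; what happens with the data downstream (classical ParaView pipelines, or a game-engine renderer as in the paper) is decided entirely by the loaded Catalyst implementation, not by this adaptor code.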