Welcome to the Virtual Reality & Immersive Visualization Group
at RWTH Aachen University!

The Virtual Reality and Immersive Visualization Group started in 1998 as a service team in the RWTH IT Center. Since 2015, we have been a research group (Lehr- und Forschungsgebiet) at i12 within the Computer Science Department. Moreover, the group is a member of the Visual Computing Institute and continues to be an integral part of the RWTH IT Center.

In a unique combination of research, teaching, services and infrastructure, we provide Virtual Reality technologies and the underlying methodology as a powerful tool for scientific-technological applications.

In terms of basic research, we develop advanced methods and algorithms for multimodal 3D user interfaces and explorative analyses in virtual environments. Furthermore, we focus on application-driven, interdisciplinary research in collaboration with RWTH Aachen institutes, Forschungszentrum Jülich, research institutions worldwide, and partners from business and industry, covering fields like simulation science, production technology, neuroscience, and medicine.

To this end, we are members of, or associated with, a number of institutes and facilities.

Our offices are located in the RWTH IT Center, where we operate one of the largest Virtual Reality labs worldwide. The aixCAVE, a 30 sqm visualization chamber, makes it possible to interactively explore virtual worlds and is open for use by any RWTH Aachen research group.

Jury Member at SciVis 2016

After serving as contest co-chair in 2014 and contest chair in 2015, Dr. Bernd Hentschel is a jury member of this year’s Scientific Visualization Contest 2016. Titled “Particular Ensembles”, the contest aims at visualizing the evolution of viscous fingers across time and multiple resolutions and at studying the variation of this evolution across the provided ensemble.
On Wednesday, October 26, 2016, the results of the contest will be presented at IEEE VIS 2016 in Baltimore, Maryland, USA.

Oct. 26, 2016

Best Poster Award

The Best Poster Award of the 6th IEEE Symposium on Large Data Analysis and Visualization 2016 (LDAV 2016) was given to Dr. Tom Vierjahn for his poster entitled “Correlating Sub-Phenomena in Performance Data in the Frequency Domain”.

Oct. 24, 2016

Concrete Research: Four Labs of RWTH Aachen Presented in the AN/AZ Magazine

On Saturday, October 8, 2016, the AN/AZ magazine gives an insight into four labs of RWTH Aachen University. One of these is our Virtual Reality lab.

Oct. 8, 2016

Aachen 2025

At the Aachen 2025 event, held on September 23–25, 2016, in Aachen, we were active in two of the eight Theme Parks, namely “Production” and “Working”.

Sept. 27, 2016

InnoTrans 2016: Experiencing VR-based Timetabling and Track Occupancy Between Aachen and Cologne

One of our service partners, VIA Consulting & Development GmbH, is represented at InnoTrans 2016 in Berlin (September 20–23), the leading international trade fair for transport technology, with a focus on five segments: Railway Technology, Railway Infrastructure, Public Transport, Interiors, and Tunnel Construction.

At their booth, VIA-Con presents, among other things, a short video about our joint investigations into the benefits of 3D visualizations of timetabling and track occupancy for railways. The video is also available on our YouTube channel.

Sept. 20, 2016

Benedikt Thelen completes apprenticeship and bachelor’s degree


Sept. 7, 2016

Recent Publications

Accurate and adaptive contact modeling for multi-rate multi-point haptic rendering of static and deformable environments

Computers & Graphics (Journal) (2016)

Common approaches for the haptic rendering of complex scenarios employ multi-rate simulation schemes. Here, the collision queries or the simulation of a complex deformable object are often performed asynchronously at a lower frequency, while some kind of intermediate contact representation is used to simulate interactions at the haptic rate. However, this can produce artifacts in the haptic rendering when the contact situation changes quickly and the intermediate representation is not able to reflect the changes due to the lower update rate. We address this problem utilizing a novel contact model. It facilitates the creation of contact representations that are accurate for a large range of motions and multiple simulation time-steps. We handle problematic geometrically convex contact regions using a local convex decomposition and special constraints for convex areas. We combine our accurate contact model with an implicit temporal integration scheme to create an intermediate mechanical contact representation, which reflects the dynamic behavior of the simulated objects. To maintain a haptic real-time simulation, the size of the region modeled by the contact representation is automatically adapted to the complexity of the geometry in contact. Moreover, we propose a new iterative solving scheme for the involved constrained dynamics problems. We increase the robustness of our method using techniques from trust-region-based optimization. Our approach can be combined with standard methods for the modeling of deformable objects or constraint-based approaches for the modeling of, for instance, friction or joints. We demonstrate its benefits with respect to the simulation accuracy and the quality of the rendered haptic forces in several scenarios with one or more haptic proxies.
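The multi-rate scheme described in the abstract can be illustrated with a minimal sketch: a slow loop rebuilds an intermediate contact representation, while a fast haptic loop renders forces against the cached representation. All names, rates, and the simple penalty-based force model below are illustrative assumptions, not the paper's actual contact model.

```python
# Minimal multi-rate haptic rendering sketch (illustrative only; the
# penalty force against a cached contact plane is an assumption, not
# the accurate contact model described in the paper).

HAPTIC_RATE = 1000                 # Hz: fast haptic force loop
SIM_RATE = 30                      # Hz: asynchronous collision/simulation update
STEPS_PER_UPDATE = HAPTIC_RATE // SIM_RATE

def update_contact_representation(proxy_pos):
    """Slow loop: rebuild an intermediate contact plane near the proxy
    (here simply a flat floor at y = 0)."""
    return {"normal": (0.0, 1.0, 0.0), "offset": 0.0}

def haptic_force(proxy_pos, contact, stiffness=800.0):
    """Fast loop: penalty force against the cached contact plane."""
    penetration = contact["offset"] - proxy_pos[1]
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)     # no contact, no force
    return tuple(stiffness * penetration * c for c in contact["normal"])

def simulate(trajectory):
    """Interleave low-rate contact updates with high-rate force rendering."""
    forces = []
    contact = update_contact_representation(trajectory[0])
    for i, pos in enumerate(trajectory):
        if i % STEPS_PER_UPDATE == 0:          # low-rate update tick
            contact = update_contact_representation(pos)
        forces.append(haptic_force(pos, contact))
    return forces

# A proxy above the floor feels no force; pressing 1 mm in yields 0.8 N upward.
forces = simulate([(0.0, 0.01, 0.0), (0.0, -0.001, 0.0)])
```

The artifact the paper addresses arises exactly because the cached representation (here, the plane) can be many haptic steps out of date when the contact situation changes quickly.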


Interactive 3D Force-Directed Edge Bundling

Computer Graphics Forum (Journal) (2016) (to be published)

Interactive analysis of 3D relational data is challenging. A common way of representing such data is node-link diagrams, as they support analysts in achieving a mental model of the data. However, naïve 3D depictions of complex graphs tend to be visually cluttered, even more than in a 2D layout. This makes graph exploration and data analysis less efficient. This problem can be addressed by edge bundling. We introduce a 3D cluster-based edge bundling algorithm that is inspired by the force-directed edge bundling (FDEB) algorithm [Holten2009] and fulfills the requirements to be embedded in an interactive framework for spatial data analysis. It is parallelized, and its runtime scales with the size of the graph. Furthermore, it maintains the edge’s model and thus supports rendering the graph in different structural styles. We demonstrate this with a graph originating from a simulation of the function of a macaque brain.
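The core idea of force-directed edge bundling, which the algorithm builds on, can be sketched in a few lines: edges are subdivided, and each subdivision point is pulled toward its neighbours on the same edge (spring force) and toward the corresponding point on a compatible edge (attraction). This two-edge sketch with hypothetical force constants is only an illustration of the FDEB principle, not the group's parallel 3D implementation.

```python
# Tiny force-directed edge bundling sketch (two parallel edges in 3D;
# constants k_spring and k_attract are illustrative assumptions).

def subdivide(p0, p1, n):
    """Place n interior subdivision points on the straight edge p0-p1."""
    return [tuple(a + (b - a) * (i + 1) / (n + 1) for a, b in zip(p0, p1))
            for i in range(n)]

def bundle_step(points_a, points_b, k_spring=0.1, k_attract=0.05):
    """One iteration: move each point of edge A toward its neighbours on A
    (spring force) and toward the matching point on edge B (attraction)."""
    new_points = []
    for i, p in enumerate(points_a):
        left = points_a[i - 1] if i > 0 else p
        right = points_a[i + 1] if i < len(points_a) - 1 else p
        q = points_b[i]
        new_points.append(tuple(
            c + k_spring * ((l - c) + (r - c)) + k_attract * (d - c)
            for c, l, r, d in zip(p, left, right, q)))
    return new_points

# Two parallel edges: their subdivision points pull toward each other,
# forming a bundle between y = 0 and y = 1.
a = subdivide((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 3)
b = subdivide((0.0, 1.0, 0.0), (1.0, 1.0, 0.0), 3)
for _ in range(10):
    a, b = bundle_step(a, b), bundle_step(b, a)
```

A production version additionally weights the attraction by an edge-compatibility measure and, as in the paper, restricts it to edges within the same cluster.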


Visual Quality Adjustment for Volume Rendering in a Head-Tracked Virtual Environment

IEEE Transactions on Visualization and Computer Graphics (Journal) (2016)

To avoid simulator sickness and improve presence in immersive virtual environments (IVEs), high frame rates and low latency are required. In contrast, volume rendering applications typically strive for high visual quality, which induces a high computational load and, thus, leads to low frame rates. To evaluate this trade-off in IVEs, we conducted a controlled user study with 53 participants. Search and count tasks were performed in a CAVE under varying volume rendering conditions, which were applied according to viewer-position updates from head tracking. The results of our study indicate that participants preferred the rendering condition with continuous adjustment of the visual quality over an instantaneous adjustment, which guaranteed low latency, and over no adjustment, which provided constant high visual quality but rather low frame rates. Within the continuous condition, the participants showed the best task performance and felt less disturbed by effects of the visualization during movements. Our findings provide a good basis for further evaluations of how to accelerate volume rendering in IVEs according to users’ preferences.
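The "continuous adjustment" condition preferred in the study can be sketched as a simple feedback controller that gradually trades rendering quality (e.g., a sampling-rate factor) against frame time to hold a target frame rate. The controller gain, the linear cost model, and all names below are illustrative assumptions, not the study's implementation.

```python
# Sketch of continuous quality adjustment for volume rendering:
# nudge a quality factor toward the level the frame-time budget allows.
# Gains, bounds, and the linear cost model are assumptions.

TARGET_FRAME_TIME = 1.0 / 60.0     # aim for 60 fps in the IVE

def adjust_quality(quality, frame_time, gain=0.5, q_min=0.1, q_max=1.0):
    """Continuously nudge quality toward the affordable level: positive
    error (frame faster than budget) raises quality, negative lowers it."""
    error = (TARGET_FRAME_TIME - frame_time) / TARGET_FRAME_TIME
    quality += gain * error * quality          # relative, gradual change
    return min(q_max, max(q_min, quality))

def render_cost(quality, full_quality_time=0.05):
    """Toy cost model: frame time grows linearly with the quality factor
    (full quality would take 50 ms per frame here)."""
    return full_quality_time * quality

# Start at full quality (too slow) and let the controller settle near the
# quality level that fits the ~16.7 ms frame budget.
q = 1.0
for _ in range(100):
    q = adjust_quality(q, render_cost(q))
```

An instantaneous variant would instead jump straight to the budget-fitting quality each frame; the study found the gradual version both better for task performance and less disturbing during movement.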
