
Profile



Martin Bellgardt, M. Sc.

Research Assistant in the Virtual Reality Team



Publications


Augmented Reality-Based Surgery on the Human Cadaver Using a New Generation of Optical Head-Mounted Displays: Development and Feasibility Study


Behrus Puladi, Mark Ooms, Martin Bellgardt, Mark Cesov, Myriam Lipprandt, Stefan Raith, Florian Peters, Stephan Christian Möhlhenrich, Andreas Prescher, Frank Hölzle, Torsten Wolfgang Kuhlen, Ali Modabber
JMIR Serious Games 2022

Background: Although nearly one-third of the world’s disease burden requires surgical care, only a small proportion of digital health applications are directly used in the surgical field. In the coming decades, the application of augmented reality (AR) with a new generation of optical see-through head-mounted displays (OST-HMDs) like the HoloLens (Microsoft Corp) has the potential to bring digital health into the surgical field. However, for the application to be performed on a living person, proof of performance must first be provided due to regulatory requirements. In this regard, cadaver studies could provide initial evidence.

Objective: The goal of the research was to develop an open-source system for AR-based surgery on human cadavers using freely available technologies.

Methods: We tested our system using an easy-to-understand scenario in which fractured zygomatic arches of the face had to be repositioned with visual and auditory feedback to the investigators using a HoloLens. Results were verified with postoperative imaging and assessed in a blinded fashion by 2 investigators. The developed system and scenario were qualitatively evaluated by consensus interview and individual questionnaires.

Results: The development and implementation of our system proved feasible and could be realized in the course of a cadaver study. The investigators found the AR system, with its combination of visual and auditory feedback, helpful for spatial perception. The surgical end point could be determined both metrically and by assessment.

Conclusions: The development and application of an AR-based surgical system using freely available technologies to perform OST-HMD–guided surgical procedures in cadavers is feasible. Cadaver studies are suitable for OST-HMD–guided interventions to measure a surgical end point and provide an initial data foundation for future clinical trials. The availability of free systems for researchers could be helpful for a possible translation process from digital health to AR-based surgery using OST-HMDs in the operating theater via cadaver studies.
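To make the feedback loop concrete, the following minimal Python sketch shows one plausible way such guidance could be derived; it is not the published system, and all names and thresholds (positional_error, feedback, the 2 mm tolerance, the pitch range) are illustrative assumptions. The deviation between the tracked fragment position and its planned target is mapped to a traffic-light colour and an audio pitch.

# Hypothetical guidance mapping -- illustrative only, not the published system.
import math

def positional_error(current_pos, target_pos):
    """Euclidean distance between the tracked and the planned position (in mm)."""
    return math.dist(current_pos, target_pos)

def feedback(error_mm, tolerance_mm=2.0, f_min=220.0, f_max=880.0, max_error_mm=20.0):
    """Map the error to a traffic-light colour and an audio frequency in Hz."""
    colour = "green" if error_mm <= tolerance_mm else "red"
    # Clamp and normalise the error; small error -> high pitch, large error -> low pitch.
    t = min(error_mm, max_error_mm) / max_error_mm
    return colour, f_max - t * (f_max - f_min)

if __name__ == "__main__":
    err = positional_error((10.0, 42.0, 3.0), (11.5, 41.0, 3.5))
    print(feedback(err))  # ('green', ...) for a deviation of about 1.9 mm

In the study itself, the surgical end point was additionally verified with postoperative imaging, so a mapping like this would only serve as intraoperative guidance.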

BibTeX:

@article{puladi2022augmented,
title={Augmented Reality-Based Surgery on the Human Cadaver Using a New Generation of Optical Head-Mounted Displays: Development and Feasibility Study},
author={Puladi, Behrus and Ooms, Mark and Bellgardt, Martin and Cesov, Mark and Lipprandt, Myriam and Raith, Stefan and Peters, Florian and M{\"o}hlhenrich, Stephan Christian and Prescher, Andreas and H{\"o}lzle, Frank and others},
journal={JMIR Serious Games},
volume={10},
number={2},
pages={e34781},
year={2022},
publisher={JMIR Publications Inc., Toronto, Canada}
}





Poster: Virtual Optical Bench: A VR Learning Tool For Optical Design


Sebastian Pape, Martin Bellgardt, David Gilbert, Georg König, Torsten Wolfgang Kuhlen
IEEE Conference on Virtual Reality and 3D User Interfaces 2021

The design of optical lens assemblies is a difficult process that requires a great deal of expertise. Today, this process is typically taught on physical optical benches, which are often too expensive for students to purchase. One way of circumventing these costs is to use software to simulate the optical bench. This work presents a virtual optical bench that leverages real-time ray tracing in combination with VR rendering to create a teaching tool that provides a repeatable, non-hazardous, and feature-rich learning environment. The resulting application was evaluated in an expert review with 6 optical engineers.
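As a flavour of the physics such a simulator has to evaluate per ray, here is a minimal sketch (not the paper's implementation; the function names are hypothetical) of refraction at a lens surface using the vector form of Snell's law.

# Vector-form Snell's law -- a minimal sketch, not the paper's implementation.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def refract(direction, normal, n1, n2):
    """Refract a unit ray 'direction' at a surface with unit 'normal' (pointing
    towards the incoming ray), passing from refractive index n1 into n2.
    Returns the refracted direction, or None on total internal reflection."""
    eta = n1 / n2
    cos_i = -sum(d * n for d, n in zip(direction, normal))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * d + (eta * cos_i - cos_t) * n
                 for d, n in zip(direction, normal))

if __name__ == "__main__":
    ray = normalize((1.0, -1.0, 0.0))  # hits a horizontal glass surface at 45 degrees
    print(refract(ray, (0.0, 1.0, 0.0), 1.0, 1.5))

Tracing a ray through a lens assembly then amounts to intersecting it with each surface in turn and applying this refraction step with the indices of the adjacent media.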

BibTeX:

@INPROCEEDINGS{Pape2021,
author = {Pape, Sebastian and Bellgardt, Martin and Gilbert, David and König, Georg and Kuhlen, Torsten W.},
booktitle = {2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
title = {Virtual Optical Bench: A VR learning tool for optical design},
year = {2021},
volume ={},
number = {},
pages = {635-636},
doi = {10.1109/VRW52623.2021.00200}
}





An Immersive Node-Link Visualization of Artificial Neural Networks for Machine Learning Experts


Martin Bellgardt, Christian Scheiderer, Torsten Wolfgang Kuhlen
3rd International Conference on Artificial Intelligence & Virtual Reality (IEEE AIVR) 2020

The black box problem of artificial neural networks (ANNs) is still a very relevant issue. When communicating basic concepts of ANNs, they are often depicted as node-link diagrams. Despite this being a straightforward way to visualize them, it is rarely used outside an educational context. However, we hypothesize that large-scale node-link diagrams of full ANNs could be useful even to machine learning experts. Hence, we present a visualization tool that depicts convolutional ANNs as node-link diagrams using immersive virtual reality. We applied our tool to a use case in the field of machine learning research and adapted it to that use case's specific challenges. Finally, we performed an expert review to evaluate the usefulness of our visualization. We found that our node-link visualization of ANNs was perceived as helpful in this professional context.
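As an illustration of the underlying data structure (a sketch under assumptions, not the tool described above), a node-link view of a small convolutional network can be built by placing each layer's neurons on a grid, stacking the layers along one axis, and generating links between consecutive layers.

# Hypothetical layout sketch -- not the tool described in the paper.

def layer_nodes(width, height, z, spacing=0.1):
    """Place one 3D node per neuron of a layer on a regular grid at depth z."""
    return [((x - width / 2) * spacing, (y - height / 2) * spacing, z)
            for y in range(height) for x in range(width)]

def build_graph(layer_shapes, layer_gap=1.0):
    """Return per-layer node positions and links between consecutive layers."""
    layers = [layer_nodes(w, h, i * layer_gap)
              for i, (w, h) in enumerate(layer_shapes)]
    links = []
    for i in range(len(layers) - 1):
        # Naively connect every neuron to every neuron of the next layer.
        for a in range(len(layers[i])):
            for b in range(len(layers[i + 1])):
                links.append(((i, a), (i + 1, b)))
    return layers, links

if __name__ == "__main__":
    layers, links = build_graph([(4, 4), (3, 3), (2, 2)])
    print(len(layers[0]), "input nodes,", len(links), "links")

A real tool would prune or bundle the links, for example by weight magnitude, since fully connecting even moderately sized layers quickly becomes visually overwhelming.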

BibTeX:

@inproceedings{Bellgardt2020a,
author = {Bellgardt, Martin and Scheiderer, Christian and Kuhlen, Torsten W.},
booktitle = {Proc. of IEEE AIVR},
title = {{An Immersive Node-Link Visualization of Artificial Neural Networks for Machine Learning Experts}},
year = {2020}
}





When Spatial Devices are not an Option: Object Manipulation in Virtual Reality using 2D Input Devices


Martin Bellgardt, Niklas Krause, Torsten Wolfgang Kuhlen
Virtuelle und Erweiterte Realität, 17. Workshop der GI-Fachgruppe VR/AR

With the advent of low-cost virtual reality hardware, new applications arise in professional contexts. These applications have requirements that can differ from the usual premise when developing immersive systems. In this work, we explore the idea that spatial controllers might not be usable for practical reasons, even though they are the best interaction device for the task. One such reason is fatigue, as applications might be used over long periods of time. Additionally, some people might have even more difficulty lifting their hands due to a disability. Hence, we attempt to measure how much performance in a spatial interaction task decreases when classical 2D interaction devices are used instead of a spatial controller. To this end, we developed an interaction technique that uses 2D inputs and borrows principles from desktop interaction. We show that our interaction technique is slower to use than state-of-the-art spatial interaction but not much worse regarding precision and user preference.
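To give a rough idea of what borrowing principles from desktop interaction can look like (a hypothetical sketch, not the technique evaluated in the paper), the following maps 2D mouse deltas to a translation in a plane parallel to the image plane, scaled by the object's distance to the camera.

# Hypothetical 2D-input manipulation sketch -- not the technique from the paper.
import math

def drag_translate(object_pos, camera_pos, camera_right, camera_up,
                   dx_px, dy_px, pixels_per_unit=500.0):
    """Translate 'object_pos' along the camera's right/up axes. dx_px and dy_px
    are mouse deltas in pixels; the step is scaled by the object's depth so a
    given pixel delta moves distant objects further in world space."""
    depth = math.dist(object_pos, camera_pos)
    step = depth / pixels_per_unit
    return tuple(p + (dx_px * r - dy_px * u) * step  # screen y grows downwards
                 for p, r, u in zip(object_pos, camera_right, camera_up))

if __name__ == "__main__":
    new_pos = drag_translate((0.0, 0.0, -5.0), (0.0, 0.0, 0.0),
                             (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), dx_px=50, dy_px=0)
    print(new_pos)  # moved 0.5 units to the right at a depth of 5 units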

BibTeX:

@inproceedings{Bellgardt2020,
author = {Bellgardt, Martin and Krause, Niklas and Kuhlen, Torsten W.},
booktitle = {Proc. of GI VR / AR Workshop},
title = {{When Spatial Devices are not an Option: Object Manipulation in Virtual Reality using 2D Input Devices}},
DOI = {10.18420/vrar2020_9},
year = {2020}
}





buenoSDIAs: Supporting Desktop Immersive Analytics While Actively Preventing Cybersickness


Daniel Zielasko, Martin Bellgardt, Alexander Meißner, Maliheh Haghgoo, Bernd Hentschel, Benjamin Weyers, Torsten Wolfgang Kuhlen
Proceedings of IEEE VIS Workshop on Immersive Analytics (2017)

Immersive data analytics, an emerging research topic in scientific and information visualization, has recently been brought back into focus by the emergence of low-cost consumer virtual reality hardware. Previous research has shown the positive impact of immersive visualization on data analytics workflows, but in most cases, insights were based on large-screen setups. In contrast, less research focuses on a close integration of immersive technology into existing, i.e., desktop-based, data analytics workflows. This implies specific requirements regarding the usability of such systems, which include, for instance, the prevention of cybersickness. In this work, we present a prototypical application that offers a first set of tools and addresses major challenges for a fully immersive data analytics setting in which the user is sitting at a desktop. In particular, we address the problem of cybersickness by integrating prevention strategies combined with individualized user profiles to maximize time of use.
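The individualized user profiles could, for example, be as simple as a small per-user record that the application consults when selecting prevention strategies; the following is a hypothetical sketch, not the buenoSDIAs code base, and all field and strategy names are assumptions.

# Hypothetical comfort profile -- field and strategy names are assumptions.
from dataclasses import dataclass

@dataclass
class ComfortProfile:
    susceptibility: float = 0.5      # 0 = robust, 1 = highly susceptible
    max_session_minutes: int = 30
    prefers_teleport: bool = True

def select_strategies(profile: ComfortProfile):
    """Pick prevention strategies based on the stored per-user profile."""
    strategies = []
    if profile.susceptibility > 0.7:
        strategies += ["field-of-view restriction", "static rest frame"]
    strategies.append("teleportation" if profile.prefers_teleport
                      else "smooth locomotion with vignetting")
    return strategies

if __name__ == "__main__":
    print(select_strategies(ComfortProfile(susceptibility=0.8)))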



Utilizing Immersive Virtual Reality in Everyday Work


Martin Bellgardt, Sebastian Pick, Daniel Zielasko, Tom Vierjahn, Benjamin Weyers, Torsten Wolfgang Kuhlen
Workshop on Everyday Virtual Reality (WEVR)

Applications of Virtual Reality (VR) have been repeatedly explored with the goal of improving the data analysis process for users from different application domains, such as architecture and the simulation sciences. Unfortunately, making VR available in professional application scenarios or even using it on a regular basis has proven to be challenging. We argue that everyday usage environments, such as office spaces, introduce constraints that critically affect the design of interaction concepts, since well-established techniques might be difficult to use. In our opinion, it is crucial to understand the impact of usage scenarios on interaction design in order to successfully develop VR applications for everyday use. To substantiate our claim, we define three distinct usage scenarios in this work that primarily differ in the amount of mobility they allow for. We outline each scenario's inherent constraints but also point out opportunities that may be used to design novel, well-suited interaction techniques for different everyday usage environments. In addition, we link each scenario to a concrete application example to clarify its relevance and show how it affects interaction design.




Remain Seated: Towards Fully-Immersive Desktop VR


Daniel Zielasko, Benjamin Weyers, Martin Bellgardt, Sebastian Pick, Alexander Meißner, Tom Vierjahn, Torsten Wolfgang Kuhlen
IEEE Virtual Reality Workshop on Everyday Virtual Reality 2017

In this work, we describe the scenario of fully-immersive desktop VR, which serves the overall goal of integrating seamlessly with the existing workflows and workplaces of data analysts and researchers, so that they can benefit from the gain in productivity when immersed in their data spaces. Furthermore, we provide a literature review showing the status quo of techniques and methods available for realizing this scenario under the restrictions it raises. Finally, we propose a concept for an analysis framework, outlining the decisions already made and those still to be taken, to show how the described scenario and the collected methods are feasible in a real use case.




Gistualizer: An Immersive Glyph for Multidimensional Datapoints


Martin Bellgardt, Sascha Gebhardt, Bernd Hentschel, Torsten Wolfgang Kuhlen
Workshop on Immersive Analytics 2017

Data from diverse workflows is often too complex for an adequate analysis without visualization. One kind of data is the multi-dimensional dataset, which can be visualized via a wide array of techniques. For instance, glyphs can be used to visualize individual datapoints. However, glyphs need to be actively looked at to be comprehended. This work explores a novel approach towards visualizing a single datapoint, with the intention of increasing the user’s awareness of it while they are looking at something else. The basic concept is to represent this point by a scene that surrounds the user in an immersive virtual environment. This idea is based on the observation that humans can extract low-detail information, the so-called gist, from a scene nearly instantly (in 100 ms or less). We aim at providing a first step towards answering the question of whether enough information can be encoded in the gist of a scene to represent a point in multi-dimensional space, and whether this information is helpful to the user’s understanding of this space.
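The following sketch is purely illustrative (the paper's actual encoding may differ, and all parameter names are assumptions): it shows how the first few normalized dimensions of a datapoint could be mapped to coarse, globally visible scene parameters that shape the gist of the surrounding environment.

# Hypothetical gist encoding -- the paper's actual mapping may differ.

def encode_gist(datapoint):
    """Map the first few normalized dimensions of a datapoint (values in [0, 1])
    to global scene parameters that shape the gist of the environment."""
    d = list(datapoint) + [0.0] * 4      # pad so shorter points still work
    return {
        "sky_brightness": d[0],          # dimension 0 -> overall lighting
        "fog_density": d[1],             # dimension 1 -> visibility
        "object_count": int(d[2] * 50),  # dimension 2 -> scene clutter
        "dominant_hue": d[3] * 360.0,    # dimension 3 -> colour tone in degrees
    }

if __name__ == "__main__":
    print(encode_gist((0.2, 0.8, 0.5, 0.1)))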

BibTeX:

@inproceedings{Bellgardt2017,
author = {Bellgardt, Martin and Gebhardt, Sascha and Hentschel, Bernd and Kuhlen, Torsten W.},
booktitle = {Workshop on Immersive Analytics},
title = {{Gistualizer: An Immersive Glyph for Multidimensional Datapoints}},
year = {2017}
}




