Holoview: An Immersive AR System for Interactive Anatomy Visualization and Learning

Ojaswa Sharma and Anshul Goswami
  • Paper

  • Code

    (available soon!)

Abstract

We present Holoview, an augmented reality (AR) system designed to support immersive and interactive learning of human anatomy. Holoview enables users to dynamically explore volumetric anatomical data through intuitive hand gestures in a 3D AR environment, allowing inspection of individual organs and of cross-sectional views via clipping and bioscope features. The system adopts a lightweight client–server architecture optimized for real-time performance on the HoloLens through hybrid and foveated rendering. In a user study with participants possessing foundational anatomical knowledge, we observed improvements in task-specific understanding, and participants reported high engagement with the gesture-based interactions. The system was perceived as engaging and intuitive, particularly for organ selection and cross-sectional exploration, with low cognitive load and increasing ease of use over time. These findings highlight Holoview's potential as a formative step toward immersive, user-centered AR systems for anatomy education.
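The foveated rendering mentioned above trades peripheral detail for frame rate by rendering at full resolution only near the user's gaze. As a rough illustration (not Holoview's actual implementation), the sketch below picks a level of detail from the angular eccentricity of a pixel's view ray relative to the gaze ray; the degree thresholds are hypothetical.

```python
import math

def foveation_level(gaze_dir, pixel_dir, inner_deg=10.0, outer_deg=30.0):
    """Pick a rendering level of detail from angular distance to the gaze ray.

    Both directions are assumed to be unit vectors. The thresholds are
    illustrative: full resolution inside `inner_deg`, coarsest outside
    `outer_deg`, one intermediate ring in between.
    """
    dot = sum(g * p for g, p in zip(gaze_dir, pixel_dir))
    ecc = math.degrees(math.acos(max(-1.0, min(1.0, dot))))  # angular eccentricity
    if ecc <= inner_deg:
        return 0  # full-resolution fovea
    if ecc <= outer_deg:
        return 1  # half-resolution ring
    return 2  # quarter-resolution periphery
```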


Demo video

System architecture
Selective Rendering
Selective rendering in Holoview enables interactive selection and visualization of specific organs using hand gestures.

Navigation
Navigation enables users to explore the human body model in the AR environment through hand gestures.
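Gesture-based organ selection of this kind is commonly implemented by casting a ray from the hand pose and picking the nearest organ it hits. The sketch below is a minimal illustration (not Holoview's actual code), assuming each organ is approximated by an axis-aligned bounding box and using the standard slab method for the ray test.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple  # (x, y, z) min corner
    hi: tuple  # (x, y, z) max corner

def ray_aabb_t(origin, direction, box):
    """Return the ray's entry distance into the box, or None on a miss (slab method)."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box.lo, box.hi):
        if abs(d) < 1e-9:  # ray parallel to this slab
            if o < lo or o > hi:
                return None
            continue
        t0, t1 = (lo - o) / d, (hi - o) / d
        if t0 > t1:
            t0, t1 = t1, t0
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return None
    return t_near

def pick_organ(origin, direction, organs):
    """Select the nearest organ hit by the gesture ray, or None."""
    best, best_t = None, float("inf")
    for name, box in organs.items():
        t = ray_aabb_t(origin, direction, box)
        if t is not None and t < best_t:
            best, best_t = name, t
    return best
```

In a real system the boxes would come from the segmented anatomy, and the ray's origin and direction from the tracked hand or gaze pose.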
Bioscope
Bioscope mode is designed to examine a single organ in the body.

Clipping Plane
A clipped volume visualization is achieved using a hand-controlled clipping plane.
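Conceptually, a clipping plane hides every voxel on one side of a plane whose position and orientation follow the hand. A minimal sketch of the idea (not Holoview's GPU implementation, which would discard samples in the shader) using NumPy and voxel index space as a stand-in for world coordinates:

```python
import numpy as np

def clip_volume(volume, plane_point, plane_normal):
    """Zero out voxels on the positive side of a hand-controlled clipping plane.

    volume: 3D array of intensities; voxel (i, j, k) is treated as sitting at
    position (i, j, k). The plane is given by a point on it and its normal,
    which in the AR system would be derived from the tracked hand pose.
    """
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    # Per-voxel positions, shape (D, H, W, 3)
    idx = np.indices(volume.shape).transpose(1, 2, 3, 0).astype(float)
    signed = (idx - np.asarray(plane_point, dtype=float)) @ n
    clipped = volume.copy()
    clipped[signed > 0] = 0  # hide everything in front of the plane
    return clipped
```

Moving the hand updates `plane_point` and `plane_normal` each frame, so the cut sweeps interactively through the anatomy.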

Website content is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Website source code based on the Nerfies project page. If you want to reuse their source code, please credit them appropriately.