Anatomical Mirroring: Real-time User-specific Anatomy in Motion Using a Commodity Depth Camera
Armelle Bauer, Ali-Hamadi Dicko, François Faure, Olivier Palombi, Jocelyne Troccaz
This paper presents a mirror-like augmented reality (AR) system to display the internal anatomy of a user. Using a single Microsoft Kinect V2, we animate a user-specific internal anatomy in real time according to the user’s motion and superimpose it onto the user’s color map, as shown in Fig.1.e. Users can visualize their anatomy moving as if they were able to look inside their own body in real time.
A new calibration procedure to set up and attach a user-specific anatomy to the Kinect body tracking skeleton is introduced. At calibration time, the bone lengths are estimated using a set of poses. By using Kinect data as input, the practical limitation of skin correspondence in prior work is overcome. The generic 3D anatomical model is attached to the internal anatomy registration skeleton and warped onto the depth image using a novel elastic deformer, subject to a closest-point registration force and anatomical constraints. The noise in Kinect outputs precludes any realistic human display. Therefore, a novel filter to reconstruct plausible motions, based on fixed bone lengths as well as realistic angular degrees of freedom (DOFs) and limits, is introduced to enforce anatomical plausibility. Anatomical constraints applied to the Kinect body tracking skeleton joints maximize the physical plausibility of the anatomy motion while minimizing the distance to the raw data. At run time, a simulation loop attracts the bones towards the raw data, and skinning shaders efficiently drag the resulting anatomy to the user’s tracked motion.
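The fixed-bone-length part of such a filter can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function name, the array layout, and the parent-index convention are assumptions. The idea is to keep the bone direction suggested by the raw tracked joints while restoring the lengths estimated at calibration time.

```python
import numpy as np

def enforce_bone_lengths(joints, parents, lengths):
    """Project noisy tracked joint positions onto fixed calibrated bone lengths.

    joints:  (N, 3) array of raw joint positions (e.g. from a Kinect skeleton)
    parents: parent joint index for each joint (-1 for the root)
    lengths: calibrated length of the bone ending at each joint (root entry unused)

    Joints are assumed to be listed in topological order (parents before children),
    so each parent is already corrected when its child is processed.
    """
    out = joints.astype(float).copy()
    for j, p in enumerate(parents):
        if p < 0:
            continue  # the root keeps its tracked position
        bone = out[j] - out[p]
        norm = np.linalg.norm(bone)
        if norm > 1e-9:
            # keep the direction from the raw data, restore the calibrated length
            out[j] = out[p] + bone * (lengths[j] / norm)
    return out
```

Angular DOF limits would be enforced in a similar pass over joint rotations; clamping each joint's local rotation to its anatomical range before re-deriving positions.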
Our user-specific internal anatomy model is validated by comparing the skeleton with segmented MRI images. A user study is established to evaluate the believability of the animated anatomy.
Keywords: User-specific anatomy, Augmented Human, Real-Time, Motion Capture, Augmented Reality, Markerless Device. Motion In Games (MIG) 2016 - https://hal.inria.fr/hal-01366704
Living Book of Anatomy Project : See your Insides in Motion!
Armelle Bauer, Ali-Hamadi Dicko, Olivier Palombi, François Faure, Jocelyne Troccaz
The complexity of human anatomy makes learning and understanding it a difficult task. We present the Living Book of Anatomy (LBA) project, an augmented reality system for teaching anatomy.
Using a Kinect, we superimpose our highly detailed 3D anatomical model onto the user's color map and animate it. By showing our work, we hope to receive interesting feedback from Emerging Technologies attendees.
We propose a framework to investigate a new way to learn musculoskeletal anatomical kinetics using interactive motion capture and visualization. It can be used to facilitate the learning of anatomy by medicine and sports students, and for the general public to discover human anatomy in action. We illustrate our approach using the example of knee flexion and extension by visualizing the knee muscle activation prediction with agonist and antagonist co-contraction.
Activation data for specified movements is first measured during a preliminary phase. The user is then tracked in real time, and their motion is analyzed to recognize the movement being performed. This is used to efficiently evaluate muscle activation by interpolating the activation data stored in tables. The visual feedback consists of a user-specific 3D avatar created by deforming a reference model and animated using the tracking. Muscle activation is visualized using colored lines of action or 3D meshes. This work was made possible by the collaboration of three complementary labs specialized in computer-aided medical intervention, computer graphics and biomechanics.
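The table-interpolation step can be sketched as below. All of the numbers, muscle names, and the keying by knee flexion angle are hypothetical placeholders, not the project's measured data; the sketch only shows the lookup mechanism: activation measured offline at a few joint angles, interpolated at run time for the tracked angle.

```python
import numpy as np

# Hypothetical pre-measured table: knee flexion angle (degrees) mapped to
# normalized activation of an agonist/antagonist muscle pair.
ANGLES = np.array([0.0, 30.0, 60.0, 90.0, 120.0])
ACTIVATION = {
    "quadriceps": np.array([0.05, 0.20, 0.45, 0.70, 0.90]),  # agonist
    "hamstrings": np.array([0.10, 0.15, 0.20, 0.30, 0.40]),  # co-contraction
}

def activation_at(muscle, angle_deg):
    """Linearly interpolate the stored activation for a tracked joint angle.

    np.interp clamps angles outside the measured range to the end values,
    so the run-time loop never extrapolates beyond the measured data.
    """
    return float(np.interp(angle_deg, ANGLES, ACTIVATION[muscle]))
```

At run time, the recognized motion selects the table and the tracked joint angle drives the lookup; the resulting scalar can then color a line of action or a 3D muscle mesh.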
Keywords: Anatomy Learning, Biomechanical Simulation, Real-Time, Augmented Reality, Motion Capture and Reconstruction. Eurographics Workshop on Visual Computing for Biology and Medicine (VCBM) 2014 - https://hal.inria.fr/hal-01057027
My Corporis Fabrica : Making Anatomy Easy
Armelle Bauer, Federico Ulliana, Ali-Hamadi Dicko, Benjamin Gilles, Olivier Palombi, François Faure
We demonstrate MyCorporisFabrica, the first assistant tool for easily modeling and simulating anatomy. From a character skin provided by the user, a complete anatomical model is automatically created. The user can then make a selection graphically or through an interactive knowledge base, and export it for visualization or simulation.