In this paper, we study low-cost motion tracking systems for range of motion (RoM) measurements in the tele-rehabilitation context using Augmented Reality. We propose simple yet effective extensions of the Microsoft Kinect SDK 2.0 skeleton tracking algorithm. Our extensions consist of temporal smoothing of the joint estimates as well as an intuitive, patient-specific adjustment of the bone lengths that is implemented as a quick, one-time calibration performed by the therapist. We compare our system to the Kinect v1, the unmodified Kinect v2, a marker-based optical tracking system, and the clinical gold standard set by two subject-matter experts using a goniometer. We study the accuracy of all systems in RoM measurement on the elbow joints. We quantitatively compare angular deviation from the expert measurements and perform a statistical confidence analysis. The results indicate that the proposed personalized setup substantially outperforms all competing systems and effectively corrects for the systematic error of the skeleton tracking, particularly at full flexion. The improved system matched the observations of both experts with a mean error of 3.78°. We conclude that the proposed, personalized method for RoM measurement with Augmented Reality feedback is promising for tele-rehabilitation scenarios. Future work will investigate whether similar strategies can be applied to more complex joints, such as the shoulder.
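The two extensions described above can be illustrated with a minimal sketch. Both functions below are illustrative assumptions, not the paper's implementation: the smoothing factor `alpha` and the specific exponential-smoothing scheme are hypothetical, as is correcting a bone length by moving the child joint along the parent-to-child direction.

```python
import numpy as np

def smooth_joints(prev, curr, alpha=0.5):
    """Temporal smoothing of 3-D joint estimates via an exponential
    moving average. `alpha` is a hypothetical smoothing factor; the
    paper does not specify the exact filter."""
    prev, curr = np.asarray(prev, float), np.asarray(curr, float)
    return alpha * curr + (1.0 - alpha) * prev

def adjust_bone_length(parent, child, calibrated_length):
    """Patient-specific bone-length adjustment (illustrative): move the
    child joint along the parent->child direction so the bone matches
    the length measured in the one-time therapist calibration."""
    parent, child = np.asarray(parent, float), np.asarray(child, float)
    direction = child - parent
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        return child  # degenerate frame; leave the joint untouched
    return parent + direction / norm * calibrated_length
```

With joints corrected this way, the elbow angle can then be read off from the vectors shoulder→elbow and elbow→wrist.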
C-Arm X-Ray systems are the workhorse modality for guidance of percutaneous orthopaedic surgical procedures. However, two-dimensional observations of the three-dimensional anatomy suffer from the effects of projective simplification. Consequently, many X-Ray images from various orientations need to be acquired for the surgeon to accurately assess the spatial relations between the patient’s anatomy and the surgical tools.
In this paper, we present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multi-modality marker and a simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-Arm fluoroscopy system. Then, annotations on the 2-D X-Ray images can be rendered as virtual objects in 3-D, providing surgical guidance. In a feasibility study on a semi-anthropomorphic phantom, we found the accuracy of our system to be comparable to the traditional image-guided technique while substantially reducing the number of acquired X-Ray images as well as procedure time. Our promising results encourage further research on the interaction between virtual and real objects, which we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed towards common orthopaedic interventions.
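The co-calibration step can be sketched as a transform chain through the shared multi-modality marker: if both the HMD and the C-Arm can localize the same marker, the rigid transform between their coordinate frames follows by composition. The sketch below uses hypothetical 4×4 homogeneous poses; the numbers are placeholders, not measured values from the study.

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical observations of the shared marker by each device.
T_marker_in_hmd = hom(np.eye(3), [0.1, 0.0, 0.5])    # marker pose in the HMD frame
T_marker_in_carm = hom(np.eye(3), [0.0, 0.2, 1.0])   # marker pose in the C-Arm frame

# Chain through the common marker:
# C-Arm frame -> marker frame -> HMD frame.
T_carm_to_hmd = T_marker_in_hmd @ np.linalg.inv(T_marker_in_carm)

# An annotation defined in the C-Arm frame can now be expressed in the
# HMD frame and rendered as a virtual object.
p_carm = np.array([0.0, 0.2, 1.0, 1.0])  # homogeneous point in C-Arm frame
p_hmd = T_carm_to_hmd @ p_carm
```

Here the marker's own position maps consistently between frames, which is the invariant the SLAM-based tracking must maintain as the headset moves.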