A mixed reality system combining augmented reality, 3D bio‑printed physical environments and inertial measurement unit sensors for task planning

dc.contributor.author: Kabuye, Ernest
dc.contributor.author: LeDuc, Philip
dc.contributor.author: Cagan, Jonathan
dc.date.accessioned: 2023-03-30T11:21:24Z
dc.date.available: 2023-03-30T11:21:24Z
dc.date.issued: 2023
dc.description.abstract [en_US]: Successful surgical operations are characterized by preplanning routines to be executed during actual surgical operations. To achieve this, surgeons rely on experience acquired from the use of cadavers, enabling technologies like virtual reality (VR), and clinical years of practice. However, cadavers lack dynamism and realism, as they have no blood and can exhibit tissue degradation and shrinkage, while current VR systems do not provide amplified haptic feedback. This can impact surgical training, increasing the likelihood of medical errors. This work proposes a novel Mixed Reality Combination System (MRCS) that pairs Augmented Reality (AR) technology and an inertial measurement unit (IMU) sensor with 3D printed, collagen-based specimens to enhance task performance in planning and execution. To achieve this, the MRCS charts out a path prior to user task execution, based on a visual, physical, and dynamic environment reflecting the state of a target object, by utilizing surgeon-created virtual imagery that, when projected onto a 3D printed biospecimen as AR, reacts visually to user input on its actual physical state. This enables real-time user interaction with the MRCS, which displays new multi-sensory virtual states of an object prior to the user performing on the actual physical state of that same object, enabling effective task planning. User actions tracked with an integrated 9-degree-of-freedom IMU demonstrate task execution, showing that a user with limited knowledge of the specific anatomy can, under guidance, execute a preplanned task. In addition to surgical planning, this system can be generally applied in areas such as construction, maintenance, and education.
dc.identifier.citation [en_US]: Kabuye, E., LeDuc, P., & Cagan, J. (2023). A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning. Virtual Reality, 1-14. https://doi.org/10.1007/s10055-023-00777-0
dc.identifier.uri: https://nru.uncst.go.ug/handle/123456789/8337
dc.language.iso [en_US]: en
dc.publisher [en_US]: Virtual Reality
dc.subject [en_US]: Task planning
dc.subject [en_US]: Inertial measurement unit
dc.subject [en_US]: Three-dimensional biological printing
dc.subject [en_US]: Augmented reality
dc.title [en_US]: A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning
dc.type [en_US]: Article
Files

Original bundle
Name: A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning.pdf
Size: 2.16 MB
Format: Adobe Portable Document Format
Description: A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning

License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission