Browsing by Author "Kabuye, Ernest"
Now showing 1 - 2 of 2
Item
A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning (Virtual Reality, 2023)
Kabuye, Ernest; LeDuc, Philip; Cagan, Jonathan

Successful surgical operations are characterized by preplanning routines to be executed during the actual operation. To achieve this, surgeons rely on experience acquired from cadavers, on enabling technologies like virtual reality (VR), and on years of clinical practice. However, cadavers lack dynamism and realism because they have no blood, can exhibit limited tissue degradation and shrinkage, and current VR systems do not provide amplified haptic feedback. This can impact surgical training, increasing the likelihood of medical errors. This work proposes a novel Mixed Reality Combination System (MRCS) that pairs augmented reality (AR) technology and an inertial measurement unit (IMU) sensor with 3D printed, collagen-based specimens to enhance task performance, such as planning and execution. The MRCS charts out a path prior to task execution based on a visual, physical, and dynamic environment reflecting the state of a target object: surgeon-created virtual imagery, projected onto a 3D printed biospecimen as AR, reacts visually to user input on the specimen's actual physical state. This allows the MRCS to respond to the user in real time, displaying new multi-sensory virtual states of an object before the user acts on the actual physical state of that same object and thereby enabling effective task planning. Tracked user actions using an integrated 9-degree-of-freedom IMU demonstrate task execution, showing that a user with limited knowledge of specific anatomy can, under guidance, execute a preplanned task. In addition to surgical planning, this system can be generally applied in areas such as construction, maintenance, and education.

Item
Tracking of Scalpel Motions With an Inertial Measurement Unit System (IEEE Sensors Journal, 2022)
Kabuye, Ernest; Hellebrekers, Tess; Majidi, Carmel; Cagan, Jonathan; Leduc, Philip

Surgical planning to visualize a complete procedure before surgical intervention, paired with the advanced surgical techniques of a surgeon, has been shown to improve surgical outcomes. Efforts to improve surgical planning have included tracking real-time surgeon movements via surgical instruments within the confined body cavity space of the human body to enhance specific techniques when performing minimally invasive surgery. In this work, a surgical tool tracking approach is presented that leverages small-scale electronics to enable real-time position capture for use in iterative surgical planning. By integrating a lightweight 9-degree-of-freedom inertial measurement unit (IMU), our system captures both spatial and temporal information of the surgical tool without requiring a direct line of sight. The IMU is printed on a flexible film and attached to and integrated with a surgical tool to demonstrate its tracking capabilities. Data from the IMU are analyzed to determine the full range of motion during angular displacement for measurement and tracking. The results show an accuracy of 2.2°, 2.9°, and 3.1° over the full range of motion for the X (yaw), Y (roll), and Z (pitch) Euler angles, respectively, demonstrating the potential for surgical tool tracking without a direct line of sight, with future impact in areas including flexible electronics and motion tracking.
This work will be helpful in a diversity of fields including surgery, surgical training, biomaterials, and motion tracking.
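As an illustration of the kind of processing the scalpel-tracking abstract describes, the sketch below converts quaternion orientation samples from a 9-degree-of-freedom IMU into yaw, roll, and pitch Euler angles and reports their peak-to-peak range of motion. This is a minimal sketch under stated assumptions, not the authors' published pipeline: the function names, the axis-to-angle mapping, and the synthetic samples are illustrative choices made for this example.

```python
import numpy as np


def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (as reported by a 9-DOF IMU's onboard
    fusion) to (yaw, roll, pitch) in degrees.

    Note: the mapping of sensor axes to the paper's X (yaw), Y (roll),
    Z (pitch) convention depends on how the IMU is mounted on the tool;
    the mapping used here is an assumption for illustration.
    """
    # Roll: rotation about the sensor x-axis.
    roll = np.degrees(np.arctan2(2.0 * (w * x + y * z),
                                 1.0 - 2.0 * (x * x + y * y)))
    # Pitch: rotation about the y-axis, clamped to avoid NaN at the poles.
    sinp = np.clip(2.0 * (w * y - z * x), -1.0, 1.0)
    pitch = np.degrees(np.arcsin(sinp))
    # Yaw: rotation about the z-axis.
    yaw = np.degrees(np.arctan2(2.0 * (w * z + x * y),
                                1.0 - 2.0 * (y * y + z * z)))
    return yaw, roll, pitch


def angular_range_of_motion(quaternions):
    """Given a sequence of quaternion samples recorded during a gesture,
    return the peak-to-peak range of each Euler angle in degrees --
    a simple proxy for the 'full range of motion' discussed above."""
    angles = np.array([quaternion_to_euler(*q) for q in quaternions])
    return angles.max(axis=0) - angles.min(axis=0)


# Example with a short synthetic recording: identity orientation,
# then small rotations of roughly 10 degrees about x and about y.
samples = [
    (1.0, 0.0, 0.0, 0.0),
    (0.996, 0.087, 0.0, 0.0),
    (0.996, 0.0, 0.087, 0.0),
]
print(angular_range_of_motion(samples))  # approx. [yaw, roll, pitch] ranges
```

In practice the quaternion stream would come from the IMU's fusion output sampled over time rather than from hand-written values, and any per-axis accuracy figures would be obtained by comparing the computed angles against a reference measurement system.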