3,066 results for "Augmented reality"
Search Results
202. Systematic Review on the Impact of Mobile Applications with Augmented Reality to Improve Health.
- Author
-
Piqueras-Sola B, Cortés-Martín J, Rodríguez-Blanque R, Menor-Rodríguez MJ, Mellado-García E, Merino Lobato C, and Sánchez-García JC
- Abstract
Physical inactivity represents a significant public health challenge globally. Mobile applications, particularly those utilizing augmented reality (AR), have emerged as innovative tools for promoting physical activity. However, a systematic evaluation of their efficacy is essential. This systematic review aims to evaluate and synthesize the evidence regarding the effectiveness and benefits of mobile applications with augmented reality in enhancing physical activity and improving health outcomes. A comprehensive search was conducted in Scopus, PubMed, WOS, and the Cochrane Library databases following PRISMA guidelines. Observational and interventional studies evaluating AR mobile applications for physical exercise were included, without restrictions on publication date or language. The search terms included "Mobile Applications", "Augmented Reality", "Physical Fitness", "Exercise Therapy", and "Health Behavior". The methodological quality was assessed using the ROBINS tool. The review identified twelve eligible studies encompassing 5,534,661 participants. The findings indicated significant increases in physical activity and improvements in mental health associated with the use of AR applications, such as Pokémon GO. However, potential risk behaviors were also noted. The evidence suggests that AR interventions can effectively promote physical activity and enhance health. Nonetheless, further research is needed to address limitations and optimize their efficacy. Future interventions should be tailored to diverse cultural contexts to maximize benefits and mitigate risks. AR mobile applications hold promise for promoting physical activity and improving health outcomes. Strategies to optimize their effectiveness and address identified risks should be explored to fully realize their potential.
- Published
- 2024
- Full Text
- View/download PDF
203. How people with brain injury run and evaluate a SLAM-based smartphone augmented reality application to assess object-location memory.
- Author
-
Mendez-Lopez M, Juan MC, Burgos T, Mendez M, and Fidalgo C
- Abstract
Augmented reality (AR) technology allows virtual objects to be superimposed on the real-world environment, offering significant potential for improving cognitive assessments and rehabilitation processes in the field of visuospatial learning. This study examines how patients with acquired brain injury (ABI) evaluate the functions and usability of a SLAM-based smartphone AR app to assess object-location skills. Ten ABI patients performed a task for the spatial recall of four objects using an AR app. The data collected from 10 healthy participants provided reference values for the best performance. Their perceptions of the AR app/technology and its usability were investigated. The results indicate lower effectiveness in solving the task in the patient group, as the time they needed to complete it was related to their level of impairment. The patients showed lower, yet positive, scores in factors related to app usability and acceptance (e.g., mental effort and satisfaction, respectively). More patients reported entertainment as a positive aspect of the app. Patients' perceived enjoyment was related to concentration and calm, whereas usability was associated with perceived competence, expertise, and a lower level of physical effort. For patients, the sensory aspects of the objects were related to their presence, while for healthy participants, they were related to enjoyment and required effort. The results show that AR seems to be a promising tool to assess spatial orientation in the target patient population., (© 2024 The Author(s). PsyCh Journal published by Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.)
- Published
- 2024
- Full Text
- View/download PDF
204. Augmented Reality-Guided Extraction of Fully Impacted Lower Third Molars Based on Maxillofacial CBCT Scans.
- Author
-
Rieder M, Remschmidt B, Gsaxner C, Gaessler J, Payer M, Zemann W, and Wallner J
- Abstract
(1) Background: This study aimed to integrate an augmented reality (AR) image-guided surgery (IGS) system, based on preoperative cone beam computed tomography (CBCT) scans, into clinical practice. (2) Methods: In preclinical and clinical surgical setups, an AR-guided visualization system based on Microsoft's HoloLens 2 was assessed for complex lower third molar (LTM) extractions. In this study, the system's potential intraoperative feasibility and usability are described first. Preparation and operating times for each procedure were measured, as well as the system's usability, using the System Usability Scale (SUS). (3) Results: A total of six LTMs (n = 6) were analyzed, two extracted from human cadaver head specimens (n = 2) and four from clinical patients (n = 4). The average preparation time was 166 ± 44 s, while the operation time averaged 21 ± 5.9 min. The overall mean SUS score was 79.1 ± 9.3. When analyzed separately, the usability score categorized the AR-guidance system as "good" in clinical patients and "best imaginable" in human cadaver head procedures. (4) Conclusions: This translational study analyzed the first successful and functionally stable application of the HoloLens technology for complex LTM extraction in clinical patients. Further research is needed to refine the technology's integration into clinical practice to improve patient outcomes., Competing Interests: The authors declare no conflicts of interest. The funders had no role in the design of this study; in the collection, analyses, or interpretation of data; in the writing of this manuscript; or in the decision to publish the results.
- Published
- 2024
- Full Text
- View/download PDF
205. [Evaluation of augmented reality technology in the recognition of oral and maxillofacial anatomy].
- Author
-
Tang Z, Hu L, Chen Z, Yu Y, Zhang W, and Peng X
- Subjects
- Humans, Anatomy education, Mouth anatomy & histology, Software, Augmented Reality, Imaging, Three-Dimensional methods
- Abstract
Objective: To evaluate the outcome of augmented reality technology in the recognition of oral and maxillofacial anatomy., Methods: This study was conducted on undergraduate students at Peking University School of Stomatology who were learning oral and maxillofacial anatomy. The image data were selected according to the experiment content, and the important blood vessels and bone tissue structures, such as the upper and lower jaws and the neck arteries and veins, were reconstructed in 3D (three-dimensional) by digital software to generate experiment models, and the reconstructed models were encrypted and stored in the cloud. The QR (quick response) code corresponding to each 3D model was scanned with a networked mobile device to obtain augmented reality images to assist the experimenters in teaching and the subjects in recognition. Augmented reality technology was applied in both the theoretical explanation and the cadaveric dissection. Subjects' feedback was collected in the form of a post-class questionnaire to evaluate the effectiveness of augmented reality technology-assisted recognition., Results: In total, 83 undergraduate students were included as subjects. Augmented reality technology could be successfully applied in the recognition of oral and maxillofacial anatomy. All the subjects could scan the QR code with a networked mobile device to retrieve the 3D anatomy model from the cloud and zoom in/out and rotate the model on the mobile device. Augmented reality technology could provide personalized 3D models based on learners' needs and abilities. The results of the Likert scale showed that augmented reality technology was highly recognized by the students (9.19 points) and received high scores for forming a three-dimensional sense and stimulating enthusiasm for learning (9.01 and 8.85 points, respectively)., Conclusion: Augmented reality technology can realize the three-dimensional visualization of important structures of oral and maxillofacial anatomy and stimulate students' enthusiasm for learning. It can also help students build a three-dimensional spatial understanding of the anatomy of the oral and maxillofacial area. The application of augmented reality technology achieves a favorable effect in the recognition of oral and maxillofacial anatomy.
- Published
- 2024
206. Experimental Setup for Evaluating Depth Sensors in Augmented Reality Technologies Used in Medical Devices.
- Author
-
Stadnytskyi V and Ghammraoui B
- Subjects
- Humans, Virtual Reality, Augmented Reality
- Abstract
This paper presents a fully automated experimental setup tailored for evaluating the effectiveness of augmented and virtual reality technologies in healthcare settings for regulatory purposes, with a focus on the characterization of depth sensors. The setup is constructed as a modular benchtop platform that enables quantitative analysis of depth cameras essential for extended reality technologies in a controlled environment. We detail a design concept and considerations for an experimental configuration aimed at simulating realistic scenarios for head-mounted displays. The system includes an observation platform equipped with a three-degree-of-freedom motorized system and a test object stage. To accurately replicate real-world scenarios, we utilized an array of sensors, including commonly available range-sensing cameras and commercial augmented reality headsets, notably the Intel RealSense L515 LiDAR camera, integrated into the motion control system. The paper elaborates on the system architecture and the automated data collection process. We discuss several evaluation studies performed with this setup, examining factors such as spatial resolution, Z-accuracy, and pixel-to-pixel correlation. These studies provide valuable insights into the precision and reliability of these technologies in simulated healthcare environments.
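The Z-accuracy and pixel-to-pixel correlation metrics named in the abstract can be sketched in a few lines, assuming the definitions commonly used in depth-camera evaluation (signed bias against a known flat target, and Pearson correlation between depth frames of the same scene). The function names and synthetic data below are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def z_accuracy(depth_map, ground_truth_mm):
    """Signed depth bias (mm) of a depth map against a known flat-target distance."""
    valid = depth_map > 0  # zero-valued pixels are dropouts in most range cameras
    return float(np.mean(depth_map[valid]) - ground_truth_mm)

def pixel_correlation(frame_a, frame_b):
    """Pixel-to-pixel Pearson correlation between two depth frames of one scene."""
    a, b = frame_a.ravel(), frame_b.ravel()
    mask = (a > 0) & (b > 0)  # compare only pixels valid in both frames
    return float(np.corrcoef(a[mask], b[mask])[0, 1])

# Synthetic example: a flat target at 1000 mm, measured with noise and a 3 mm bias.
rng = np.random.default_rng(0)
frame = 1003.0 + rng.normal(0.0, 2.0, size=(120, 160))
print(round(z_accuracy(frame, 1000.0), 1))  # ≈ 3.0 mm bias
```

In a real benchtop setup like the one described, the ground-truth distance would come from the motorized stage position rather than being assumed.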
- Published
- 2024
- Full Text
- View/download PDF
207. Mental health in the virtual world: Challenges and opportunities in the metaverse era.
- Author
-
López Del Hoyo Y, Elices M, and Garcia-Campayo J
- Abstract
Current rates of mental illness are worrisome. Mental illness mainly affects females and younger age groups. The use of the internet to deliver mental health care has been growing since 2020 and includes the implementation of novel mental health treatments using virtual reality, augmented reality, and artificial intelligence. A new three dimensional digital environment, known as the metaverse, has emerged as the next version of the Internet. Artificial intelligence, augmented reality, and virtual reality will create fully immersive, experiential, and interactive online environments in the metaverse. People will use a unique avatar to do anything they do in their "real" lives, including seeking and receiving mental health care. In this opinion review, we reflect on how the metaverse could reshape how we deliver mental health treatment, its opportunities, and its challenges., Competing Interests: Conflict-of-interest statement: All authors declare having no conflicts of interest., (©The Author(s) 2024. Published by Baishideng Publishing Group Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
208. Microscope-Based Augmented Reality: A New Approach in Intraoperative 3D Visualization in Microvascular Decompression?
- Author
-
Tanrikulu L
- Abstract
Neurovascular compression (NVC) syndromes such as trigeminal neuralgia (TN) are causally treated with microvascular decompression (MVD). Semiautomatic segmentation of high-resolution magnetic resonance imaging (MRI) data and constructive interference in steady state (CISS)/time-of-flight (TOF) sequences are utilized for the three-dimensional (3D) visualization of underlying causative vessels at the root entry zones of the relevant cranial nerves. Augmented reality (AR) of neurovascular structures was introduced especially in the resection of brain tumors and in aneurysm surgery. In this report, the potential feasibility of implementing microscope-based AR into the intraoperative microsurgical set-up of MVD was investigated. This article recommends the preoperative evaluation of 3D visualization in addition to the surgeon's microscopic view. The implementation of multiple imaging data by AR into the operating microscope may affect the experienced surgeon's view, which should be examined prospectively., Competing Interests: Human subjects: Consent was obtained or waived by all participants in this study. Ethics Committee of the University of Marburg issued approval Az.: 23-32-RS. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work., (Copyright © 2024, Tanrikulu et al.)
- Published
- 2024
- Full Text
- View/download PDF
209. Virtual/augmented reality-based human-machine interface and interaction modes in airport control towers.
- Author
-
Bagassi S, Corsi M, De Crescenzio F, Santarelli R, Simonetti A, Moens L, and Terenzi M
- Subjects
- Humans, Virtual Reality, Aviation, Airports, Man-Machine Systems, Augmented Reality, User-Computer Interface
- Abstract
The concept of an innovative human-machine interface and interaction modes based on virtual and augmented reality technologies for airport control towers has been developed with the aim of increasing the performance and situational awareness of air traffic control operators. By presenting digital information through see-through head-mounted displays superimposed over the out-of-the-tower view, the proposed interface should stimulate controllers to operate in a head-up position and, therefore, reduce the number of switches between a head-up and a head-down position even in low visibility conditions. This paper introduces the developed interface and describes the exercises conducted to validate the technical solutions developed, focusing on the simulation platform and exploited technologies, to demonstrate how virtual and augmented reality, along with additional features such as an adaptive human-machine interface, multimodal interaction, and attention guidance, enable a more natural and effective interaction in the control tower. The results of the human-in-the-loop real-time validation exercises show that the prototype concept is feasible from both an operational and technical perspective: the solution supports air traffic controllers in working head-up rather than head-down even in low-visibility operational scenarios, and lowers the time to react in critical or alerting situations, with a positive impact on the user's performance. While showcasing promising results, this study also identifies certain limitations and opportunities for refinement, aimed at further optimising the efficacy and usability of the proposed interface., (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
210. A prospective, single-centre study of the feasibility of the use of augmented reality for improving the safety and traceability of injectable investigational cancer drug compounding.
- Author
-
Lecoutre A, Vasseur M, Courtin J, Hammadi S, Decaudin B, and Pascal O
- Abstract
The compounding of injectable cancer drugs for clinical trials often requires specific procedures, with limited access to the starting materials and especially the active compound. These characteristics prevent the application of qualitative or quantitative analyses and quality control techniques. Hence, for some very complex compounding operations, double visual inspection is considered to be less reliable, more time-consuming and more human-resource-intensive than other methods. The compounding team at Lille University Hospital (Lille, France) has equipped one of its preparation areas with a new device: augmented reality (AR) eyewear connected to an oncology drug management system, as a support tool for compounding and quality control. The tool has been tested, adapted and improved within the unit and is now used for investigational drug compounding on a routine basis. In a prospective, single-centre study, we evaluated the feasibility of the implementation of this novel AR approach for the compounding of injectable investigational cancer drugs. During the 6-month study period, 564 clinical trial compounding operations were performed with the AR eyewear. The proportion of poor-quality photos taken with the AR eyewear fell over time, as users became more familiar with the tool. A user satisfaction survey highlighted a very high level of uptake and a wish to broaden the scope of the compounding performed with AR support. The AR eyewear constitutes an innovative, cost-effective tool that increased the level of safety without disrupting the unit's operating procedures. The tool's flexibility enabled its integration into a variety of working environments. 
The various improvements now being developed should help to further boost the added value of this novel device., Competing Interests: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. Michele Vasseur reports financial support was provided by Lille University Hospital Center. Arthur Lecoutre reports a relationship with Lille University Hospital Center that includes: employment. There is no conflict of interest to declare in this article., (© 2024 The Authors.)
- Published
- 2024
- Full Text
- View/download PDF
211. Medical Extended Reality for Radiology Education and Training.
- Author
-
Lang M, Ghandour S, Rikard B, Balasalle EK, Rouhezamin MR, Zhang H, and Uppot RN
- Abstract
Medical extended reality (MXR), encompassing augmented reality, virtual reality, and mixed reality (MR), presents a novel paradigm in radiology training by offering immersive, interactive, and realistic learning experiences in health care. Although traditional educational tools in the field of radiology are essential, it is necessary to capitalize on the innovative and emerging educational applications of extended reality (XR) technologies. At the most basic level of learning anatomy, XR has been extensively used with an emphasis on its superiority over conventional learning methods, especially in spatial understanding and recall. For imaging interpretation, XR has fostered the concept of virtual reading rooms by enabling collaborative learning environments and enhancing image analysis and understanding. Moreover, image-guided interventions in interventional radiology have witnessed an uptick in XR utilization, illustrating its effectiveness in procedural training and skill acquisition for medical students and residents in a safe and risk-free environment. However, there remain several challenges and limitations for XR in radiology education, including technological, economic, and ergonomic challenges, as well as integration into existing curricula. This review explores the transformative potential of MXR in radiology education and training along with insights on the future of XR in radiology education, forecasting advancements in immersive simulations, artificial intelligence integration for personalized learning, and the potential of cloud-based XR platforms for remote and collaborative training. In summation, MXR's burgeoning role in reshaping radiology education offers a safer, scalable, and more efficient training model that aligns with the dynamic healthcare landscape., (Copyright © 2024 American College of Radiology. Published by Elsevier Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
212. Beyond the visible: preliminary evaluation of the first wearable augmented reality assistance system for pancreatic surgery.
- Author
-
Javaheri H, Ghamarnejad O, Bade R, Lukowicz P, Karolus J, and Stavrou GA
- Abstract
Purpose: The retroperitoneal nature of the pancreas, marked by minimal intraoperative organ shifts and deformations, makes augmented reality (AR)-based systems highly promising for pancreatic surgery. This study presents preliminary data from a prospective study aiming to develop the first wearable AR assistance system, ARAS, for pancreatic surgery and to evaluate its usability, accuracy, and effectiveness in enhancing the perioperative outcomes of patients., Methods: We developed ARAS as a two-phase system for a wearable AR device to aid surgeons in planning and operation. This system was used to visualize and register patient-specific 3D anatomical models during the surgery. The location and precision of the registered 3D anatomy were evaluated by assessing the arterial pulse and employing Doppler and duplex ultrasonography. The usability, accuracy, and effectiveness of ARAS were assessed using a five-point Likert scale questionnaire., Results: Perioperative outcomes of five patients who underwent various pancreatic resections with ARAS are presented. Surgeons rated ARAS as excellent for preoperative planning. All structures were accurately identified without any noteworthy errors. Only tumor identification decreased after the preparation phase, especially in patients who underwent pancreaticoduodenectomy, because of the extensive mobilization of peripancreatic structures. No perioperative complications related to ARAS were observed., Conclusions: ARAS shows promise in enhancing surgical precision during pancreatic procedures. Its efficacy in preoperative planning and intraoperative vascular identification positions it as a valuable tool for pancreatic surgery and a potential educational resource for future surgical residents., (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
213. Requirement analysis for an AI-based AR assistance system for surgical tools in the operating room: stakeholder requirements and technical perspectives.
- Author
-
Cramer E, Kucharski AB, Kreimeier J, Andreß S, Li S, Walk C, Merkl F, Högl J, Wucherer P, Stefan P, von Eisenhart-Rothe R, Enste P, and Roth D
- Abstract
Purpose: We aim to investigate the integration of augmented reality (AR) within the context of increasingly complex surgical procedures and instrument handling toward the transition to smart operating rooms (OR). In contrast to the cumbersome paper-based surgical instrument manuals still used in the OR, we wish to provide surgical staff with an AR head-mounted display that provides in-situ visualization and guidance throughout the assembly process of surgical instruments. Our requirement analysis supports the development and provides guidelines for its transfer into surgical practice., Methods: A three-phase user-centered design approach was applied, with online interviews, an observational study, and a workshop with two focus groups comprising scrub nurses, circulating nurses, surgeons, manufacturers, clinic IT staff, and members of the sterilization department. The requirement analysis was based on key criteria for usability. The data were analyzed via structured content analysis., Results: We identified twelve main problems with the current use of paper manuals. Major issues included sterile users' inability to directly handle non-sterile manuals, missing details, and excessive text information, potentially delaying procedure performance. Major requirements for AR-driven guidance fall into the categories of design, practicability, control, and integration into the current workflow. Additionally, further recommendations for technical development could be obtained., Conclusion: In conclusion, our insights have outlined a comprehensive spectrum of requirements that are essential for the successful implementation of AI- and AR-driven guidance for assembling surgical instruments. The consistently appreciative evaluation by stakeholders underscores the profound potential of AR and AI technology as valuable assistance and guidance., (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
214. Computer Vision and Augmented Reality for Human-Centered Fatigue Crack Inspection.
- Author
-
Mojidra R, Li J, Mohammadkhorasani A, Moreu F, Bennett C, and Collins W
- Subjects
- Humans, Image Processing, Computer-Assisted methods, Algorithms, Augmented Reality
- Abstract
A significant percentage of bridges in the United States are serving beyond their 50-year design life, and many of them are in poor condition, making them vulnerable to fatigue cracks that can result in catastrophic failure. However, current fatigue crack inspection practice based on human vision is time-consuming, labor intensive, and prone to error. We present a novel human-centered bridge inspection methodology to enhance the efficiency and accuracy of fatigue crack detection by employing advanced technologies including computer vision and augmented reality (AR). In particular, a computer vision-based algorithm is developed to enable near-real-time fatigue crack detection by analyzing structural surface motion in a short video recorded by a moving camera of the AR headset. The approach monitors structural surfaces by tracking feature points and measuring variations in the distances between feature point pairs to recognize the motion pattern associated with the crack opening and closing. Measuring distance changes between feature points, rather than the displacement changes used before this improvement, eliminates the need for camera motion compensation and enables reliable and computationally efficient fatigue crack detection using the nonstationary AR headset. In addition, an AR environment is created and integrated with the computer vision algorithm. The crack detection results are transmitted to the AR headset worn by the bridge inspector, where they are converted into holograms and anchored on the bridge surface in the 3D real-world environment. The AR environment also provides virtual menus to support human-in-the-loop decision-making to determine optimal crack detection parameters. This human-centered approach with improved visualization and human-machine collaboration aids the inspector in making well-informed decisions in the field in a near-real-time fashion. The proposed crack detection method is comprehensively assessed using two laboratory test setups for both in-plane and out-of-plane fatigue cracks. Finally, using the integrated AR environment, a human-centered bridge inspection is conducted to demonstrate the efficacy and potential of the proposed methodology.
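The pairwise-distance idea described above can be sketched in a few lines: track feature points across frames, and flag point pairs whose mutual distance fluctuates (crack breathing) rather than staying rigid. The feature detector, threshold value, and all names below are hypothetical, since the abstract gives no implementation details; note that pairwise distances cancel pure camera translation, which is the motivation stated in the abstract.

```python
from itertools import combinations

import numpy as np

def crack_candidate_pairs(tracks, threshold=2.0):
    """Flag feature-point pairs whose mutual distance varies by more than `threshold`.

    tracks: array of shape (n_frames, n_points, 2) with tracked pixel positions.
    Returns a list of (i, j) index pairs suspected to straddle a breathing crack.
    """
    flagged = []
    n_points = tracks.shape[1]
    for i, j in combinations(range(n_points), 2):
        d = np.linalg.norm(tracks[:, i, :] - tracks[:, j, :], axis=1)
        if d.max() - d.min() > threshold:  # peak-to-peak distance variation
            flagged.append((i, j))
    return flagged

# Synthetic example: points 0 and 1 move rigidly; point 2 oscillates relative to
# them (crack opening/closing). A global drift simulates headset motion, which
# cancels out in the pairwise distances.
t = np.linspace(0, 2 * np.pi, 50)
base = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
tracks = np.repeat(base[None], 50, axis=0)
tracks[:, 2, 1] += 3.0 * np.sin(t)                                   # crack breathing
tracks += np.stack([5 * np.cos(t), 5 * np.sin(t)], axis=1)[:, None]  # camera drift
pairs = crack_candidate_pairs(tracks)  # pairs involving point 2 are flagged
```

A real implementation would track hundreds of points and vote over many pairs; this sketch only illustrates why the distance metric is immune to the simulated camera drift.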
- Published
- 2024
- Full Text
- View/download PDF
215. The evolution of augmented reality to augment physical therapy: A scoping review.
- Author
-
Hsu PY, Singer J, and Keysor JJ
- Abstract
Augmented reality (AR) is increasingly used in health care, yet little is known about how AR is being used in physical therapy practice and what clinical outcomes could occur with technology use. In this scoping review, a broad literature review was conducted to gain an understanding of current knowledge of AR use and outcomes in physical therapy practice. A structured literature search of articles published between 2000 and September 2023 that examined the use of AR in a physical therapy context was conducted. Reference lists of articles for full review were searched for additional studies. Data from articles meeting inclusion criteria were extracted and synthesized across studies. 549 articles were identified; 40 articles met criteria for full review. Gait and balance of neurological and older adult populations were most frequently targeted, with more recent studies including orthopedic and other populations. Approximately half were pilot or observational studies and half were experimental. Many studies found within-group improvements. Of studies reporting between-group differences, AR interventions were more effective in improving function almost half of the time, with 20%, 27%, and 28% of studies showing efficacy in disability, balance, and gait outcomes, respectively. AR in physical therapy holds promise; however, efficacy outcomes are unclear., Competing Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article., (© The Author(s) 2024.)
- Published
- 2024
- Full Text
- View/download PDF
216. A novel portable augmented reality surgical navigation system for maxillofacial surgery: technique and accuracy study.
- Author
-
Li B, Wei H, Yan J, and Wang X
- Abstract
Surgical navigation, despite its potential benefits, faces challenges in widespread adoption in clinical practice. Possible reasons include the high cost, increased surgery time, attention shifts during surgery, and the mental task of mapping from the monitor to the patient. To address these challenges, a portable, all-in-one surgical navigation system using augmented reality (AR) was developed, and its feasibility and accuracy were investigated. The system achieves AR visualization by capturing a live video stream of the actual surgical field using a visible light camera and merging it with preoperative virtual images. A skull model with reference spheres was used to evaluate the accuracy. After registration, virtual models were overlaid on the real skull model. The discrepancies between the centres of the real spheres and the virtual model were measured to assess the AR visualization accuracy. This AR surgical navigation system demonstrated precise AR visualization, with an overall overlap error of 0.53 ± 0.21 mm. By seamlessly integrating the preoperative virtual plan with the intraoperative field of view in a single view, this novel AR navigation system could provide a feasible solution for the use of AR visualization to guide the surgeon in performing the operation as planned., (Copyright © 2024. Published by Elsevier Inc.)
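The reported overlap error (0.53 ± 0.21 mm) suggests a simple per-sphere computation; the following is a minimal sketch assuming the Euclidean distances between real and virtual sphere centres are aggregated as mean ± SD. The function name and the coordinates are invented for illustration, not taken from the paper.

```python
import numpy as np

def overlay_error(real_centres, virtual_centres):
    """Mean and sample SD (mm) of per-sphere discrepancies between the measured
    reference-sphere centres and the overlaid virtual model's centres."""
    d = np.linalg.norm(np.asarray(real_centres, dtype=float)
                       - np.asarray(virtual_centres, dtype=float), axis=1)
    return d.mean(), d.std(ddof=1)

# Hypothetical measurements for four reference spheres on a skull model (mm).
real    = [[0.0, 0.0, 0.0], [30.0, 0.0, 0.0], [0.0, 30.0, 0.0], [30.0, 30.0, 0.0]]
virtual = [[0.4, 0.2, 0.1], [30.3, -0.2, 0.3], [0.5, 30.1, -0.2], [29.8, 30.4, 0.3]]
mean_err, sd_err = overlay_error(real, virtual)  # sub-millimetre overlap error
```

In practice the real centres would be digitized with a tracked probe or localized in the camera image, and more spheres spread across the surgical field would give a more representative error distribution.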
- Published
- 2024
- Full Text
- View/download PDF
217. Unveiling the impact of the SMARTCLAP project on habilitation.
- Author
-
Bonello M, Buhagiar N, Farrugia P, and Mercieca J
- Abstract
This report summarises the SMARTCLAP research project, which employs a user-centred design approach to develop a revolutionary smart product service system. The system offers personalised motivation to encourage children with cerebral palsy to participate more actively in their occupational therapy sessions, while providing paediatric occupational therapists with an optimal tool to monitor children's progress from one session to another. The product service system developed includes a smart wearable device called DigiClap, used to interact with a serious game in an Augmented Reality environment. The report highlights the research methodology used to advance the technology readiness level from 4 to 6, acknowledging the contribution of the consortium team and funding source. As part of the technology's maturity process, DigiClap and the respective serious game were evaluated with target users to identify the system's impact in supporting the children's overall participation and hand function, and to gather feedback from occupational therapists and caregivers on this novel technology. The outcomes of this study are discussed, highlighting limitations and lessons learned. The report also outlines future work and further funding for the sustainability of the project and to reach other individuals who have upper limb limitations. Ultimately, the potential of DigiClap and the overall achievements of this project are discussed., Competing Interests: This statement is to disclose that there are no conflicts of interest in connection with the submitted manuscript entitled "Innovation Report: Unveiling the Impact of the SMARTCLAP Project on Habilitation." All authors have contributed to the development of the manuscript and are aware of its submission to the Innovation Report section under CSBJ Smart Hospital.
The research was funded by the Malta Council for Science and Technology through the Fusion R&I Technology Development Programme 2020 (R&I-2019–003-T) and Fusion R&I Go To Market Accelerator Programme 2022 (R&I-2019–003-A)., (© 2024 The Authors.)
- Published
- 2024
- Full Text
- View/download PDF
218. Three-dimensional planning, navigation, patient-specific instrumentation and mixed reality in shoulder arthroplasty: a digital orthopedic renaissance.
- Author
-
Can Kolac U, Paksoy A, and Akgün D
- Abstract
Accurate component placement in shoulder arthroplasty is crucial for avoiding complications, achieving superior biomechanical performance and optimizing functional outcomes. Shoulder and elbow surgeons have explored various methods to improve surgical understanding and precise execution, including preoperative planning with 3D computed tomography (CT), patient-specific instrumentation (PSI), intraoperative navigation, and mixed reality (MR). 3D preoperative planning, facilitated by CT scans and advanced software, enhances surgical precision, influences decision-making for implant types and approaches, reduces errors in guide pin placement, and contributes to cost-effectiveness. Navigation demonstrates benefits in reducing malpositioning, optimizing baseplate stability, improving the humeral cut, and potentially conserving bone stock, although challenges such as varied operating times and costs warrant further investigation. The personalized patient care and enhanced operational efficiency associated with PSI are not only attractive for achieving desired component positions but also hold promise for improved outcomes in complex cases involving glenoid bone loss. Augmented reality (AR) and virtual reality (VR) technologies play a pivotal role in reshaping shoulder arthroplasty. They offer benefits in preoperative planning, intraoperative guidance, and interactive surgery. Studies demonstrate their effectiveness in AR-guided guidewire placement and in providing real-time surgical advice during reverse total shoulder arthroplasty (RTSA). Additionally, these technologies show promise in orthopedic training, delivering superior realism and accelerating learning compared to conventional methods.
- Published
- 2024
- Full Text
- View/download PDF
219. Single-Center Experience in Microsurgical Resection of Acoustic Neurinomas and the Benefit of Microscope-Based Augmented Reality.
- Author
-
Pojskić M, Bopp MHA, Saß B, and Nimsky C
- Subjects
- Humans, Female, Middle Aged, Male, Aged, Adult, Neurosurgical Procedures methods, Microscopy methods, Treatment Outcome, Imaging, Three-Dimensional methods, Microsurgery methods, Neuroma, Acoustic surgery, Augmented Reality
- Abstract
Background and Objectives: Microsurgical resection with intraoperative neuromonitoring is the gold standard for acoustic neurinomas (ANs) classified as T3 or T4 tumors according to the Hannover Classification. Microscope-based augmented reality (AR) can be beneficial in cerebellopontine angle and lateral skull base surgery, since these are small areas densely packed with anatomical structures. The technology automatically builds a 3D model, sparing the surgeon the mental task of transferring the 2D microscope view into an imagined 3D image, which reduces the possibility of error and provides better orientation in the operative field. Materials and Methods: All patients who underwent surgery for resection of ANs in our department were included in this study. Clinical outcomes in terms of postoperative neurological deficits and complications were evaluated, as well as neuroradiological outcomes for tumor remnants and recurrence. Results: A total of 43 consecutive patients (25 female, median age 60.5 ± 16 years) who underwent resection of ANs via retrosigmoid osteoclastic craniotomy with the use of intraoperative neuromonitoring (22 right-sided, 14 giant tumors, 10 cystic, 7 with hydrocephalus) by a single surgeon were included in this study, with a median follow-up of 41.2 ± 32.2 months. A total of 18 patients underwent subtotal resection, 1 patient partial resection and 24 patients gross total resection. A total of 27 patients underwent resection in the sitting position and the rest in the semi-sitting position. Of the 37 patients who had no facial nerve deficit prior to surgery, 19 were intact following surgery, 7 had House-Brackmann (HB) Grade II paresis, 3 HB III, 7 HB IV and 1 HB V. Wound healing deficit with cerebrospinal fluid (CSF) leak occurred in 8 patients (18.6%). Operative time was 317.3 ± 99 min.
One patient who had recurrence and one further patient with partial resection underwent radiotherapy following surgery. A total of 16 patients (37.2%) underwent resection using fiducial-based navigation and microscope-based AR, all in the sitting position. Segmented objects of interest in AR were the sigmoid and transverse sinus, tumor outline, cranial nerves (CN) VII, VIII and V, petrous vein, cochlea and semicircular canals and brain stem. Operative time and clinical outcome did not differ between the AR and non-AR groups. However, use of AR improved orientation in the operative field for craniotomy planning and microsurgical resection by identification of important neurovascular structures. Conclusions: The single-center experience of resection of ANs showed a high rate of gross total (GTR) and subtotal resection (STR) with low recurrence. Use of AR improves intraoperative orientation and facilitates craniotomy planning and AN resection through early improved identification of important anatomical relations to structures of the inner auditory canal, venous sinuses, petrous vein, brain stem and the course of cranial nerves.
- Published
- 2024
- Full Text
- View/download PDF
220. Retrospective study comparing the accuracies of handheld infrared stereo camera and augmented reality-based navigation systems for total hip arthroplasty.
- Author
-
Tanaka S, Takegami Y, Osawa Y, Okamoto M, and Imagama S
- Subjects
- Humans, Retrospective Studies, Female, Male, Aged, Middle Aged, Infrared Rays, Arthroplasty, Replacement, Hip instrumentation, Arthroplasty, Replacement, Hip methods, Augmented Reality, Surgical Navigation Systems, Surgery, Computer-Assisted methods, Surgery, Computer-Assisted instrumentation
- Abstract
Background: The use of portable navigation systems (PNS) in total hip arthroplasty (THA) has become increasingly prevalent, with second-generation PNS (sPNS) demonstrating superior accuracy in the lateral decubitus position compared to first-generation PNS. However, few studies have compared different types of sPNS. This study retrospectively compares the accuracy and clinical outcomes of two different types of sPNS instruments in patients undergoing THA., Methods: A total of 158 eligible patients who underwent THA at a single institution between 2019 and 2022 were enrolled in the study, including 89 who used an accelerometer-based PNS with handheld infrared stereo cameras in the Naviswiss group (group N) and 69 who used an augmented reality (AR)-based PNS in the AR-Hip group (group A). Accuracy error, navigation error, clinical outcomes, and preparation time were compared between the two groups., Results: Accuracy errors for inclination were comparable between group N (3.5° ± 3.0°) and group A (3.5° ± 3.1°) (p = 0.92). Accuracy errors for anteversion were comparable between group N (4.1° ± 3.1°) and group A (4.5° ± 4.0°) (p = 0.57). The navigation errors for inclination (group N: 2.9° ± 2.7°, group A: 3.0° ± 3.2°) and anteversion (group N: 4.3° ± 3.5°, group A: 4.3° ± 4.1°) were comparable between the groups (p = 0.86 and 0.94, respectively). The preparation time was shorter in group A than in group N (p = 0.036). There were no significant differences in operative time (p = 0.255), intraoperative blood loss (p = 0.387), or complications (p = 0.248) between the two groups., Conclusion: An accelerometer-based PNS using handheld infrared stereo cameras and an AR-based PNS provide similar accuracy during THA in the lateral decubitus position, with a mean error of 3°-4° for both inclination and anteversion, though the AR-based PNS required a shorter preparation time., (© 2024. The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.)
- Published
- 2024
- Full Text
- View/download PDF
221. Automated scoring and augmented reality visualization software program for evaluating tooth preparations.
- Author
-
Mai HN, Ngo HC, Cho SH, and Lee DH
- Subjects
- Humans, Imaging, Three-Dimensional methods, Tooth Preparation, Prosthodontic methods, Reproducibility of Results, Models, Dental, Tooth Preparation methods, Software, Augmented Reality
- Abstract
Statement of Problem: Tooth preparation is an essential part of prosthetic dentistry; however, traditional evaluation methods involve subjective visual inspection that is prone to examiner variability., Purpose: The purpose of this study was to investigate a newly developed automated scoring and augmented reality (ASAR) visualization software program for evaluating tooth preparations., Material and Methods: A total of 122 tooth models (61 anterior and 61 posterior teeth) prepared by dental students were evaluated using visual assessments conducted by students and an expert, and an automated assessment performed with the ASAR software program using a 3-dimensional (3D) point-cloud comparison method. The software program offered comprehensive functions, including generating detailed reports for individual test models, producing a simultaneous summary score report for all tested models, creating 3D color-coded deviation maps, and forming augmented reality quick-response (AR-QR) codes for online data storage with AR visualization. The reliability and efficiency of the evaluation methods were measured by comparing tooth preparation assessment scores and evaluation time. The data underwent statistical analysis using the Kruskal-Wallis test, followed by Mann-Whitney U tests for pairwise comparisons adjusted with the Benjamini-Hochberg method (α=.05)., Results: Significant differences were found across the evaluation methods and tooth types in terms of preparation scores and evaluation time (P<.001). A significant difference was observed between the auto- and student self-assessment methods (P<.001) in scoring both the anterior and posterior tooth preparations. However, no significant difference was found between the auto- and expert-assessment methods for the anterior (P=.085) or posterior (P=.14) tooth preparation scores.
Notably, the auto-assessment method required significantly less time than the expert- and self-assessment methods (P<.001) for both tooth types. Additionally, significant differences in evaluation time between the anterior and posterior teeth were observed in both self- and expert-assessment methods (P<.001), whereas the evaluation times for both tooth types with the auto-assessment method were statistically similar (P=.32)., Conclusions: ASAR-based evaluation is comparable with expert assessment while exhibiting significantly higher time efficiency. Moreover, AR-QR codes enhance learning and training experiences by facilitating online data storage and AR visualization., (Copyright © 2024 Editorial Council for The Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
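The 3D point-cloud comparison underlying the ASAR auto-assessment in entry 221 can be illustrated with a minimal sketch: for each point of the scanned preparation, take the distance to its nearest neighbour in a reference model, then score the fraction of points within tolerance. The function names, the tolerance value, and the scoring rule are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def deviation_map(test_pts, ref_pts):
    """Per-point deviation of a scanned preparation against a reference model.

    For each point in the test cloud, return the distance to its nearest
    neighbour in the reference cloud (brute force; fine for small clouds).
    """
    # (N, M) matrix of pairwise Euclidean distances
    d = np.linalg.norm(test_pts[:, None, :] - ref_pts[None, :, :], axis=-1)
    return d.min(axis=1)

def score(deviations, tolerance=0.5):
    """Fraction of surface points within tolerance (mm), scaled to 0-100."""
    return 100.0 * np.mean(deviations <= tolerance)

# Toy example: a flat reference surface vs. a test scan offset by 0.3 mm
ref = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
test = ref + np.array([0.0, 0.0, 0.3])   # uniform 0.3 mm deviation
dev = deviation_map(test, ref)
print(round(float(dev.mean()), 3))  # 0.3
print(score(dev, tolerance=0.5))    # 100.0
```

In practice a tool like this would also sign the deviations (over- vs. under-reduction) to drive a color-coded map; the unsigned distance above is the simplest variant.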
222. Augmented-reality-based surgical navigation for endoscope retrograde cholangiopancreatography: A phantom study.
- Author
-
Lin Z, Yang Z, Li R, Sun S, Yan B, Yang Y, Liu H, and Pan J
- Subjects
- Humans, Imaging, Three-Dimensional methods, Surgical Navigation Systems, Robotic Surgical Procedures methods, Robotic Surgical Procedures instrumentation, Reproducibility of Results, Phantoms, Imaging, Cholangiopancreatography, Endoscopic Retrograde methods, Augmented Reality, Surgery, Computer-Assisted methods, Surgery, Computer-Assisted instrumentation
- Abstract
Background: Endoscopic retrograde cholangiopancreatography is a standard surgical treatment for gallbladder and pancreatic diseases. However, the procedure carries high risk and requires sufficient surgical experience and skills., Methods: (1) A simultaneous localisation and mapping technique reconstructs the surgical environment. (2) The preoperative 3D model is transformed into the intraoperative video environment to implement multi-modal fusion. (3) A framework for virtual-to-real projection based on hand-eye alignment uses position data from electromagnetic sensors to project the 3D model onto the imaging plane of the camera., Results: Our AR-assisted navigation system can accurately guide physicians, with registration error restricted to under 5 mm and a projection error of 5.76 ± 2.13, while the intubation procedure runs at 30 frames per second., Conclusions: Coupled with clinical validation and user studies, both the quantitative and qualitative results indicate that our navigation system has the potential to be highly useful in clinical practice., (© 2024 John Wiley & Sons Ltd.)
- Published
- 2024
- Full Text
- View/download PDF
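The virtual-to-real projection step described in entry 222 (projecting the preoperative 3D model onto the camera's imaging plane from sensor-derived pose data) can be sketched with a standard pinhole camera model. The function name, pose, and intrinsic values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_points(pts_world, R, t, K):
    """Project 3D model points onto the camera image plane (pinhole model).

    R, t: camera pose (world -> camera), e.g. derived from electromagnetic
    sensor readings after hand-eye alignment. K: 3x3 intrinsic matrix.
    Returns (N, 2) pixel coordinates.
    """
    pts_cam = pts_world @ R.T + t      # world -> camera frame
    uv = pts_cam @ K.T                 # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide

# Toy example: identity pose, focal length 500 px, principal point (320, 240)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
pts = np.array([[0.0, 0.0, 2.0],
                [0.1, 0.0, 2.0]])
px = project_points(pts, np.eye(3), np.zeros(3), K)
# row 0 -> (320, 240); row 1 -> (345, 240)
print(px)
```

A real system would add lens-distortion terms and continuously update (R, t) from the tracked sensor pose each frame.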
223. A real-time augmented reality system integrated with artificial intelligence for skin tumor surgery: experimental study and case series.
- Author
-
Huang K, Liao J, He J, Lai S, Peng Y, Deng Q, Wang H, Liu Y, Peng L, Bai Z, Yu N, Li Y, Jiang Z, Su J, Li J, Tang Y, Chen M, Lu L, Chen X, Yao J, and Zhao S
- Subjects
- Humans, Animals, Rabbits, Female, Male, Mohs Surgery, Surgery, Computer-Assisted methods, Middle Aged, Adult, Aged, Manikins, Artificial Intelligence, Skin Neoplasms surgery, Skin Neoplasms pathology, Augmented Reality
- Abstract
Background: Skin tumors affect many people worldwide, and surgery is the first treatment choice. Achieving precise preoperative planning and navigation of intraoperative sampling remains a problem and is excessively reliant on the experience of surgeons, especially for Mohs surgery for malignant tumors., Materials and Methods: To achieve precise preoperative planning and navigation of intraoperative sampling, we developed a real-time augmented reality (AR) surgical system integrated with artificial intelligence (AI) to enhance three functions: AI-assisted tumor boundary segmentation, surgical margin design, and navigation in intraoperative tissue sampling. Non-randomized controlled trials were conducted on a manikin, tumor-simulated rabbits, and human volunteers in the Hunan Engineering Research Center of Skin Health and Disease Laboratory to evaluate the surgical system., Results: The results showed that the accuracy of benign and malignant tumor segmentation was 0.9556 and 0.9548, respectively, and the average AR navigation mapping error was 0.644 mm. The proposed surgical system was applied in 106 skin tumor surgeries, including intraoperative navigation of sampling in 16 Mohs surgery cases. Surgeons who have used this system rated it highly., Conclusions: The surgical system highlighted the potential to achieve accurate treatment of skin tumors and to fill the gap in global research on skin tumor surgery systems., (Copyright © 2024 The Author(s). Published by Wolters Kluwer Health, Inc.)
- Published
- 2024
- Full Text
- View/download PDF
224. Robust tracking of deformable anatomical structures with severe occlusions using deformable geometrical primitives.
- Author
-
Sayols N, Hernansanz A, Parra J, Eixarch E, Xambó-Descamps S, Gratacós E, and Casals A
- Subjects
- Humans, Neural Networks, Computer, Algorithms, Spinal Dysraphism surgery, Spinal Dysraphism diagnostic imaging, Image Processing, Computer-Assisted methods, Robotics, Augmented Reality, Robotic Surgical Procedures methods
- Abstract
Background and Objective: Surgical robotics tends to develop cognitive control architectures to provide a certain degree of autonomy in order to improve patient safety and surgery outcomes, while decreasing the cognitive load surgeons must dedicate to low-level decisions. Cognition needs workspace perception, which is an essential step towards automatic decision-making and task-planning capabilities. Robust and accurate detection and tracking in minimally invasive surgery suffer from limited visibility, occlusions, anatomy deformations and camera movements., Method: This paper develops a robust methodology to detect and track anatomical structures in real time to be used in automatic control of robotic systems and augmented reality. The work focuses on experimental validation in a highly challenging surgery: fetoscopic repair of Open Spina Bifida. The proposed method is based on two sequential steps: first, selection of relevant points (contour) using a Convolutional Neural Network and, second, reconstruction of the anatomical shape by means of deformable geometric primitives., Results: The methodology performance was validated with different scenarios. Synthetic scenario tests, designed for extreme validation conditions, demonstrate the safety margin offered by the methodology with respect to the nominal conditions during surgery. Real scenario experiments have demonstrated the validity of the method in terms of accuracy, robustness and computational efficiency., Conclusions: This paper presents robust anatomical structure detection in the presence of abrupt camera movements, severe occlusions and deformations. Even though the paper focuses on a case study, Open Spina Bifida, the methodology is applicable to all anatomies whose contours can be approximated by geometric primitives.
The methodology is designed to provide effective inputs to cognitive robotic control and augmented reality systems that require accurate tracking of sensitive anatomies., Competing Interests: Declaration of competing interest The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: Narcis Sayols reports financial support was provided by Cellex Foundation. Albert Hernansanz reports financial support was provided by Cellex Foundation. Johanna Parra reports financial support was provided by Cellex Foundation. Elisenda Eixarch reports financial support was provided by Cellex Foundation. Eduard Gratacos reports financial support was provided by Cellex Foundation. Alicia Casals reports financial support was provided by Cellex Foundation. Elisenda Eixarch reports financial support was provided by Departament de Salut de la Generalitat de Catalunya., (Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
225. Registration of preoperative temporal bone CT-scan to otoendoscopic video for augmented-reality based on convolutional neural networks.
- Author
-
Taleb A, Leclerc S, Hussein R, Lalande A, and Bozorg-Grayeli A
- Subjects
- Humans, Temporal Bone diagnostic imaging, Temporal Bone surgery, Augmented Reality, Otoscopy methods, Female, Video Recording, Male, Ear Diseases surgery, Ear Diseases diagnostic imaging, Otologic Surgical Procedures methods, Middle Aged, Algorithms, Surgery, Computer-Assisted methods, Adult, Tympanic Membrane diagnostic imaging, Tympanic Membrane surgery, Malleus diagnostic imaging, Malleus surgery, Endoscopy methods, Neural Networks, Computer, Tomography, X-Ray Computed methods
- Abstract
Purpose: Patient-to-image registration is a preliminary step required in surgical navigation based on preoperative images. Human intervention and fiducial markers hamper this task as they are time-consuming and introduce potential errors. We aimed to develop a fully automatic 2D registration system for augmented reality in ear surgery., Methods: CT-scans and corresponding oto-endoscopic videos were collected from 41 patients (58 ears) undergoing ear examination (vestibular schwannoma before surgery, profound hearing loss requiring cochlear implant, suspicion of perilymphatic fistula, contralateral ears in cases of unilateral chronic otitis media). Two to four images were selected from each case. For the training phase, data from patients (75% of the dataset) and 11 cadaveric specimens were used. Tympanic membranes and malleus handles were contoured on both video images and CT-scans by expert surgeons. The algorithm used a U-Net network for detecting the contours of the tympanic membrane and the malleus on both preoperative CT-scans and endoscopic video frames. Then, contours were processed and registered through an iterative closest point algorithm. Validation was performed on 4 cases and testing on 6 cases. Registration error was measured by overlaying both images and measuring the average and Hausdorff distances., Results: The proposed registration method yielded a precision compatible with ear surgery with a 2D mean overlay error of 0.65 ± 0.60 mm for the incus and 0.48 ± 0.32 mm for the round window. The average Hausdorff distance for these 2 targets was 0.98 ± 0.60 mm and 0.78 ± 0.34 mm respectively. An outlier case with higher errors (2.3 mm and 1.5 mm average Hausdorff distance for incus and round window respectively) was observed in relation to a high discrepancy between the projection angle of the reconstructed CT-scan and the video image. 
The maximum duration for the overall process was 18 s., Conclusions: A fully automatic 2D registration method based on a convolutional neural network and applied to ear surgery was developed. The method did not rely on any external fiducial markers nor human intervention for landmark recognition. The method was fast and its precision was compatible with ear surgery., (© 2024. The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.)
- Published
- 2024
- Full Text
- View/download PDF
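Entry 225 registers U-Net-derived contours through an iterative closest point (ICP) algorithm. A minimal 2D ICP sketch, assuming brute-force nearest-neighbour matching and a Kabsch alignment step (the paper's exact variant is not specified), looks like this:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    D = np.diag([1.0] * (H.shape[0] - 1) + [d])
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def icp(src, dst, iters=10):
    """Iterative closest point: alternate nearest-neighbour matching
    with a rigid (Kabsch) alignment step."""
    cur = src.copy()
    for _ in range(iters):
        dists = np.linalg.norm(cur[:, None] - dst[None, :], axis=-1)
        matched = dst[dists.argmin(axis=1)]       # closest point in dst
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur

# Toy example: a 5x5 grid of contour points, displaced by a small rigid
# motion (2 deg rotation + translation), is recovered by ICP.
dst = np.array([[x, y] for x in range(-2, 3) for y in range(-2, 3)], float)
th = np.deg2rad(2.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
src = (dst - [0.05, -0.03]) @ R_true   # src = R_true.T @ (dst - t)
aligned = icp(src, dst)
print(np.abs(aligned - dst).max() < 1e-6)  # True
```

ICP only converges to the correct alignment when the initial misalignment is small, which is why the paper first extracts matching contours (tympanic membrane, malleus) on both modalities before registering them.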
226. Comparison of subjective evaluations in virtual and real environments for soundscape research.
- Author
-
Yang M, Heimes A, Vorländer M, and Schulte-Fortkamp B
- Subjects
- Humans, Female, Male, Adult, Young Adult, Augmented Reality, Acoustic Stimulation, Sound, Reproducibility of Results, Virtual Reality, Auditory Perception
- Abstract
Emerging technologies of virtual reality (VR) and augmented reality (AR) are enhancing soundscape research, potentially producing new insights by enabling controlled conditions while preserving the context of a virtual gestalt within the soundscape concept. This study explored the ecological validity of virtual environments for subjective evaluations in soundscape research, focusing on the authenticity of virtual audio-visual environments for reproducibility. Different technologies for creating and reproducing virtual environments were compared, including field recording, simulated VR, AR, and audio-only presentation, in two audio-visual reproduction settings, a head-mounted display with head-tracked headphones and a VR lab with head-locked headphones. Via a series of soundwalk- and lab-based experiments, the results indicate that field recording technologies provided the most authentic audio-visual environments, followed by AR, simulated VR, and audio-only approaches. The authenticity level influenced subjective evaluations of virtual environments, e.g., arousal/eventfulness and pleasantness. The field recording and AR-based technologies closely matched the on-site soundwalk ratings in arousal, while the other approaches scored lower. All the approaches had significantly lower pleasantness ratings compared to on-site evaluations. The choice of audio-visual reproduction technology did not significantly impact the evaluations. Overall, the results suggest virtual environments with high authenticity can be useful for future soundscape research and design., (© 2024 Acoustical Society of America.)
- Published
- 2024
- Full Text
- View/download PDF
227. Augmented Reality Head-Mounted Device and Dynamic Navigation System for Postremoval in Maxillary Molars.
- Author
-
Martinho FC, Qadir SJ, Griffin IL, Melo MAS, and Fay GG
- Subjects
- Humans, Surgery, Computer-Assisted methods, Feasibility Studies, Molar, Maxilla surgery, Cone-Beam Computed Tomography, Augmented Reality
- Abstract
Introduction: This study evaluates the feasibility of an augmented reality (AR) head-mounted device (HMD) displaying a dynamic navigation system (DNS) in the surgical site for fiber post removal in maxillary molars and compares it to the DNS technique., Methods: Fifty maxillary first molars were divided into 2 groups: AR HMD + DNS (n = 25) and DNS (n = 25). The palatal canal was restored with a RelyX fiber post (3M ESPE) luted with RelyX Unicem (3M ESPE). A core buildup was performed using Paracore (Coltene/Whaledent). Cone beam computed tomography (CBCT) scans were taken before and after post removal. The drilling trajectory and depth were planned under X-guide software (X-Nav Technologies, Lansdale, PA). For the AR HMD + DNS group, the AR HMD (Microsoft HoloLens 2) displayed the DNS in the surgical site. The three-dimensional (3D) deviations (global coronal deviation [GCD] and global apical deviation [GAD]) and angular deflection (AD) were calculated. The number of mishaps and operating time were recorded., Results: The fiber post was removed from all samples (50/50). The AR HMD + DNS was more accurate than DNS, showing significantly lower GCD and GAD deviations and AD (P < .05). No mishap was detected. The AR HMD + DNS was as time-efficient as DNS (P > .05)., Conclusions: Within the limitations of this in vitro study, the AR HMD can safely display the DNS in the surgical site for fiber post removal in maxillary molars. AR HMD improved the DNS accuracy. Both AR HMD + DNS and DNS were time-efficient for fiber post removal in maxillary molars., (Copyright © 2024 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
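The deviation metrics reported in entry 227 (global coronal/apical deviation and angular deflection) reduce to simple vector geometry between the planned and achieved drill trajectories. A sketch with hypothetical function names; the exact landmark definitions are the study's, not reproduced here:

```python
import numpy as np

def angular_deflection(planned_dir, actual_dir):
    """Angle in degrees between the planned and actual drill axes."""
    u = planned_dir / np.linalg.norm(planned_dir)
    v = actual_dir / np.linalg.norm(actual_dir)
    # clip guards against tiny floating-point overshoot outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))

def global_deviation(planned_pt, actual_pt):
    """3D Euclidean distance (e.g. mm) between planned and actual landmark
    (coronal entry point or apical end point)."""
    return float(np.linalg.norm(np.asarray(actual_pt) - np.asarray(planned_pt)))

# Toy example
print(round(angular_deflection([0, 0, 1], [0, 1, 1]), 1))  # 45.0
print(global_deviation([0, 0, 0], [3, 4, 0]))              # 5.0
```

Comparing these scalars per sample between the AR HMD + DNS and DNS groups is then an ordinary two-group statistical test.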
228. A hybrid robotic system for zygomatic implant placement based on mixed reality navigation.
- Author
-
Fan X, Feng Y, Tao B, Shen Y, Wu Y, and Chen X
- Subjects
- Animals, Humans, Cone-Beam Computed Tomography, Prostheses and Implants, Imaging, Three-Dimensional, Augmented Reality, Surgery, Computer-Assisted methods, Robotic Surgical Procedures, Dental Implants
- Abstract
Background: Zygomatic implant (ZI) placement surgery is a viable surgical option for patients with severe maxillary atrophy and insufficient residual maxillary bone. Still, it is difficult and risky due to the long path of ZI placement and the narrow field of vision. Dynamic navigation is a superior solution, but it presents challenges such as requiring operators to have advanced skills and experience. Moreover, the precision and stability of manual implantation remain inadequate. These issues are anticipated to be addressed by robot-assisted surgery, realised here by introducing a mixed reality (MR) navigation-guided hybrid robotic system for ZI placement surgery., Methods: This study utilized a hybrid robotic system to perform the ZI placement surgery. Our first step was to reconstruct a virtual 3D model from preoperative cone-beam CT (CBCT) images. We proposed a series of algorithms based on coordinate transformation, which include image-phantom registration, HoloLens-tracker registration, drill-phantom calibration, and robot-implant calibration, to unify all objects within the same coordinate system. These algorithms enable real-time tracking of the surgical drill's position and orientation relative to the patient phantom. Subsequently, the surgical drill is directed to the entry position, and the planned implantation paths are superimposed on the patient phantom using HoloLens 2 for visualization. Finally, the hybrid robot system performs the drilling, expansion, and placement of ZIs under the guidance of the MR navigation system., Results: Phantom experiments of ZI placement were conducted using 10 patient phantoms, with a total of 40 ZIs inserted. Out of these, 20 were manually implanted, and the remaining 20 were robotically implanted.
Comparisons between the actual implanted ZI paths and the preoperatively planned ZI paths showed that our MR navigation-guided hybrid robotic system achieved a coronal deviation of 0.887 ± 0.213 mm, an apical deviation of 1.201 ± 0.318 mm, and an angular deviation of 3.468 ± 0.339°. This demonstrates significantly better accuracy and stability than manual implantation., Conclusion: Our proposed hybrid robotic system enables automated ZI placement surgery guided by MR navigation, achieving greater accuracy and stability compared to manual operations in phantom experiments. Furthermore, this system is expected to be applied in animal and cadaveric experiments in preparation for clinical studies., Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper., (Copyright © 2024. Published by Elsevier B.V.)
- Published
- 2024
- Full Text
- View/download PDF
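The registration chain in entry 228 (image-phantom, HoloLens-tracker, drill-phantom, robot-implant) amounts to composing rigid transforms so that every object is expressed in one common frame. A minimal sketch using 4x4 homogeneous matrices; the frame names and numeric values are hypothetical, chosen only to show the composition:

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def apply(T, p):
    """Apply a homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical frames: a point known in the drill frame is expressed in the
# phantom frame by chaining a registration and a calibration transform.
T_phantom_from_tracker = make_T(np.eye(3), [10.0, 0.0, 0.0])  # registration
T_tracker_from_drill   = make_T(np.eye(3), [0.0, 5.0, 0.0])   # calibration
T_phantom_from_drill = T_phantom_from_tracker @ T_tracker_from_drill
print(apply(T_phantom_from_drill, np.array([0.0, 0.0, 1.0])))  # [10.  5.  1.]
```

Naming each matrix `T_a_from_b` makes chains self-checking: adjacent frame names must match (`phantom_from_tracker @ tracker_from_drill`), which is a common convention for avoiding inverted transforms in navigation code.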
229. Face, content, and construct validity of a novel VR/AR surgical simulator of a minimally invasive spine operation.
- Author
-
Alkadri S, Del Maestro RF, and Driscoll M
- Subjects
- Humans, Spinal Fusion methods, Reproducibility of Results, Virtual Reality, Female, Male, Surveys and Questionnaires, Computer Simulation, Spine surgery, Adult, Augmented Reality, Minimally Invasive Surgical Procedures education
- Abstract
Mixed-reality surgical simulators are seen as more objective than conventional training, but their utility in training must be established through validation studies. The aim was to establish the face, content, and construct validity of a novel mixed-reality surgical simulator developed by McGill University, CAE-Healthcare, and DePuy Synthes. This study, approved by a Research Ethics Board, examined a simulated L4-L5 oblique lateral lumbar interbody fusion (OLLIF) scenario. A 5-point Likert scale questionnaire was used. A chi-square test verified validity consensus. Construct validity investigated 276 surgical performance metrics across three groups, using ANOVA, Welch-ANOVA, or Kruskal-Wallis tests. A post-hoc Dunn's test with a Bonferroni correction was used for further analysis of significant metrics. Musculoskeletal Biomechanics Research Lab, McGill University, Montreal, Canada. DePuy Synthes, Johnson & Johnson Family of Companies, research lab. Thirty-four participants were recruited: spine surgeons, fellows, neurosurgical, and orthopedic residents. Only seven surgeons out of the 34 participants were recruited in a side-by-side cadaver trial, where participants completed an OLLIF surgery first on a cadaver and then immediately on the simulator. Participants were separated a priori into three groups: post-, senior-, and junior-residents. Post-residents rated validity, median > 3, for 13/20 face-validity and 9/25 content-validity statements. Seven face-validity and 12 content-validity statements were rated neutral. The chi-square test indicated agreeability between group responses. Construct validity found eight metrics with significant differences (p < 0.05) between the three groups. Validity was established. Most face-validity statements were positively rated, with the few neutrally rated pertaining to the simulation's graphics. Although fewer content-validity statements were validated, most were rated neutral (only four were negatively rated).
The findings underscored the importance of using realistic physics-based forces in surgical simulations. Construct validity demonstrated the simulator's capacity to differentiate surgical expertise., (© 2024. International Federation for Medical and Biological Engineering.)
- Published
- 2024
- Full Text
- View/download PDF
230. Transforming Nursing Education: Developing Augmented Reality Procedural Training.
- Author
-
Lee D, Bathish MA, and Nelson J
- Subjects
- Humans, Female, Adult, Male, Feasibility Studies, Students, Nursing, Education, Nursing, Graduate methods, Augmented Reality
- Abstract
The shortage of nursing faculty and the scarcity of clinical placements have compelled researchers to investigate innovative solutions for procedural development to bridge the gap between didactic teaching and clinical experiences. This feasibility study uses augmented reality (AR) with Microsoft HoloLens2 and Dynamics 365 Guides to train graduate nursing students on advanced nursing procedures, focusing on lumbar puncture. A convenience sample of 24 nurse practitioner students participated in the study. The System Usability Scale, Acceptability Scale, and Engagement Scale were used to assess participants' experiences and perceptions. The results support the feasibility and acceptance of AR technology for procedural training. Participants found the HoloLens2 device easy to use and showed confidence in its functionality. The step-by-step instructions provided by Dynamics 365 Guides were understandable, useful, and satisfactory. The students reported high levels of engagement and found the AR experience to be helpful and motivating for learning. Faculty time was significantly reduced using the HoloLens2 for procedural training compared to traditional methods. This study demonstrates the potential of AR as an effective and efficient modality for nursing education. The findings support the integration of AR technology to enhance procedural development, address the challenges of limited clinical sites, and provide students with an immersive and self-paced learning experience. Additional studies will need to explore the impact of AR on clinical competency, patient outcomes, and cost-effectiveness. Overall, the use of AR technology may be useful and effective for nursing pedagogy.
- Published
- 2024
- Full Text
- View/download PDF
231. Feasibility of a novel augmented reality overlay for cervical screw placement in phantom spine models.
- Author
-
Olexa J, Shear B, Han N, Sharma A, Trang A, Kim K, Schwartzbauer G, Ludwig S, and Sansur C
- Abstract
Study Design: Feasibility study., Purpose: A phantom model was used to evaluate the accuracy of a novel augmented reality (AR) system for cervical screw placement., Overview of Literature: The use of navigation systems is becoming increasingly common in spine procedures. However, numerous factors limit the feasibility of regular and widespread use of navigation tools during spine surgery. AR is a new technology that has already demonstrated utility as a navigation tool during spine surgery. However, advancements in AR technology are needed to increase its adoption by the medical community., Methods: AR technology that uses a fiducial-less registration system was tested in a preclinical cervical spine phantom model study for accuracy during spinal screw placement. A three-dimensional reconstruction of the spine along with trajectory lines was superimposed onto the phantom model using an AR headset. Participants used the AR system to guide screw placement, and post-instrumentation scans were compared for accuracy assessment., Results: Twelve cervical screws were placed under AR guidance. All screws were placed in an acceptable anatomic position. The average distance error for the insertion point was 2.73±0.55 mm, whereas that for the endpoint was 2.71±0.69 mm. The average trajectory angle error for all insertions was 2.69°±0.59°., Conclusions: This feasibility study describes a novel registration approach that superimposes spinal anatomy and trajectories onto the surgeon's real-world view of the spine. These results demonstrate reasonable accuracy in the preclinical model. The results of this study demonstrate that this technology can assist with accurate screw placement. Further investigation using cadaveric and clinical models is warranted.
- Published
- 2024
- Full Text
- View/download PDF
232. A Systematic Review of the Application of Computational Technology in Microtia.
- Author
-
Zhou J, Cui R, and Lin L
- Subjects
- Humans, Artificial Intelligence, Data Mining, Augmented Reality, Tomography, X-Ray Computed, Virtual Reality, Surgery, Computer-Assisted methods, Congenital Microtia surgery, Computer-Aided Design
- Abstract
Microtia is a congenital morphological anomaly of one or both ears that results from a confluence of genetic and external environmental factors. To date, extensive research has explored the potential utilization of computational methodologies in microtia and has obtained promising results. The authors therefore reviewed the achievements and shortcomings of this research from the aspects of artificial intelligence, computer-aided design and surgery, computed tomography, medical and biological data mining, and reality-related technology, including virtual reality and augmented reality, hoping to offer novel concepts and inspire further studies in this field., Competing Interests: The authors report no conflicts of interest., (Copyright © 2024 by Mutaz B. Habal, MD.)
- Published
- 2024
- Full Text
- View/download PDF
233. Use of Virtual Reality and 3D Models in Contemporary Practice of Cardiology.
- Author
-
Minga I, Al-Ani MA, Moharem-Elgamal S, Md AVH, Md ASA, Masoomi M, and Mangi S
- Subjects
- Humans, Augmented Reality, Virtual Reality, Imaging, Three-Dimensional, Cardiology trends
- Abstract
Purpose of Review: To provide an overview of the impact of virtual and augmented reality in contemporary cardiovascular medical practice., Recent Findings: The utilization of virtual and augmented reality has emerged as an innovative technique in various cardiovascular subspecialties, including interventional adult, pediatric, and adult congenital cardiology, as well as structural heart disease and heart failure. In electrophysiology in particular, these technologies have proven valuable for both diagnostic and therapeutic procedures. The incorporation of 3D reconstruction modeling has significantly enhanced our understanding of patient anatomy and morphology, thereby improving diagnostic accuracy and patient outcomes. The interactive modeling of cardiac structure and function within the virtual realm plays a pivotal role in comprehending complex congenital, structural, and coronary pathology. This, in turn, contributes to safer interventions and surgical procedures. Noteworthy applications include septal defect device closure, transcatheter valvular interventions, and left atrial occlusion device implantation. The implementation of virtual reality has been shown to yield cost savings in healthcare, reduce procedure time, minimize radiation exposure, lower intravenous contrast usage, and decrease the extent of anesthesia required. These benefits collectively result in a more efficient and effective approach to patient care., (© 2024. The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.)
- Published
- 2024
- Full Text
- View/download PDF
234. From Augmented to Virtual Reality in Plastic Surgery: Blazing the Trail to a New Frontier.
- Author
-
Sullivan J, Skladman R, Varagur K, Tenenbaum E, Sacks JL, Martin C, Gordon T, Murphy J, Moritz WR, and Sacks JM
- Subjects
- Humans, Surgery, Plastic education, Microsurgery, Virtual Reality, Augmented Reality, Plastic Surgery Procedures methods
- Abstract
Background: Augmented reality (AR) and virtual reality (VR), together termed mixed reality, have shown promise in the care of operative patients. Currently, AR and VR have well-known applications in craniofacial surgery, specifically in preoperative planning. However, the application of AR/VR technology to other reconstructive challenges has not been widely adopted. Thus, the purpose of this investigation is to outline the current applications of AR and VR in the operative setting., Methods: The literature pertaining to the use of AR/VR technology in the operative setting was examined. Emphasis was placed on the use of mixed reality technology in surgical subspecialties, including plastic surgery, oral and maxillofacial surgery, colorectal surgery, neurosurgery, otolaryngology, and orthopaedic surgery., Results: Presently, mixed reality is widely used in the care of patients requiring complex reconstruction of the craniomaxillofacial skeleton for pre- and intraoperative planning. For upper extremity amputees, there is evidence that VR may be efficacious in the treatment of phantom limb pain. Furthermore, VR has untapped potential as a cost-effective tool for microsurgical education and for training residents in techniques of surgical and nonsurgical aesthetic treatment. There is utility for mixed reality in breast reconstruction for preoperative planning, mapping perforators, and decreasing operative time. VR has well-documented applications in the planning of deep inferior epigastric perforator flaps by creating three-dimensional immersive simulations based on a patient's preoperative computed tomography angiogram., Conclusion: The benefits of AR and VR are numerous for both patients and surgeons. VR has been shown to increase surgical precision and decrease operative time. Furthermore, it is effective for patient-specific rehearsal, which uses the patient's exact anatomical data to rehearse the procedure before performing it on the actual patient. 
Taken together, AR/VR technology can improve patient outcomes, decrease operative times, and lower the burden of care on both patients and health care institutions., Competing Interests: Disclosure The authors report no conflicts of interest in this work., (Thieme. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
235. Virtual and Augmented Reality in Management of Phantom Limb Pain: A Systematic Review.
- Author
-
Eldaly AS, Avila FR, Torres-Guzman RA, Maita KC, Garcia JP, Serrano LP, Emam OS, and Forte AJ
- Subjects
- Humans, Virtual Reality, Pain Management methods, Amputation, Surgical, Virtual Reality Exposure Therapy methods, Phantom Limb therapy, Augmented Reality, Pain Measurement
- Abstract
Upper and lower limb amputations are frequently associated with phantom limb pain (PLP). Recently, virtual reality (VR) and augmented reality (AR) have been reported as potential therapies for PLP. We conducted a systematic review of the literature to evaluate the efficacy of VR and AR in managing PLP. Four databases were searched: PubMed, EMBASE, Cumulative Index to Nursing and Allied Health Literature, and Web of Science. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines to organize the review. The initial search returned 164 results. After title, abstract, and full-text screening, 9 studies were included. One study was of good quality and 8 studies were of fair to poor quality. Seven studies utilized VR and 2 studies utilized AR. The number of treatment sessions ranged from 1 to 28 and the duration ranged from 10 minutes to 2 hours. Several pain scales were used to evaluate PLP pre- and postintervention, including the Numeric Rating Scale, Pain Rating Index, McGill Pain Questionnaire, and Visual Analog Scale. All the studies reported improvement of PLP on one or more pain scales after one or more sessions of VR or AR. Despite the promising results reported in the literature, we cannot recommend using VR or AR for PLP. Most of the studies are of poor design and have limited sample sizes with high levels of bias; therefore, no substantial evidence can be derived from them. However, we do believe further research with high-quality randomized controlled trials should take place to increase knowledge of the potential advantages., Competing Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
- Published
- 2024
- Full Text
- View/download PDF
236. Dissecting human anatomy learning process through anatomical education with augmented reality: AEducAR 2.0, an updated interdisciplinary study.
- Author
-
Neri I, Cercenelli L, Marcuccio M, Lodi S, Koufi FD, Fazio A, Marvi MV, Marcelli E, Billi AM, Ruggeri A, Tarsitano A, Manzoli L, Badiali G, and Ratti S
- Subjects
- Female, Humans, Male, Young Adult, Computer-Assisted Instruction methods, Curriculum, Interdisciplinary Studies, Surveys and Questionnaires statistics & numerical data, Anatomy education, Augmented Reality, Education, Medical, Undergraduate methods, Educational Measurement statistics & numerical data, Learning, Students, Medical psychology, Students, Medical statistics & numerical data
- Abstract
Anatomical education is pivotal for medical students, and innovative technologies like augmented reality (AR) are transforming the field. This study aimed to enhance the interactive features of the AEducAR prototype, an AR tool developed by the University of Bologna, and explore its impact on the human anatomy learning process in 130 second-year medical students at the International School of Medicine and Surgery of the University of Bologna. An interdisciplinary team of anatomists, maxillofacial surgeons, biomedical engineers, and educational scientists collaborated to ensure a comprehensive understanding of the study's objectives. Students used the updated version of AEducAR, named AEducAR 2.0, to study three anatomical topics, specifically the orbit zone, facial bones, and mimic muscles. AEducAR 2.0 offered two learning activities: one explorative and one interactive. Following each activity, students took a test to assess learning outcomes. Students also completed an anonymous questionnaire to provide background information and offer their perceptions of the activity. Additionally, 10 students participated in interviews for further insights. The results demonstrated that AEducAR 2.0 effectively facilitated learning and student engagement. Students achieved high scores in both quizzes and reported appreciating the interactive features that were implemented. Moreover, the interviews shed light on the interesting topic of blended learning. In particular, the present study suggests that incorporating AR into medical education alongside traditional methods might prove advantageous for students' academic and future professional endeavors. In this light, this study contributes to the growing research emphasizing the potential role of AR in shaping the future of medical education., (© 2024 The Authors. Anatomical Sciences Education published by Wiley Periodicals LLC on behalf of American Association for Anatomy.)
- Published
- 2024
- Full Text
- View/download PDF
237. Simulator-Based Versus Traditional Training of Fundus Biomicroscopy for Medical Students: A Prospective Randomized Trial.
- Author
-
Deuchler S, Dail YA, Berger T, Sneyers A, Koch F, Buedel C, Ackermann H, Flockerzi E, and Seitz B
- Abstract
Introduction: Simulation training is an important component of medical education. In previous studies, diagnostic simulation training for direct and indirect funduscopy was proven to be an effective training method. In this prospective controlled trial, we investigated the effect of simulator-based fundus biomicroscopy training., Methods: After completing a 1-week ophthalmology clerkship, medical students at Saarland University Medical Center (n = 30) were block-randomized into two groups: the traditional group received supervised training examining the fundus of classmates using a slit lamp; the simulator group was trained using the Slit Lamp Simulator. All participants had to pass an Objective Structured Clinical Examination (OSCE); two masked ophthalmological faculty trainers graded the students' skills when examining a patient's fundus using a slit lamp. A subjective assessment form and post-assessment surveys were obtained. Data were described using median (interquartile range [IQR])., Results: Twenty-five students (n = 14 in the simulator group, n = 11 in the traditional group) were eligible for statistical analysis. Interrater reliability was verified as significant for the overall score as well as for all subtasks (p ≤ 0.002) except subtask 1 (p = 0.12). The overall performance of medical students in the fundus biomicroscopy OSCE was ranked significantly higher in the simulator group (27.0 [5.25]/28.0 [3.0] vs. 20.0 [7.5]/16.0 [10.0]) by both observers, with significant interrater reliability (p < 0.001) and significance levels of p = 0.003 for observer 1 and p < 0.001 for observer 2. For all subtasks, the scores given to students trained using the simulator were consistently higher than those given to students trained traditionally. The students' post-assessment forms confirmed these results. 
Students could learn the practical backgrounds of fundus biomicroscopy (p = 0.04), the identification (p < 0.001), and localization (p < 0.001) of pathologies significantly better with the simulator., Conclusions: Traditional supervised methods are well complemented by simulation training. Our data indicate that the simulator helps with first patient contacts and enhances students' capacity to examine the fundus biomicroscopically., (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
238. Augmented 360° Three-Dimensional Virtual Reality for Enhanced Student Training and Education in Neurosurgery.
- Author
-
Truckenmueller P, Krantchev K, Rubarth K, Früh A, Mertens R, Bruening D, Stein C, Vajkoczy P, Picht T, and Acker G
- Subjects
- Humans, Female, Prospective Studies, Male, Neurosurgical Procedures education, Neurosurgical Procedures methods, Augmented Reality, Adult, Young Adult, Imaging, Three-Dimensional methods, Video Recording, Virtual Reality, Neurosurgery education, Students, Medical
- Abstract
Objective: This prospective study assesses the acceptance and usefulness of augmented 360° virtual reality (VR) videos for early student education and preparation in the field of neurosurgery., Methods: Thirty-five third-year medical students participated. Augmented 360° VR videos depicting three neurosurgical procedures (lumbar discectomy, brain metastasis resection, clipping of an aneurysm) were presented during elective seminars. Multiple questionnaires were employed to evaluate conceptual and technical aspects of the videos. The analysis utilized ordinal logistic regression to identify crucial factors contributing to the learning experience of the videos., Results: The videos were consistently rated as good to very good in quality, providing detailed demonstrations of intraoperative anatomy and surgical workflow. Students found the videos highly useful for their learning and preparation for surgical placements, and they strongly supported the establishment of a VR lounge for additional self-directed learning. Notably, 81% reported an increased interest in neurosurgery, and 47% acknowledged the potential influence of the videos on their future choice of specialization. Factors associated with a positive impact on students' interest and learning experience included high technical quality and comprehensive explanations of the surgical steps., Conclusions: This study demonstrated the high acceptance of augmented 360° VR videos as a valuable tool for early student education in neurosurgery. While hands-on training remains indispensable, these videos promote conceptual knowledge, ignite interest in neurosurgery, and provide a much-needed orientation within the operating room. The incorporation of detailed explanations throughout the surgeries, with augmentation using superimposed elements, offers distinct advantages over simply observing live surgeries., (Copyright © 2024 The Author(s). Published by Elsevier Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
239. Novel Use of Virtual Reality and Augmented Reality in Temporomandibular Total Joint Replacement Using Stock Prosthesis.
- Author
-
Niloy I, Liu RH, Pham NM, and Yim CMR
- Subjects
- Humans, Middle Aged, Imaging, Three-Dimensional, Mandibular Condyle surgery, Mandibular Condyle diagnostic imaging, Prosthesis Design, Surgery, Computer-Assisted methods, Temporomandibular Joint surgery, Temporomandibular Joint diagnostic imaging, Arthroplasty, Replacement methods, Arthroplasty, Replacement instrumentation, Augmented Reality, Joint Prosthesis, Temporomandibular Joint Disorders surgery, Virtual Reality
- Abstract
This technical innovation demonstrates the use of ImmersiveTouch Virtual Reality (VR) and Augmented Reality (AR)-guided total temporomandibular joint replacement (TJR) using a Biomet stock prosthesis in 2 patients with condylar degeneration. TJR VR planning includes condylar resection, prosthesis selection and positioning, and interference identification. AR provides real-time guidance for osteotomies, placement of prostheses and fixation screws, occlusion verification, and flexibility to modify the surgical course. Radiographic analysis demonstrated high correspondence between the preoperative plan and postoperative result. The average differences in the positioning of the condylar and fossa prostheses were 1.252 ± 0.269 mm and 1.393 ± 0.335 mm, respectively. The main challenges include a steep learning curve, intraoperative technical difficulties, added surgical time, and additional costs. In conclusion, this case report demonstrates the advantages of implementing AR and VR technology in TJRs using stock prostheses as a pilot study. Further clinical trials are needed before this innovation becomes mainstream practice., (Published by Elsevier Inc.)
- Published
- 2024
- Full Text
- View/download PDF
240. The Utility and Feasibility of Smart Glasses in Spine Surgery: Minimizing Radiation Exposure During Percutaneous Pedicle Screw Insertion.
- Author
-
Hiranaka Y, Takeoka Y, Yurube T, Tsujimoto T, Kanda Y, Miyazaki K, Ohnishi H, Matsuo T, Ryu M, Kumagai N, Kuroshima K, Kuroda R, and Kakutani K
- Abstract
Objective: Spine surgeons are often at risk of radiation exposure due to intraoperative fluoroscopy, leading to health concerns such as carcinogenesis. This risk has grown with the increasing use of percutaneous pedicle screws (PPS) in spinal surgeries, resulting from the widespread adoption of minimally invasive spine stabilization. This study aimed to elucidate the effectiveness of smart glasses (SG) in PPS insertion under fluoroscopy., Methods: SG were used as an alternative screen for fluoroscopic images. Operators A (2 years of experience in spine surgery) and B (9 years of experience) inserted PPS into the bilateral L1-5 pedicles of a lumbar model bone under fluoroscopic guidance, repeating this procedure twice with and without SG (groups SG and N-SG, respectively). Insertion time, radiation dose, and radiation exposure time were measured for each vertebral body, and the deviation of screw trajectories was evaluated., Results: Groups SG and N-SG showed no significant difference in insertion time for the overall procedure or for each operator. However, group SG had a significantly shorter radiation exposure time than group N-SG for the overall procedure (109.1 ± 43.5 seconds vs. 150.9 ± 38.7 seconds; p = 0.003) and for operator A (100.0 ± 29.0 seconds vs. 157.9 ± 42.8 seconds; p = 0.003). The radiation dose was also significantly lower in group SG than in group N-SG for the overall procedure (1.3 ± 0.6 mGy vs. 1.7 ± 0.5 mGy; p = 0.023) and for operator A (1.2 ± 0.4 mGy vs. 1.8 ± 0.5 mGy; p = 0.013). The 2 groups showed no significant difference in screw deviation., Conclusion: The application of SG to fluoroscopic imaging for PPS insertion holds potential as a useful method for reducing radiation exposure.
- Published
- 2024
- Full Text
- View/download PDF
241. Effects of Multimodal Exercise With Augmented Reality on Cognition in Community-Dwelling Older Adults.
- Author
-
Ferreira S, Raimundo A, Pozo-Cruz JD, Bernardino A, Leite N, Yoshida HM, and Marmeleira J
- Subjects
- Humans, Male, Aged, Female, Cognition physiology, Augmented Reality, Aged, 80 and over, Executive Function, Independent Living, Exercise Therapy methods
- Abstract
Objectives: This study aimed to investigate the effects of two interventions, multimodal exercise with augmented reality and multimodal exercise alone, on cognitive function in community-dwelling older adults., Design: Quasi-experimental research study., Setting and Participants: In this controlled study, 78 participants were divided into 2 experimental groups (with sessions 3 times a week for 12 weeks) and a control group (CG)., Methods: EG1 participated in a multimodal exercise-only intervention program, EG2 participated in a multimodal exercise program with augmented reality exergames, and the CG continued its usual activities. Participants were assessed at baseline and postintervention after 12 weeks., Results: Comparison between baseline and postintervention at 12 weeks showed significant improvements in executive functions, verbal fluency, choice reaction time, and dual-task performance in EG1, and improvements in general cognition, executive functions, verbal fluency, discrimination reaction time, and depression in EG2 (P ≤ .05). The clinical effect sizes of the interventions were large for overall cognition, executive functions, and single- and dual-task reaction time in EG1, and for overall cognition, executive functions, and verbal fluency in EG2., Conclusion and Implications: Both intervention programs produced significant improvements in several cognitive domains. The multimodal exercise-only program improved more variables than the multimodal exercise with augmented reality, but the augmented reality group showed greater changes between baseline and postintervention., Competing Interests: Disclosure The authors declare no conflict of interest., (Copyright © 2024. Published by Elsevier Inc.)
- Published
- 2024
- Full Text
- View/download PDF
242. Augmented and Virtual Reality Applications in Facial Plastic Surgery: A Scoping Review.
- Author
-
Chou DW, Annadata V, Willson G, Gray M, and Rosenberg J
- Subjects
- Humans, Surgery, Plastic education, Surgery, Plastic methods, Virtual Reality, Augmented Reality, Plastic Surgery Procedures methods, Face surgery
- Abstract
Objectives: Augmented reality (AR) and virtual reality (VR) are emerging technologies with wide potential applications in health care. We performed a scoping review of the current literature on the application of augmented and VR in the field of facial plastic and reconstructive surgery (FPRS)., Data Sources: PubMed and Web of Science., Review Methods: According to PRISMA guidelines, PubMed and Web of Science were used to perform a scoping review of literature regarding the utilization of AR and/or VR relevant to FPRS., Results: Fifty-eight articles spanning 1997-2023 met the criteria for review. Five overarching categories of AR and/or VR applications were identified across the articles: preoperative, intraoperative, training/education, feasibility, and technical. The following clinical areas were identified: burn, craniomaxillofacial surgery (CMF), face transplant, face lift, facial analysis, facial palsy, free flaps, head and neck surgery, injectables, locoregional flaps, mandible reconstruction, mandibuloplasty, microtia, skin cancer, oculoplastic surgery, rhinology, rhinoplasty, and trauma., Conclusion: AR and VR have broad applications in FPRS. AR for surgical navigation may have the most emerging potential in CMF surgery and free flap harvest. VR is useful as distraction analgesia for patients and as an immersive training tool for surgeons. More data on these technologies' direct impact on objective clinical outcomes are still needed., Level of Evidence: N/A Laryngoscope, 134:2568-2577, 2024., (© 2023 The American Laryngological, Rhinological and Otological Society, Inc.)
- Published
- 2024
- Full Text
- View/download PDF
243. Technologies Used for Telementoring in Open Surgery: A Scoping Review.
- Author
-
Hamza H, Al-Ansari A, and Navkar NV
- Subjects
- Humans, Surgical Procedures, Operative education, Surgical Procedures, Operative methods, Mentors, Telemedicine, Mentoring methods
- Abstract
Background: Telementoring technologies enable a remote mentor to guide a mentee in real-time during surgical procedures. This addresses challenges such as lack of expertise and limited surgical training/education opportunities in remote locations. This review aims to provide a comprehensive account of these technologies tailored for open surgery. Methods: A comprehensive scoping review of the scientific literature was conducted using PubMed, ScienceDirect, ACM Digital Library, and IEEE Xplore databases. Broad and inclusive searches were performed to identify articles reporting telementoring or teleguidance technologies in open surgery. Results: Screening of the search results yielded 43 articles describing surgical telementoring for the open approach. The studies were categorized based on the type of open surgery (surgical specialty, surgical procedure, and stage of clinical trial), the telementoring technology used (information transferred between mentor and mentee, devices used for rendering the information), and assessment of the technology (experience level of mentor and mentee, study design, and assessment criteria). The majority of the telementoring technologies focused on trauma-related surgeries, and mixed reality headsets were commonly used for rendering information (telestrations, surgical tools, or hand gestures) to the mentee. These technologies were primarily assessed on high-fidelity synthetic phantoms. Conclusions: Despite longer operative times, these telementoring technologies demonstrated clinical viability during open surgeries through improved performance and confidence of the mentee. In general, the usage of immersive devices and annotations appears promising, although further clinical trials will be required to thoroughly assess their benefits.
- Published
- 2024
- Full Text
- View/download PDF
244. Melanoma prevention using an augmented reality-based serious game.
- Author
-
Ribeiro N, Tavares P, Ferreira C, and Coelho A
- Subjects
- Humans, Self-Examination, Surveys and Questionnaires, Melanoma prevention & control, Augmented Reality, Skin Neoplasms prevention & control
- Abstract
Objectives: The purpose of this study was to field-test a recently developed AR-based serious game, called Spot, designed to promote skin self-examination (SSE) self-efficacy., Methods: Thirty participants played the game and answered 3 questionnaires: a baseline questionnaire, a second questionnaire immediately after playing the game, and a third questionnaire 1 week later (follow-up)., Results: The majority of participants considered the objective quality of the game to be high and felt that the game could have a real impact on SSE promotion. Participants showed statistically significant increases in SSE self-efficacy and intention at follow-up. Of the 24 participants who had never performed an SSE or had done one more than 3 months earlier, 12 (50.0%) reported doing an SSE at follow-up., Conclusions: This study provides supporting evidence for the use of serious games in combination with AR to educate and motivate users to perform SSE. Spot appears to be an inconspicuous but effective strategy to promote SSE, a cancer prevention behavior, among healthy individuals., Practice Implications: Patient education is essential to tackle skin cancer, particularly melanoma. Serious games such as Spot can effectively educate and motivate patients to perform a cancer prevention behavior., Competing Interests: Declaration of Competing Interest The authors declare the following financial interests/personal relationships which may be considered as potential competing interests: Nuno Ribeiro reports financial support was provided by FCT (Portuguese Foundation for Science and Technology)., (Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
245. Tablet-based Augmented reality and 3D printed templates in fully guided Microtia Reconstruction: a clinical workflow.
- Author
-
Díez-Montiel A, Pose-Díez-de-la-Lastra A, González-Álvarez A, Salmerón JI, Pascau J, and Ochandiano S
- Abstract
Background: Microtia is a congenital malformation of the auricle that affects approximately 4 of every 10,000 live newborns. Traditionally, radiographic film paper is used to trace the structures of the contralateral healthy ear in two dimensions, in a quasi-artistic manner, with anatomical points providing linear and angular measurements. However, this technique is time-consuming, highly subjective, and greatly dependent on surgeon expertise; hence, it is susceptible to shape errors and misplacement. Methods: We present an innovative clinical workflow that combines 3D printing and augmented reality (AR) to increase the objectivity and reproducibility of these procedures. Specifically, we introduce patient-specific 3D cutting templates and remodeling molds to carve and construct the cartilaginous framework that will form the new ear. Moreover, we developed an in-house AR application compatible with any commercial Android tablet. It precisely guides the positioning of the new ear during surgery, ensuring symmetrical alignment with the healthy one and avoiding time-consuming intraoperative linear or angular measurements. Our solution was evaluated in one case, first with controlled experiments in a simulation scenario and finally during surgery. Results: Overall, the ears placed in the simulation scenario had a mean absolute deviation of 2.2 ± 1.7 mm with respect to the reference plan. During the surgical intervention, the reconstructed ear was 3.1 mm longer and 1.3 mm wider than the ideal plan and had a positioning error of 2.7 ± 2.4 mm relative to the contralateral side. Note that in this case, additional morphometric variations were induced by inflammation and other issues intended to be addressed in a subsequent stage of surgery, which are independent of our proposed solution. Conclusions: In this work we propose an innovative workflow that combines 3D printing and AR to improve ear reconstruction and positioning in microtia correction procedures.
Our implementation in the surgical workflow showed good accuracy, empowering surgeons to attain consistent and objective outcomes. (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
246. Smart goggles augmented reality CT-US fusion compared to conventional fusion navigation for percutaneous needle insertion.
- Author
-
Borde T, Saccenti L, Li M, Varble NA, Hazen LA, Kassin MT, Ukeh IN, Horton KM, Delgado JF, Martin C 3rd, Xu S, Pritchard WF, Karanian JW, and Wood BJ
- Abstract
Purpose: Targeting accuracy determines outcomes for percutaneous needle interventions. Augmented reality (AR) in interventional radiology (IR) may improve procedural guidance and facilitate access to complex locations. This study aimed to evaluate percutaneous needle placement accuracy using a goggle-based AR system compared to an ultrasound (US)-based fusion navigation system. Methods: Six interventional radiologists performed 24 independent needle placements in an anthropomorphic phantom (CIRS 057A) in four needle guidance cohorts (n = 6 each): (1) US-based fusion, (2) goggle-based AR with stereoscopically projected anatomy (AR-overlay), (3) goggle AR without the projection (AR-plain), and (4) CT-guided freehand. US-based fusion included US/CT registration with electromagnetic (EM) needle, transducer, and patient tracking. For AR-overlay, the US image, EM-tracked needle, and stereoscopic anatomical structures and targets were superimposed over the phantom. Needle placement accuracy (distance from needle tip to target center), placement time (from skin puncture to final position), and procedure time (time to completion) were measured. Results: Mean needle placement accuracy using US-based fusion, AR-overlay, AR-plain, and freehand was 4.5 ± 1.7 mm, 7.0 ± 4.7 mm, 4.7 ± 1.7 mm, and 9.2 ± 5.8 mm, respectively. AR-plain demonstrated accuracy comparable to US-based fusion (p = 0.7) and AR-overlay (p = 0.06). Excluding two outliers, AR-overlay accuracy became 5.9 ± 2.6 mm. US-based fusion had the highest mean placement time (44.3 ± 27.7 s) compared to all navigation cohorts (p < 0.001). The longest procedure times were recorded with AR-overlay (34 ± 10.2 min) compared to AR-plain (22.7 ± 8.6 min, p = 0.09), US-based fusion (19.5 ± 5.6 min, p = 0.02), and freehand (14.8 ± 1.6 min, p = 0.002). Conclusion: Goggle-based AR showed no difference in needle placement accuracy compared to the commercially available US-based fusion navigation platform.
Differences in accuracy and procedure times were apparent with different display modes (with/without stereoscopic projections). The AR-based projection of the US and needle trajectory over the body may be a helpful tool to enhance visuospatial orientation. Thus, this study refines the potential role of AR for needle placements, which may serve as a catalyst for informed implementation of AR techniques in IR. (© 2024. This is a U.S. Government work and not under copyright protection in the US; foreign copyright protection may apply.)
- Published
- 2024
- Full Text
- View/download PDF
247. Remotely prescribed, monitored, and tailored home-based gait-and-balance exergaming using augmented reality glasses: a clinical feasibility study in people with Parkinson's disease.
- Author
-
Hardeman LES, Geerse DJ, Hoogendoorn EM, Nonnekes J, and Roerdink M
- Abstract
Background: Exergaming has the potential to increase adherence to exercise through play, individually tailored training, and (online) remote monitoring. Reality Digital Therapeutics (Reality DTx®) is a digital therapeutic software platform for augmented reality (AR) glasses that enables a home-based gait-and-balance exergaming intervention specifically designed for people with Parkinson's disease (pwPD). Objective: The primary objective was to evaluate the feasibility and potential efficacy of the Reality DTx® AR exergaming intervention for improving gait, balance, and walking-adaptability fall-risk indicators. The secondary objective was to evaluate the potential superiority of one type of AR glasses [Magic Leap 2 (ML2) vs. HoloLens 2 (HL2)]. Methods: This waitlist-controlled clinical feasibility study comprised three laboratory visits (baseline, pre-intervention, and post-intervention), a home visit, and a 6-week AR exergaming intervention. Five complementary gait-and-balance exergames were remotely prescribed (default five sessions/week of 30 active minutes/session), monitored, and tailored. Feasibility was assessed in terms of safety, adherence, and user experience. During laboratory visits, gait-and-balance capacity was assessed using standard clinical gait-and-balance tests and advanced walking-adaptability fall-risk assessments. Results: In total, 24 pwPD participated. No falls and four near falls were reported. Session adherence was 104%. The User Experience Questionnaire scores for Reality DTx® ranged from above average to excellent, with superior scores for HL2 over ML2 for Perspicuity and Dependability. Intervention effects were observed for the Timed Up and Go test (albeit small), the Five Times Sit to Stand test, and walking speed. Walking-adaptability fall-risk indicators all improved post-intervention. Conclusion: Reality DTx® is a safe, adherable, usable, well-accepted, and potentially effective intervention in pwPD.
These promising results warrant future randomized controlled trials on the (cost-)effectiveness of home-based AR exergaming interventions for improving gait, balance, and fall risk. Clinical Trial Registration: ClinicalTrials.gov, identifier NCT05605249. Competing Interests: This study was part of a collaboration between Vrije Universiteit Amsterdam and Strolll Limited, the manufacturer of Reality DTx®, which was formalized in a consortium agreement associated with their joint EUreka Eurostars grant. The Vrije Universiteit Amsterdam transferred IP related to AR cueing and data science to Strolll Limited in return for share options. MR is a scientific advisor for Strolll Limited ancillary to his full-time position as Associate Professor Technology in Motion at the Vrije Universiteit Amsterdam. Anonymized information on technical issues, adherence, usability, and exergame performance obtained in this study was shared with Strolll Limited for further development of Reality DTx®. (Copyright © 2024 Hardeman, Geerse, Hoogendoorn, Nonnekes and Roerdink.)
- Published
- 2024
- Full Text
- View/download PDF
248. Opportunities and Challenges for Augmented Reality in Family Caregiving: Qualitative Video Elicitation Study.
- Author
-
Albright L, Ko W, Buvanesh M, Haraldsson H, Polubriaginof F, Kuperman GJ, Levy M, Sterling MR, Dell N, and Estrin D
- Abstract
Background: Although family caregivers play a critical role in care delivery, research has shown that they face significant physical, emotional, and informational challenges. One promising avenue to address some of caregivers' unmet needs is the design of digital technologies that support caregivers' complex portfolio of responsibilities. Augmented reality (AR) applications, specifically, offer new affordances to aid caregivers as they perform care tasks in the home. Objective: This study explored how AR might assist family caregivers with the delivery of home-based cancer care. The specific objectives were to shed light on challenges caregivers face where AR might help, investigate opportunities for AR to support caregivers, and understand the risks of AR exacerbating caregiver burdens. Methods: We conducted a qualitative video elicitation study with clinicians and caregivers. We created 3 video elicitations that offer ways in which AR might support caregivers as they perform often high-stakes, unfamiliar, and anxiety-inducing tasks in postsurgical cancer care: wound care, drain care, and rehabilitative exercise. The elicitations show functional AR applications built using Unity Technologies software and the Microsoft HoloLens 2. Using elicitations enabled us to avoid rediscovering known usability issues with current AR technologies, allowing us to focus on high-level, substantive feedback on potential future roles for AR in caregiving. Moreover, it enabled nonintrusive exploration of the inherently sensitive in-home cancer care context. Results: We recruited 22 participants for our study: 15 clinicians (eg, oncologists and nurses) and 7 family caregivers. Our findings shed light on clinicians' and caregivers' perceptions of the current information and communication challenges caregivers face as they perform important physical care tasks as part of cancer treatment plans.
Most significant was the need to provide better and ongoing support for the execution of caregiving tasks in situ, when and where the tasks need to be performed. Such support needs to be tailored to the specific needs of the patient, to the stress-impaired capacities of the caregiver, and to the time-constrained communication availability of clinicians. We uncover opportunities for AR technologies to potentially increase caregiver confidence and reduce anxiety by supporting the capture and review of images and videos and by improving communication with clinicians. However, our findings also suggest ways in which, if not deployed carefully, AR technologies might exacerbate caregivers' already significant burdens. Conclusions: These findings can inform both the design of future AR devices, software, and applications and the design of caregiver support interventions based on already available technology and processes. Our study suggests that AR technologies and the affordances they provide (eg, tailored support, enhanced monitoring and task accuracy, and improved communications) should be considered as part of an integrated care journey involving multiple stakeholders, changing information needs, and different communication channels that blend in-person and internet-based synchronous and asynchronous care, illness, and recovery. (©Liam Albright, Woojin Ko, Meyhaa Buvanesh, Harald Haraldsson, Fernanda Polubriaginof, Gilad J Kuperman, Michelle Levy, Madeline R Sterling, Nicola Dell, Deborah Estrin. Originally published in JMIR Formative Research (https://formative.jmir.org), 30.05.2024.)
- Published
- 2024
- Full Text
- View/download PDF
249. Surgical Treatment of Calcified Thoracic Herniated Disc Disease via the Transthoracic Approach with the Use of Intraoperative Computed Tomography (iCT) and Microscope-Based Augmented Reality (AR).
- Author
-
Pojskić M, Bopp MHA, Nimsky C, and Saß B
- Subjects
- Humans, Female, Middle Aged, Male, Calcinosis surgery, Calcinosis diagnostic imaging, Adult, Microscopy methods, Treatment Outcome, Magnetic Resonance Imaging methods, Intervertebral Disc Degeneration, Intervertebral Disc Displacement surgery, Intervertebral Disc Displacement diagnostic imaging, Tomography, X-Ray Computed methods, Thoracic Vertebrae surgery, Thoracic Vertebrae diagnostic imaging, Augmented Reality
- Abstract
Background and Objectives: The aim of this study is to present our experience in the surgical treatment of calcified thoracic herniated disc disease via a transthoracic approach in the lateral position with the use of intraoperative computed tomography (iCT) and augmented reality (AR). Materials and Methods: All patients who underwent surgery for a calcified thoracic herniated disc via a transthoracic transpleural approach at our department using iCT and microscope-based AR were included in the study. Results: Six consecutive patients (five female, median age 53.2 ± 6.4 years) with calcified herniated thoracic discs (two patients at the Th 10-11 level, two at Th 7-8, one at Th 9-10, one at Th 11-12) were included in this case series. Indications for surgery included evidence of a calcified thoracic disc on magnetic resonance imaging (MRI) and CT with spinal canal stenosis of >50% of the diameter, intractable pain, and neurological deficits, as well as MRI signs of myelopathy. Five patients had paraparesis and ataxia, and one patient had no deficit. All surgeries were performed in the lateral position via a transthoracic transpleural approach (five from the left side). CT for automatic registration was performed following the placement of the reference array, with high registration accuracy. Microscope-based AR was used, with segmented structures of interest such as the vertebral bodies, disc space, herniated disc, and dural sac. Mean operative time was 277.5 ± 156 min. The use of AR improved orientation in the operative field, allowing identification and tailored resection of the herniated disc and identification of the course of the dural sac. A control iCT scan confirmed complete resection in five patients and incomplete resection of the herniated disc in one patient. Complications occurred in one patient: a postoperative hematoma and a wound-healing deficit. Mean follow-up was 22.9 ± 16.5 months.
Five patients improved following surgery, and the one patient who had no deficits remained unchanged. Conclusions: In patients with calcified thoracic disc disease with compression of the dural sac and myelopathy, resection via a transthoracic transpleural approach was the optimal surgical therapy. The use of iCT-based registration and microscope-based AR significantly improved orientation in the operative field and facilitated safe resection of these lesions.
- Published
- 2024
- Full Text
- View/download PDF
250. Use of optical see-through head-mounted devices in dentistry - a scoping review.
- Author
-
Sadilina S, Strauss FJ, Jung RE, Joda T, and Balmer M
- Abstract
Aim: The aim of this scoping review was to identify the scientific evidence related to the utilization of optical see-through head-mounted displays (OST-HMDs) in dentistry and to determine future research needs. Methods: The research question was formulated using the "Population" (P), "Concept" (Cpt), and "Context" (Cxt) framework for scoping reviews. Existing literature was designated as P, OST-HMD as Cpt, and dentistry as Cxt. An electronic search was conducted in PubMed, Embase, Web of Science, and CENTRAL. Two authors independently screened titles and abstracts and performed the full-text analysis. Results: The search identified 286 titles after removing duplicates. Nine studies, involving 138 participants and 1760 performed tests, were included in this scoping review. Seven of the articles were preclinical studies, one was a survey, and one was a clinical trial. The included manuscripts covered various dental fields: three studies in orthodontics, two in oral surgery, two in conservative dentistry, one in general dentistry, and the remaining one in prosthodontics. Five articles focused on educational purposes. Two brands of OST-HMD were used: the Microsoft HoloLens in eight studies and Google Glass in one. Conclusions: The overall number of included studies was low; therefore, the available data from this review cannot yet support an evidence-based recommendation for the clinical use of OST-HMDs. However, the existing preclinical data indicate a significant capacity for clinical and educational implementation. Further adoption of these devices will facilitate more reliable and objective quality and performance assessments, as well as more direct comparisons with conventional workflows. More clinical studies must be conducted to substantiate the potential benefits and reliability for patients and clinicians.
- Published
- 2024
- Full Text
- View/download PDF