
AUA2022: BEST POSTERS Mixed Reality and 3D Printing Technologies Open New Frontiers for Surgical, Hands-on Remote, Distance Learning

By: Ahmed Ghazi, MD; Vivek Nithipalan | Posted on: 01 Nov 2022

In-person (IP) education that emphasizes learning through clinical experiences, patient encounters, and hands-on training is a core concept of medical education. This concept was recently challenged when maintaining adequate physical distance from others was prioritized. Remote or distance learning, in which technology relays information online between students and educators who are not physically present in a traditional classroom, has been an inadequate substitute for the hands-on experiences needed to master surgical techniques.1 As the world prepares for future pandemics, given worsening climate change and declining vaccination rates, it is crucial that the surgical community has effective means to continue training future professionals.2,3 Mixed reality (MR), in contrast to augmented reality, which only superimposes a computer-generated image on a user’s view of the real world, merges real and virtual worlds so that physical and digital objects coexist and interact in real time, enabling the connection between participants during a hands-on educational interaction.4 These innovative MR technologies have the potential to transform the delivery of surgical education,5 and to help address many of the challenges currently faced in delivering high-quality, hands-on education globally, including quality, consistency, accessibility, and cost.6,7

The Simulation Innovation Laboratory at the University of Rochester, New York, combined 3 technologies: MR cloud-based software to merge 2 video streams in real time, Vuzix M4000 smart glasses to seamlessly view the merged environments, and 3D-printed hydrogel models with incorporated performance metrics to facilitate remote instruction in a risk-free manner.8 To evaluate the efficacy of MR-based remote instruction relative to IP instruction during a transrectal ultrasound biopsy (TRUS-Bx), 30 urology trainees reviewed educational videos on relevant anatomy, ultrasound interpretation, and performance of the TRUS-Bx prior to randomization into MR and IP groups. Each trainee completed a pre-test, 3 training sessions, and a post-test. During test sessions, participants independently administered local anesthesia, measured the prostate, and obtained 14 biopsies on a hydrogel testing model with individually colored biopsy regions. Accuracy was defined as the percentage of each core containing the color corresponding to the intended biopsy region (Fig. 1, A). During training sessions, participants were guided through the procedural steps of the TRUS-Bx on single-colored training models. MR sessions utilized Zoom to transmit the ultrasound view to the instructor and Vuzix smart glasses to display to the participant a superimposed view of the surgical field with the remote instructor’s guidance (Fig. 1, B). Post-training surveys assessed trainee perceptions of the session, and proctors evaluated trainee performance.
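To illustrate the core idea of merging 2 live video streams into a single shared view, a minimal sketch follows. It is not the cloud software, Zoom integration, or Vuzix pipeline used in the study; the camera indices, blending weights, and window handling are illustrative assumptions only, using OpenCV frame blending as a stand-in.

```python
# Hypothetical sketch: blend a local "surgical field" stream with a remote
# "instructor" stream into one merged view. Not the study's actual software.
import cv2

trainee_cam = cv2.VideoCapture(0)     # local surgical-field camera (assumed index)
instructor_cam = cv2.VideoCapture(1)  # remote instructor stream (assumed index)

while True:
    ok_field, field = trainee_cam.read()
    ok_hands, hands = instructor_cam.read()
    if not (ok_field and ok_hands):
        break
    # Match frame sizes, then overlay the instructor view semi-transparently
    # on the trainee's view (70% field, 30% instructor).
    hands = cv2.resize(hands, (field.shape[1], field.shape[0]))
    merged = cv2.addWeighted(field, 0.7, hands, 0.3, 0)
    cv2.imshow("merged view", merged)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

trainee_cam.release()
instructor_cam.release()
cv2.destroyAllWindows()
```

In the study, this kind of merged view was delivered to the trainee through the smart glasses so the remote instructor’s hands appeared over the trainee’s real-world view.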

Figure 1. Biopsy of right base during simulated TRUS-Bx using the hydrogel model (lower right) demonstrates a yellow core, denoting a core with 100% accuracy (A). Instructor using mixed reality (MR) to convey instructions during remote learning (B). The lower left panel of B shows a trainee wearing the Vuzix smart glasses, and the lower right panel shows the instructor’s hand superimposed on the real-world view.

Figure 2. (Left) Results of confidence survey. (Right) Comparison of pre- and post-test percentage of correct cores. IP indicates in-person; MR, mixed reality; SIM, simulated; TRUS, transrectal ultrasound.

Pre-test core percentages were similar between groups (MR: 17.9% vs IP: 26.7%, P = .44). While post-test core percentages showed significant improvement for both groups (MR: 75.9% and IP: 62.3%), the performance increase between tests was 1.5 times greater in the MR group than in the IP group (MR: +58.0% vs IP: +35.9%, P < .01), despite pre-session participant perceptions that remote training might hinder their ability to learn (Fig. 2). Proctor evaluations of transrectal ultrasound manipulation, prostate measurement, anesthetic administration, and biopsy, each rated from 1 to 3 (below to above expectations), showed higher average post-test scores in the MR group by +0.5, +0.1, +0.8, and +0.5, respectively.

This application of MR technology to remote learning not only contradicted the trainees’ initial perceptions but also demonstrated its effectiveness for teaching a technical surgical procedure. These results are promising for remote learning moving forward and may set a new standard for cross-institutional and global surgical instruction.

  1. Ehrlich H, McKenney M, Elkbuli A. We asked the experts: virtual learning in surgical education during the COVID-19 pandemic: shaping the future of surgical education and training. World J Surg. 2020;44(7):2053-2055.
  2. Altizer S, Ostfeld RS, Johnson PTJ, Kutz S, Harvell CD. Climate change and infectious diseases: from evidence to a predictive framework. Science. 2013;341(6145):514-519.
  3. Hong K, Zhou F, Tsai Y, et al. Decline in receipt of vaccines by Medicare beneficiaries during the COVID-19 pandemic: United States, 2020. MMWR Morb Mortal Wkly Rep. 2021;70(7):245-249.
  4. Marr B. The Important Difference Between Virtual Reality, Augmented Reality and Mixed Reality. Forbes. July 19, 2019.
  5. Pennefather P, Krebs C. Exploring the role of XR in visualisations for use in medical education. Adv Exp Med Biol. 2019;1171:15-23.
  6. Wish-Baratz S, Crofton A, Gutierrez J, Henninger E, Griswold M. Assessment of mixed-reality technology use in remote online anatomy education. JAMA Netw Open. 2020;3(9):e2016271.
  7. Ruthberg J, Tingle G, Tan L, et al. Mixed reality as a time-efficient alternative to cadaveric dissection. Med Teach. 2020;42(8):896-901.
  8. Saba P, Shepard L, Nithipalan V, et al. Design and development of a high-fidelity transrectal ultrasound (TRUS) simulation model for remote education and training. Urology Video Journal. 2022;16:100183.
