This work takes place in the futuristic “Augmentarium,” a facility that includes projection displays, AR visors, GPU clusters, and technologies for human vision and human-computer interaction research.
In a recent live demonstration for members of the press, Sarah Murthi, MD, associate professor of surgery at the UMD School of Medicine and trauma surgeon at the Cowley Shock Trauma Center, performed an ultrasound examination of a patient’s heart using AR and the Microsoft HoloLens.
“While imaging has radically evolved, how images are displayed is basically the same as it was in 1950,” Dr. Murthi says. “Visual data are always shown on a 2D flat screen, on displays that force health care providers to look away from the patient, and even away from their own hands while performing procedures.” Dr. Murthi also notes that the images are not displayed from the perspective of the viewer, but rather from that of the imaging device. “AR’s potential ability to concurrently display imaging data and other patient information could save lives and decrease medical errors.”
Potential applications

Several of the posters and presentations at the SIR 2018 Annual Scientific Meeting demonstrate the intense interest in adapting AR and VR for use in interventional radiology, including as medical simulation training for students, for patient education, and for planning and conducting procedures. Vikash Gupta, MD, a resident at the University of Maryland Medical Center, provided an overview of the many possibilities in a poster titled, “Virtual and AR in the interventional radiology suite: What the interventionalist should know.”
“Technology has evolved so much during residency,” he says. “We are trying to see whether AR and VR have the same promise as any new technology: to make procedures easier and more efficient.” He sees potential in many areas, including preoperative planning and patient education. An image on his poster demonstrates the “limitless” potential of AR displays, showing six studies displayed simultaneously using Microsoft HoloLens.
Planning procedures. AR can be useful in planning treatment for splenic artery aneurysms and in increasing physician confidence in the procedure, as demonstrated by Nishita Kothary, MD, FSIR, and Zlatko Devcic, MD, of Stanford University. “These are some of the most challenging cases in IR,” Dr. Kothary says. In a retrospective study, she and her colleagues reconstructed existing volumetric data using True 3D software from EchoPixel. They then queried IRs about their confidence in identifying the inflow and outflow arteries; increased confidence was reported in 93 percent of cases. Among IRs with only one year of experience, confidence increased in 100 percent of cases.
"VR provides much better spatial orientation and a more intuitive understanding of the patient's anatomy. The ability to rotate images and zoom in gives operators a greater appreciation for each patient's unique anatomy," says Dr. Devcic.
“It’s like GPS on steroids,” Dr. Kothary says. “These reconstructed data sets provided us with solid information. When you have a good understanding of the target to relevant arterial structures, everything goes much more smoothly.”
A similar study was conducted at the University of Pennsylvania using Microsoft HoloLens and custom software. Brian Park, MD, MS, and Terence Gade, MD, PhD, implemented 3D holographic volume-rendering workflows using both surface- and direct volume-rendering techniques. Working at the Penn Image-guided Interventions (PIGI) Lab, they assessed the techniques in lung tumor ablation, liver abscess drainage, pre-Y-90 radioembolization and foot tumor ablation. Among their most significant findings: although direct volume rendering (DVR) is more computationally intensive, it produced 3D images in 1–2 minutes, compared with up to 45 minutes for surface rendering. Every faculty member who participated reported enhanced confidence in procedural anatomy.
“The models are accurate to scale with the ability to rotate, translate and magnify holograms in real time,” Dr. Park points out. “The dynamic cutting plane allows virtual dissection and exploration of internal volume contents from any angle.” He also notes that virtual needle tracks can be placed to visualize the optimal approach. Full implementation of the technology could help to optimize the procedural approach and avoid potential complications.
Performing procedures. Dr. Gade, co-founder of the PIGI Lab, and his team are investigating how these technologies can be integrated into clinical workflows in meaningful ways. It’s critical that providers feel comfortable with the technology and see the potential benefit, he says. “We must answer questions like, Does it offer something clinicians don’t have? and Can they use it without it interrupting their work?”
AR offers the possibility of 3D, co-registered projections that could represent a new way of performing image-guided procedures. Co-registration takes previous medical imaging and links it with real-time images. Dr. Gupta and his colleagues
Dr. Sarah Murthi tests an augmented reality prototype that overlays ultrasound imaging data directly on the patient. This allows constant visual contact with the patient and imaging sensor, as opposed to repeatedly looking away toward a monitor. Photo courtesy of Maryland Blended Reality Center