Virtually Unstoppable
By Kathy Hardy
Radiology Today
Vol. 23 No. 2 P. 18

AR, VR Shape Radiology Training, Diagnostics, Clinical Decision Support

Situational awareness on the highway is about knowing where you are, what hazards may lie ahead, and how to maneuver to your destination. The same can be said for situational awareness in the surgical or IR suite, where operators need a reliable map of where they are in the patient’s anatomy and how to avoid any threats to health and safety.

That’s where tools such as virtual reality (VR) and augmented reality (AR) can aid in guidance and visualization. The two technologies differ: VR immerses the user in a fully virtual environment, while AR overlays virtual content on the real-world view. Each, however, gives providers a clearer view of the situation in front of them.

With any technology, advances continue to fine-tune that vision, and radiologists are often the drivers. Diagnostic neuroradiology specialist Wendell Gibby, MD, founder and CEO of Novarad, sees radiology at the nexus of surgical and minimally invasive procedures, with radiologists acting as curators of these technologies. They bring knowledge of advanced image processing, normal anatomy, and imaging of disease states to the table.

“Radiologists see the segmentation and outline of the anatomy and need to be involved in surgical planning, even if they’re not interventional radiologists,” Gibby says. “Oftentimes, radiologists have a better grasp of vital structures and pathology.”

Gibby recounts that AR and VR initiatives began 12 years ago, as fast rendering and 3D imaging were coming into play, “before metaverse was cool.” Both of Novarad’s AR products, OpenSight for presurgical visualization and planning and VisAR for surgical navigation, render 2D, 3D, and 4D digital images from any modality into holograms. Used in conjunction with the Microsoft HoloLens 2 system, each offers a large field of view, high resolution, and control of holograms, including placement of the hologram onto the patient.

OpenSight is designed to visualize 3D holograms of the patient’s internal anatomy directly on the patient. The 3D holograms are created by a refractive optical system in the device, using a combination of the Microsoft HoloLens hardware and OpenSight software. Users wear a headset that looks similar to VR goggles but has clear lenses, allowing augmented images to be commingled with reality.

“The system uses images like a 3D printer, but to virtually operate and dissect on a patient,” he says. “Because the image is coregistered to the patient, you could also use this for surgical guidance and IR. Surgeons and interventional radiologists are more used to a tactile approach to surgery planning. With this, they can convert CT to a volume and then interact with what they’re seeing visually.”

With VisAR, DICOM image data are converted into a 3D image superimposed on the patient, providing an overlay view for the surgeon. This allows the surgeon to view and scroll through the imaging studies in their correct anatomical and surgical context, with no need to look away from the patient to view images.
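As a rough illustration of the kind of data handling such a system starts from, the sketch below stacks a DICOM series into a 3D volume with voxel spacing, the basic input any hologram or overlay renderer needs. It is a minimal example, not VisAR’s actual pipeline; the libraries (pydicom, NumPy) and the directory path are assumptions.

# Minimal sketch: stack a CT DICOM series into a 3D volume ready for rendering.
# Illustrates the general first step of overlay rendering, not VisAR's pipeline.
from pathlib import Path

import numpy as np
import pydicom


def load_ct_volume(series_dir):
    """Read all slices in a directory and stack them by z position."""
    slices = [pydicom.dcmread(p) for p in Path(series_dir).glob("*.dcm")]
    # Sort by the through-plane coordinate so the volume is in anatomical order.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    # Convert stored pixel values to Hounsfield units using the DICOM rescale tags.
    volume = np.stack(
        [s.pixel_array * float(s.RescaleSlope) + float(s.RescaleIntercept)
         for s in slices]
    ).astype(np.float32)
    # Voxel spacing (z, y, x) in millimeters, needed to scale the overlay correctly.
    spacing = (
        abs(float(slices[1].ImagePositionPatient[2])
            - float(slices[0].ImagePositionPatient[2])),
        float(slices[0].PixelSpacing[0]),
        float(slices[0].PixelSpacing[1]),
    )
    return volume, spacing


volume, spacing = load_ct_volume("ct_series/")  # hypothetical directory
print(volume.shape, spacing)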

The VisAR system takes AR a step further by providing guidance and tracking for the surgeon. The image overlay in the correct orientation identifies where incisions should be made, providing a virtual pathway for the surgeon to follow.

Gibby points to the degree of accuracy achieved with VisAR in a study published in December in the Journal of Neurosurgery. He and a team of researchers used VisAR as a virtual guide for needle insertion deep into a simulated brain. With the guidance system, the study showed a 2.3-degree angle of error and a 3-mm margin of error.

“The mean of the radial error and the angle of error with the associated variance measures demonstrate that when using VisAR navigation to direct a small needle to small neural lesions, it may be as or more accurate than current neurosurgical techniques such as landmark-based manual procedures and many stereotactic devices,” Gibby noted in the article.

Registration
Some of that improved accuracy is a result of further development of the original Novarad technology. A second generation of the product automated alignment of the hologram with the patient by coordinating image-visible AprilTags with CT scans. Stickers printed with AprilTags, strategically placed on the patient, are segmented in the 3D CT volume and then registered optically by the device. The result is a coregistration accuracy of 0.3 mm. More strategic placement of the tags can further reduce the margin of error and time in surgery.
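The marker-based registration described above can be illustrated with a standard rigid point-set alignment: given the tag positions segmented from the CT volume and the same tags as located optically by the headset, solve for the rotation and translation that best map one set onto the other. The sketch below uses the classic SVD (Kabsch) solution; it is a general illustration under those assumptions, not Novarad’s own algorithm.

# Minimal sketch of rigid fiducial registration: find the rotation R and
# translation t that align tag centers segmented from the CT volume with the
# same tags seen by the headset's cameras (Kabsch/SVD method).
# General technique for illustration only, not Novarad's algorithm.
import numpy as np


def register_fiducials(ct_points, optical_points):
    """ct_points, optical_points: (N, 3) arrays of matching fiducial centers (mm)."""
    ct_centroid = ct_points.mean(axis=0)
    opt_centroid = optical_points.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (ct_points - ct_centroid).T @ (optical_points - opt_centroid)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # correct a reflection if one appears
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = opt_centroid - R @ ct_centroid
    return R, t


def registration_error(ct_points, optical_points, R, t):
    """Root-mean-square distance between transformed CT tags and optical tags."""
    residual = (R @ ct_points.T).T + t - optical_points
    return np.sqrt((residual ** 2).sum(axis=1).mean())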

“If you’re trying to fix a broken rib in the operating room and you can’t see the ribs, you have to make a larger incision and then feel for the broken rib,” Gibby says. “It benefits the patient to have an overlay of the image to enhance visibility.”

The insertion of pedicle screws as part of a spinal fusion procedure is another area where AR can provide precision guidance, he says.

“You’re looking at the insertion of 124 pedicle screws with 96% accuracy,” he says. “That’s on a level of what robotics can do, but with a less expensive headset instead of a robot. Using a custom rendering engine, you can process them faster and on a lower-power headset. You can reconstruct, on the fly, on a doctor’s head.”

Diagnostic Imaging Anywhere
Remote diagnostic imaging is another benefit of the latest in virtual technology. In 2021, Luxsonic, a Canadian developer of immersive medical software, received Health Canada authorization for its VR radiology platform, SieVRt. It was the first VR software platform to be approved as a Class II medical device for diagnostic imaging by a national regulatory agency. SieVRt is a digital twin of the radiology reading room in VR; it enables radiologists to work remotely, in a virtual office capacity, by replicating the reading room setting for offsite use.

Luxsonic cofounder and CEO Mike Wesolowski, PhD, drew on his early exposure to radiology when considering how to incorporate VR into the field.

“My father is a retired specialist in nuclear medicine, and as a child, I would often watch him work in his reading room in the hospital,” Wesolowski says. “I saw all the cutting-edge medical technology he used to help people, and that sparked my interest in using technology for good.”

After earning his PhD in physics, Wesolowski continued to contemplate what technology could bring to the imaging community.

“During my postdoctoral fellowship in medical imaging, I would hear radiologists say they felt trapped by the reading room,” he says. “The challenge was how to provide them with that familiar workflow outside of the traditional clinical setting. Everything in the reading room is important for diagnostic imaging, from the lighting, to monitor configuration, to software. It’s also very personal to the individual radiologist, not to mention the cost and need for space if someone wanted to set up a reading room of their own.”

Jump ahead to 2016 and the introduction of new VR headsets: Wesolowski and the Luxsonic team began considering the possibility of building a virtual reading room. In early 2021, the company launched several system modules that did not require government regulatory approval (SieVRt Basic, Med Ed, and Collaboration) for use by students, trainees, and physicians as advanced visualization tools.

“These modules can be used as interactive teaching tools, enabling trainees to become fully immersed in standardized training environments,” he says. “The Basic module can be downloaded for free, so that anyone with their own headsets can view and interact with DICOM images in the VR world. For medical education, use the Med Ed module to access 50,000 radiology teaching cases for self-directed learning. Radiology education has been disrupted during COVID, and training via Zoom has not been sufficient to fill the void of hands-on training.”

The Collaboration module enables multiple people to enter a virtual reading room, converse via avatars, and interact with medical images and each other in real time. Wesolowski says this feature is useful for education during interdisciplinary consults and tumor boards.

“With a collaborative approach, you can break down the silos within medicine and bring everyone together virtually,” he says. “This is important as we see more telehealth approaches emerging.”

The SieVRt Diagnostic module followed on the heels of these modules. With its authorization for use in diagnostic imaging, radiologists in Canada can bring a personalized virtual reading room with them wherever they go, using a VR headset and the system’s software, and remotely diagnose patients. Luxsonic is also currently working toward FDA approval for use of SieVRt in the United States.

“Radiologists can’t easily work remotely like many people did during COVID-19,” Wesolowski says. “With Health Canada’s approval of SieVRt, radiologists can bring their own personalized virtual office with them wherever they go. With SieVRt and a portable VR headset, they have all the tools they need to remotely diagnose patients.”

A key to incorporation of virtual and augmented reality into diagnostic radiology, he says, is the ability to show that immersive technology is equivalent to traditional diagnostic equipment. SieVRt has undergone clinical trials to show this equivalence and is actively being used in research trials across North America.

“For adoption into the health care system, immersive technologies need to be treated like any other medical device,” Wesolowski says.

Rural Access to Health Care
A VR software platform such as SieVRt is helpful not only in enabling traditional workflow to go mobile but also in giving patients in rural locations access to the same radiology services as patients in urban communities. Wesolowski sees the possibility of bringing together a complete interdisciplinary team without anyone having to travel to a remote location. This approach is becoming more necessary as telehealth needs increase, particularly in light of the COVID-19 pandemic.

“You have the potential to transform remote and rural health care along with urban health care,” he says. “We have a divide between remote and urban care, and even issues within cities where access to health care is poor. Virtual technology can bridge the gap, connecting resources with the people who need them. For example, with SieVRt connected to a portable X-ray or ultrasound device, you can drop a radiology clinic anywhere in the world. Then you can connect that virtual clinic to experts in real time, dramatically improving access to care. We’re pushing the boundaries of what is possible for health care delivery.”

Next Applications
Luxsonic sees machine learning as a viable module down the road, with tools for training, diagnostics, and clinical decision support within the virtual reading room. At Novarad, a current “work in progress” involves tool tracking. Gibby explains that they’re working on adding an optical code to the biopsy needle, giving users the capability to track the needle while looking through the headset.

“With a system like this, you don’t have to slide the patient into the CT multiple times,” he says. “You can just use AR.”
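Tracking an instrument by an attached optical code is typically done by detecting the code in the headset camera image and solving for its pose relative to the camera. The sketch below uses OpenCV’s ArUco markers as a stand-in for the optical code Gibby describes; the marker size, camera intrinsics, and the pre-4.7 ArUco API (opencv-contrib-python) are assumptions, and this is not Novarad’s implementation.

# Minimal sketch of optical-code tool tracking: detect a square fiducial marker
# attached to the instrument and recover its pose relative to the camera.
# Uses OpenCV ArUco markers as a stand-in for the optical code described above;
# marker size, camera intrinsics, and the pre-4.7 ArUco API are assumptions.
import cv2
import numpy as np

MARKER_SIZE_MM = 20.0                       # assumed edge length of the printed marker
camera_matrix = np.array([[800.0, 0, 320],  # assumed pinhole intrinsics
                          [0, 800.0, 240],
                          [0, 0, 1.0]])
dist_coeffs = np.zeros(5)                   # assume an undistorted image

# 3D marker corner coordinates in the marker's own frame (mm),
# ordered to match ArUco's detected corner order.
half = MARKER_SIZE_MM / 2
marker_corners_3d = np.array([[-half,  half, 0.0], [ half,  half, 0.0],
                              [ half, -half, 0.0], [-half, -half, 0.0]])

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)


def track_marker(frame_gray):
    """Return the marker-to-camera rotation and translation, or None if not seen."""
    corners, ids, _ = cv2.aruco.detectMarkers(frame_gray, dictionary)
    if ids is None:
        return None
    # Solve the 2D-3D correspondence for the first detected marker.
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None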

When it comes to future approaches to VR and AR, Gibby points to the merging of 3D data sets with ultrasound or even fluoroscopy to create a “mixed reality” visualization. He uses the example of how a TIPS (transjugular intrahepatic portosystemic shunt) procedure could be enhanced with the use of a combination of visualization tools.

“You conduct the CTA [CT angiography] in advance and the data set shows the vessels,” he says. “You can find the target in the 3D data set preoperatively. Then, you can create a map with MR or CT onto the patient’s body and merge the image with real-time fluoroscopy. You don’t need contrast. You then can take a view of tissue and see where the catheter is located and where it needs to be. The result is a digital roadmap of where you need to go.”
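The “digital roadmap” Gibby describes amounts to projecting a preoperatively segmented 3D vessel map into the 2D fluoroscopy view once the two coordinate systems have been registered. The sketch below shows that projection step with a simple pinhole model; the intrinsics, the registration transform, and the example centerline are assumptions for illustration, not a description of any product’s pipeline.

# Minimal sketch of a "digital roadmap": project a preoperatively segmented
# vessel centerline (3D points from CTA) onto the 2D fluoroscopy image so it
# can be drawn as an overlay. The fluoroscope intrinsics and the CTA-to-
# fluoroscope transform are assumed known from a prior registration step.
import numpy as np


def project_roadmap(vessel_points_mm, K, R, t):
    """Project (N, 3) CTA points into pixel coordinates with a pinhole model.

    K: (3, 3) fluoroscope intrinsics; R, t: rigid transform from CTA space
    into the fluoroscope camera frame (from registration).
    """
    cam = (R @ vessel_points_mm.T).T + t        # move points into camera frame
    pixels_h = (K @ cam.T).T                    # homogeneous pixel coordinates
    return pixels_h[:, :2] / pixels_h[:, 2:3]   # divide by depth


# Hypothetical example: a short straight vessel segment 800 mm from the source.
K = np.array([[1200.0, 0, 512], [0, 1200.0, 512], [0, 0, 1.0]])
centerline = np.stack(
    [np.zeros(20), np.linspace(-30, 30, 20), np.full(20, 800.0)], axis=1)
overlay_px = project_roadmap(centerline, K, np.eye(3), np.zeros(3))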

Gibby believes that collaborative efforts, including continued education, pilot programs, and the sharing of ideas at user group gatherings, are key to further developments in virtual and augmented imaging solutions.

“We need to get into the heads of the users and potential users,” he says. “If they learn more about the technology early on, they will want to use it. We need to get new technology into the hands of the opinion leaders and get feedback to improve.”

— Kathy Hardy is a freelance writer based in Pottstown, Pennsylvania. She is a regular contributor to Radiology Today.