
Augmented reality ultrasound: putting the focus on patients

Interview with Matthias Noll, Deputy Head of the Visual Healthcare Technologies Competence Center, Fraunhofer Institute for Computer Graphics Research IGD

This is how a conventional ultrasound scan works: the patient lies down on a table next to the ultrasound machine. A doctor uses a probe to scan the part of the body in question while looking at the images on a monitor. In other words, the physician focuses either on his or her hand on the patient or on the monitor. The Fraunhofer IGD wants to change this process as part of the "sonAR" project.

In this MEDICA-tradefair.com interview, Matthias Noll talks about the system that combines ultrasound with augmented reality, describes the potential of AR and VR, and reveals why some people still tend to shy away from using new technologies.

Mr. Noll, what is the objective of the "sonAR" project?

Matthias Noll: Our goal is to use AR to display ultrasound images right in the physician's field of view. That means the ultrasound scan or sonogram is superimposed right on the patient, eliminating the need to look at a monitor. Essentially, this allows the doctor to take a virtual look inside the patient. He or she no longer has to identify the anatomical structures and the angles at which they are shown. It also makes redundant the hand-eye coordination skills that physicians typically need when scanning with their hands while looking at a monitor.

Is the AR system limited to specific applications?

Noll: Not really. The technology can be applied wherever ultrasound is performed. However, our primary focus is on ultrasound-guided needle biopsies: the AR system supports the physician during a biopsy by displaying the target area and helping him or her aim at it. We are also interested in using the technology for intraoperative imaging. AR visualization could be very effective for applications that must include external structures in the planning process.

What do you need to use the application?

Noll: You need a conventional ultrasound machine, an AR headset and a PC that performs the required calculations via our software. Every ultrasound machine typically has a central processing unit, but you need access to that system. Alternatively, a separate PC can be used to feed the data directly to the AR headset.
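To illustrate the kind of data path Noll describes, here is a minimal sketch of a PC pushing grayscale ultrasound frames to a headset over a plain TCP socket. The endpoint address, the frame format and the grab_ultrasound_frame stand-in are all assumptions for illustration, not part of the sonAR software:

```python
import socket
import struct

import numpy as np

# Hypothetical headset endpoint; the actual sonAR data path is not public.
HEADSET_ADDR = ("192.168.0.42", 5000)


def grab_ultrasound_frame() -> np.ndarray:
    """Stand-in for the machine's frame grab; returns a dummy grayscale frame."""
    return np.zeros((480, 640), dtype=np.uint8)


def send_frame(sock: socket.socket, frame: np.ndarray) -> None:
    """Send one frame as height and width, followed by the raw pixel bytes."""
    h, w = frame.shape
    sock.sendall(struct.pack("!II", h, w) + frame.tobytes())


with socket.create_connection(HEADSET_ADDR) as sock:
    while True:
        send_frame(sock, grab_ultrasound_frame())
```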

One of the most important components in this setup is the tracking system, which computes the position of the ultrasound probe relative to the headset. This is essential for anchoring the ultrasound image to the ultrasound probe so that its spatial position and orientation are correct in the user's field of view. Without this step, the ultrasound plane would end up somewhere in space with no correlation to the probe. For our purposes, we chose an external optical tracking system that tracks both the ultrasound probe and the AR glasses. This device uses two cameras to determine the position of the AR headset worn by the user and the position of the ultrasound probe, which is equipped with reflective tracking markers.
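The core computation behind such a tracking setup can be sketched as a chain of rigid transforms: the tracker reports the headset and probe poses in its own world frame, and a one-time calibration relates the image plane to the probe. A minimal sketch, with hypothetical names and 4x4 homogeneous matrices assumed:

```python
import numpy as np


def image_plane_in_headset(T_world_headset: np.ndarray,
                           T_world_probe: np.ndarray,
                           T_probe_image: np.ndarray) -> np.ndarray:
    """Pose of the ultrasound image plane in headset coordinates.

    T_world_headset : 4x4 headset pose reported by the optical tracker
    T_world_probe   : 4x4 probe pose reported by the optical tracker
    T_probe_image   : 4x4 fixed probe-to-image calibration
    """
    # Chain the transforms: image -> probe -> tracker world -> headset.
    return np.linalg.inv(T_world_headset) @ T_world_probe @ T_probe_image
```

With the image plane expressed in headset coordinates, the renderer can draw the sonogram exactly where the probe is pointing.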

What is the current status of the project?

Noll: Right now, the system is a demonstrator that we hope to use in an OR demonstration laboratory later this year. The next step is to find the right technology partner to join us in advancing this technology or bringing it to market. To make this a reality, parts of the system would have to be integrated into or connected to an ultrasound machine.

What is the hidden potential of the AR system and some possible future developments?

Noll: A conceivable application would be to use artificial intelligence to automatically detect physical anomalies in the ultrasound image. These could be displayed live as virtual content in the physician's headset and indicated by a border or arrows, for example. Another idea is to overlay useful real-time information for the user, such as the size or volume of the anomaly. I could also envision an automated segmentation of specific structures during scanning, which would subsequently be displayed as a stationary 3D model in the presentation space. This could be used, for example, to create a 3D surface model of an organ. The technology creates many opportunities and offers many conceivable scenarios and options for further development.
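As a rough illustration of the overlay idea, the sketch below turns a binary anomaly mask, assumed to come from a hypothetical AI detector running on each frame, into a border and an in-plane size estimate for the headset display; the function name and pixel-spacing parameter are illustrative:

```python
import numpy as np


def anomaly_overlay_hints(mask: np.ndarray, mm_per_px: float) -> dict:
    """Turn a binary anomaly mask into overlay hints for the headset view.

    mask      : 2D boolean array from a hypothetical per-frame AI detector
    mm_per_px : pixel spacing of the ultrasound image
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return {}  # nothing detected in this frame
    return {
        # Border to draw around the anomaly, in pixel coordinates.
        "bbox": (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())),
        # Rough in-plane size estimate to display next to the border.
        "area_mm2": float(mask.sum() * mm_per_px ** 2),
    }
```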

What role will AR and VR technology play in the future of medicine?

Noll: VR is especially helpful when you want to present objects in a virtual environment. One example is virtual reality surgical training, which allows surgeons to practice for an actual surgery at a later date using virtual content. AR makes sense wherever virtual content is meant to enrich reality. Another possible AR application is the indication of tagged lymph nodes, as in our 3DArile project. Using AR technology and fluorescent dyes, the system visualizes the location of the targeted lymph nodes. The physician can visually track the process, enabling targeted lymph node biopsy and removal.

I believe that AR and VR will be an important part of the operating room in the future. Needless to say, this hinges on whether doctors are ready and willing to use these technologies. That is why initial obstacles must be kept to a minimum and prerequisite technology skills must be easy to learn. Even though there is often a willingness to try something new, experience has shown that people ultimately choose a technology that has stood the test of time or that they are familiar with as part of their daily routine.
