Ultrasound is one of the most important medical imaging methods: it is inexpensive, fast, can be performed anywhere, involves no radiation exposure, and delivers real-time images without complication. Now, for the first time, a new combined technology allows doctors to view correctly positioned ultrasound images through AR glasses: the section plane through the body appears directly on the patient, over the imaged structures. Bringing the ultrasound image and the AR view together requires several calibration steps. First, the relative position between the tracking markers on the ultrasound probe and the imaged ultrasound plane that is to be shown in the user's view must be determined. Next, further parameters are calibrated, such as head position, the position of the display elements in the glasses relative to the user's eye, and viewing distance. The system is calibrated to each user, a process that takes three minutes at most.
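The first calibration step described above can be sketched in code. This is a minimal illustration, not the actual Fraunhofer IGD implementation: it assumes the calibration yields a rigid 4x4 transform (here called `T_probe_image`) from the image plane to the probe's marker frame, plus a pixel size, and uses these to map an ultrasound pixel into marker coordinates.

```python
# Hypothetical sketch of the probe-to-image-plane calibration result:
# map an ultrasound pixel (u, v) into the frame of the probe's tracking
# markers. All names and values are illustrative assumptions.
import numpy as np

def pixel_to_probe(u, v, px_size_mm, T_probe_image):
    """Map image pixel (u, v) to 3-D probe-marker coordinates in mm.

    px_size_mm    -- edge length of one ultrasound pixel in millimetres
    T_probe_image -- 4x4 rigid transform from image plane to marker
                     frame, determined once during calibration
    """
    # The ultrasound image is planar, so z = 0 in the image frame;
    # homogeneous coordinates let the 4x4 transform apply directly.
    p_image = np.array([u * px_size_mm, v * px_size_mm, 0.0, 1.0])
    return (T_probe_image @ p_image)[:3]

# Example: identity calibration, 0.2 mm pixels -> point (20, 10, 0) mm
point = pixel_to_probe(100, 50, 0.2, np.eye(4))
```

Once this transform is known, every pixel of the live image has a fixed position relative to the markers, so tracking the markers suffices to track the whole image plane.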
The spatially correct representation of the ultrasound images is based on an external optical tracking system, such as the NDI Polaris. It measures the positions of the user's head and of the ultrasound probe, determines their relative pose, and displays the result in the AR glasses. Unlike established registration processes, where CT or MRI imaging data is first obtained and then superimposed on the patient with AR technology, ultrasound makes live images possible. The stationary camera system used for tracking consists of two infrared-capable cameras. Clamping mechanisms hold infrared markers on the ultrasound probe and on the AR glasses, allowing the tracking system to detect both positions. Fraunhofer IGD's software analyzes these positions and, using the calibration data, converts them into a correctly positioned display.
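The pose chain implied by this setup can be sketched as a composition of homogeneous transforms. This is an assumed formulation for illustration only: the tracker is taken to report the poses of the glasses and the probe in its own world frame (`T_world_glasses`, `T_world_probe`), and the calibration supplies the image-plane-to-probe transform (`T_probe_image`).

```python
# Minimal sketch of the tracking-to-display pose chain. All transform
# names are assumptions for illustration, not the system's actual API.
import numpy as np

def plane_in_glasses_frame(T_world_glasses, T_world_probe, T_probe_image):
    """Pose of the ultrasound image plane in the AR glasses' frame."""
    # Invert the tracked glasses pose to get world -> glasses, then
    # chain through the probe pose and the calibration transform.
    return np.linalg.inv(T_world_glasses) @ T_world_probe @ T_probe_image

def translation(t):
    """Build a pure-translation 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Example: glasses 0.5 m behind the probe, identity calibration ->
# the image plane sits 0.5 m in front of the glasses.
T_plane = plane_in_glasses_frame(
    translation([0.0, 0.0, -0.5]),  # tracked glasses pose
    np.eye(4),                       # tracked probe pose
    np.eye(4),                       # calibration transform
)
```

Rendering the live ultrasound image as a textured quad at `T_plane` is then all that is needed for the plane to appear anchored to the patient, regardless of how the user's head moves.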
Advantages and areas of application
The new AR system has versatile uses: it can, for example, improve the success rate of ultrasound-guided biopsies and make the usefulness of ultrasound less dependent on the individual user. Because the ultrasound plane appears directly on the patient, the doctor sees the needle in relation to the imaged anatomy and can adjust the puncture channel more precisely.
Until now, around 10 percent of biopsies have had to be repeated or performed as open biopsies because they were not successful. One reason is that the ultrasound image is displayed on a separate monitor: the doctor positions the ultrasound probe to obtain the best possible image of the area in question, must then mentally reconstruct the spatial position inside the patient, and from that deduce a path for the biopsy needle that reaches the intended tissue without injuring risk structures.

The safety of surgical procedures can also be enhanced with the new technology. The complex anatomy of the liver, for example, with its reticulate vessels, often poses a challenge for surgeons: if certain arteries were injured during surgery, the areas of the liver they supply would be cut off. With AR-assisted ultrasound, surgeons could plan their procedures with greater precision. Instead of relying on CT or MRI scans taken ahead of time, ultrasound imaging data can be used for live intraoperative registration. Vessels extracted live in 3D from the ultrasound image could also be tracked and displayed as risk structures during the procedure. In the future, even anatomical abnormalities will be visible directly in the AR image, and artificial intelligence could be used to detect tissue and organ anomalies.