Holger Graf heads up the Virtual & Augmented Reality Competence Center at Fraunhofer IGD. One of his team’s tasks is to harness augmented reality (AR) for quality control in manufacturing. His vision is to see this technology become the interface for all types of digital information.
Mr. Graf, you are currently seeking to use AR for quality assurance purposes—for example, in association with Daimler. Why does the car maker need the support of Fraunhofer IGD?
Daimler approached us because of the high training effort associated with performing assembly line inspections. In this process, a product assembly is placed beneath a fixed-position camera array. Based on training data, a visual quality assurance system then looks for faults—for example, it checks that all components are present and in the correct position. However, the system training phase takes too long. Even a small product assembly has to be photographed from all sides and for all possible scenarios.
And how can AR help?
With the help of AR, we can make a direct comparison with the geometric data of the CAD reference model. As a result, there is absolutely no system training required. And if the production line switches to a different vehicle variant, we can adapt very quickly. We just need to load the corresponding CAD reference model and configure the inspections. However, the solution is capable of much more.
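The training-free approach described here can be illustrated with a minimal sketch. All names and data below are hypothetical: nominal component positions are taken directly from a CAD reference model, and detections (already registered into the model’s coordinate frame) are checked for presence and correct position—no learned model involved.

```python
import math

# Hypothetical, simplified data: nominal component positions (in mm)
# taken from the CAD reference model of one vehicle variant.
CAD_REFERENCE = {
    "bolt_a": (10.0, 25.0, 5.0),
    "bracket": (40.0, 12.5, 5.0),
    "clip": (55.0, 30.0, 8.0),
}

def inspect(detections, tolerance_mm=1.0):
    """Compare detected component positions against the CAD reference.

    Returns a dict mapping each component name to 'ok', 'misplaced',
    or 'missing'. The CAD model itself serves as ground truth, so no
    training data is required; switching variants only means loading
    a different reference dict.
    """
    report = {}
    for name, nominal in CAD_REFERENCE.items():
        observed = detections.get(name)
        if observed is None:
            report[name] = "missing"
            continue
        if math.dist(nominal, observed) <= tolerance_mm:
            report[name] = "ok"
        else:
            report[name] = "misplaced"
    return report

detections = {
    "bolt_a": (10.2, 25.1, 5.0),   # within tolerance
    "bracket": (43.0, 12.5, 5.0),  # shifted by 3 mm
    # "clip" was not detected at all
}
print(inspect(detections))
# → {'bolt_a': 'ok', 'bracket': 'misplaced', 'clip': 'missing'}
```

In a real system, the detections would come from the camera image after pose estimation against the CAD geometry; the sketch only shows why no training phase is needed.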
We can accelerate the manufacturing process. With traditional visual methods, the test object has to be moved beneath the camera array, where it remains stationary for a moment while the inspection is performed. AR allows objects to be checked dynamically—so the assembly line does not have to stop. Production could theoretically continue in a seamless flow, without interruption for inspections.
You have also developed a mobile inspection method.
That’s right. We use the camera on a tablet computer to look at the test object from a variety of perspectives and to compare it with the CAD reference model. This makes it possible for the user to perform a visual inspection by hand—for example, at an off-site location such as a supplier’s factory. We are now working on processing depth information, which would allow us to recognize deformations as well.
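Recognizing deformations from depth data can be sketched as a deviation check: for each scanned surface point, measure the distance to the nearest point of the reference geometry and flag anything beyond a tolerance. The function name, the brute-force search, and the sample data are all assumptions for illustration; a production pipeline would use registered scans and a spatial index.

```python
import math

def deviation_map(scan, reference, threshold_mm=2.0):
    """For each scanned point, compute the distance to the nearest
    reference point and flag points deviating beyond the threshold.

    Brute force, O(n*m) — a real pipeline would use a k-d tree.
    """
    flagged = []
    for p in scan:
        nearest = min(math.dist(p, q) for q in reference)
        if nearest > threshold_mm:
            flagged.append((p, round(nearest, 2)))
    return flagged

# Hypothetical data: a flat 5x5 reference grid vs. a scan with a dent.
reference = [(x, y, 0.0) for x in range(5) for y in range(5)]
scan = list(reference)
scan[12] = (2, 2, 4.0)  # a 4 mm deformation at the centre

print(deviation_map(scan, reference))
# → [((2, 2, 4.0), 4.0)]
```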
AR is an unfamiliar technology. How are users in the production plant adapting to the solution?
Initially, the learning curve is very steep. But giving the workers more time helps. Once they have used the system a couple of times, they notice that they can assemble the components more quickly and that inspections are easier to perform. That eliminates any acceptance issues, which are a familiar phenomenon: whenever people are confronted with a new user interface (for example, when a Microsoft Windows system introduces new functions), it is difficult at first to use it, until we notice that it makes our work easier.
What has been your personal experience of working with AR?
I’m a gadget nerd. I find AR technology and the solutions my team develops super cool. But there are occasionally challenges that make me break out in a sweat.
What do you mean?
The technology is already very advanced and we have achieved a great deal in recent years. But we need to turn our ideas into productive tools. And the requirements of manufacturing environments, such as those in the automotive industry, are very challenging.
So you mean it can be difficult to transfer a technology such as AR into the world of manufacturing because we are more familiar with AR in the world of entertainment?
Yes. But I see it as a form of cross-fertilization. We often forget, for instance, that a technology such as VR was initially deployed in industry. It was only adopted and then advanced by the consumer market at a later date. By the same token, we gained fresh impetus from gamers and used it in our work. And the development is ongoing. The same applies to AR.
In what way?
I am thinking of the new user devices that are coming onto the market, such as the iPad Pro and the iPhone 12, which now have small LiDAR scanners. These mobile products can therefore capture their entire surroundings in 3D, which immediately raises the question: What can we do with the corresponding point clouds? What applications can we develop for production environments?
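A first step in almost any point-cloud application is thinning the dense raw scan into something a production tool can process. As a hedged sketch (the function and data are invented for illustration, not taken from the interview), one common technique is voxel downsampling, keeping one centroid per grid cell:

```python
def voxel_downsample(points, voxel_size):
    """Reduce a dense point cloud by keeping one representative point
    (the centroid) per cubic voxel of the given edge length."""
    buckets = {}
    for p in points:
        # Map each point to the voxel it falls into.
        key = tuple(int(c // voxel_size) for c in p)
        buckets.setdefault(key, []).append(p)
    # Replace each voxel's points with their centroid.
    return [
        tuple(sum(c) / len(pts) for c in zip(*pts))
        for pts in buckets.values()
    ]

# Two nearby points fall into the same 1 m voxel; the third stands alone.
cloud = [(0.1, 0.1, 0.0), (0.2, 0.15, 0.05), (5.0, 5.0, 5.0)]
reduced = voxel_downsample(cloud, voxel_size=1.0)
print(len(reduced))
# → 2
```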
How do you see AR evolving?
Our vision is for AR to become the natural interface for the digital information space. In other words, AR is set to become a natural part of output units, enabling the real and digital worlds to merge. The user would do everything via AR. They would put on the headset and see everything in 3D, located in the right position, and with the ability to track objects, even those that are changing dynamically. But that is still a long way off.