HANNOVER, Germany – In preparation for mass customization, Japanese and German tech research officials today committed to expanding their joint work to establish a “social-technical, or maybe ‘cyber-social,’ environment where ‘digital companions’ and production lines communicate with humans” working in manufacturing, Andreas Dengel said in an interview with Smart Manufacturing magazine here at the CeBIT (Centrum für Büroautomation, Informationstechnologie und Telekommunikation) fair.
“Industry 4.0 has many consequences, not only for environments like factories but also active components, actuators, agents and human beings,” said Dengel, who heads the smart data & knowledge services department of the German Research Center for Artificial Intelligence (DFKI). “A digital companion is someone who is looking over your shoulder and observing your behavioral aspects—who is looking at what is relevant in a certain situation and tries to overcome complexities.”
End goals include reducing data volume, process complexity and the number of interactions—“just to concentrate on the most relevant part while you do some jobs.”
This becomes critical in the face of mass customization, Dengel noted.
“If you look at ‘batch size one’ production, it means individuality in the future. It also means the processes for configuring and maintaining things will change, and this requires a new way of educating: One, it is insight [into issues that machine learning uncovers best], and two, it is maybe also some companion who is taking over some roles and complementary aspects.”
DFKI worked on the technological innovation for the last year with Hitachi, which demonstrated it today at CeBIT.
The pact signed today broadens that work in Japan, starting with expanded use inside Hitachi, and makes it available to German companies that want insight into work activity.
Hitachi’s work looked at humans working on assembly lines, Hisashi Ikeda, GM of Hitachi’s Center for Technology Innovation-Systems Engineering, said in an interview with Smart Manufacturing magazine.
They were outfitted with eye trackers that measure visual attention and sensors on their arms that measure muscle contractions. “With this combination of body attention and visual attention, you can really control what’s going on, and the companion can on one hand monitor what you are doing and measure maybe your work load or cognitive load and on the other hand maybe control whether you are doing the right things,” Dengel said.
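The fusion Dengel describes—body attention from muscle sensors plus visual attention from eye trackers—can be sketched as a simple scoring function. The feature choices, weights, and normalization constants below are hypothetical illustrations, not the DFKI/Hitachi method; they show only the general idea of combining the two signal streams into one workload estimate.

```python
from statistics import mean, pstdev

def workload_score(fixation_ms, emg_rms, w_visual=0.5, w_muscle=0.5):
    """Combine visual attention (eye-tracker fixation durations, in ms)
    with body attention (normalized EMG muscle-contraction amplitudes,
    0-1) into a single 0-1 workload estimate. Longer, more erratic
    fixations and stronger muscle activation both push the score up.
    All thresholds here are illustrative assumptions."""
    # Visual load: mean fixation length plus its variability, each
    # clamped against an assumed ceiling (600 ms / 300 ms).
    visual = (min(1.0, mean(fixation_ms) / 600) * 0.7
              + min(1.0, pstdev(fixation_ms) / 300) * 0.3)
    # Muscle load: average normalized EMG amplitude.
    muscle = min(1.0, mean(emg_rms))
    return w_visual * visual + w_muscle * muscle

# Synthetic readings: a calm operator vs. a strained one.
calm = workload_score([220, 240, 210, 230], [0.2, 0.25, 0.22])
strained = workload_score([480, 620, 390, 550], [0.7, 0.8, 0.75])
```

In a real deployment the companion would compute such a score over a sliding window of sensor readings and flag the operator, or adapt instructions, when it stays elevated.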
The technology should help with the worldwide skilled labor problem in manufacturing, Ikeda said.
It should also help with productivity. “Many manufacturers have a problem with lost costs,” he said. “In test programs, we found we could apply this technology to decrease lost costs.”
All manufacturing segments will benefit from the technology, so long as they involve manual operations, Ikeda said.
“The technology is unlimited,” Dengel added. “Everywhere you use your body to do something, you can measure body activity and visual activity” and bring about improved processes.
Eye-tracking work pays off
DFKI came up with the idea for the social-technical environment inhabited by humans in manufacturing and their “digital companions,” he said.
Part of the effort grew out of DFKI’s work on eye tracking over the last decade.
“The eyes interface with the brain, and how you are looking and whether you are hesitating, whether you are interested—the eye muscles behave differently,” Dengel said. That can tell the manager of an assembly line a lot about a worker’s cognitive load.
By studying eye behavior, one can spot motivation, interest, excitement, problem investigation and comprehension, he said. “You can read all these things in your eyes.”
The work is likely to be used in the future to train robots, as well: After experts outfitted with cameras, eye trackers and muscle sensors show how a task is most efficiently performed, that intelligence can be used to program the bots “on what’s relevant in a certain situation,” Dengel said.
The work undertaken in Japan in the last year also incorporated what DFKI learned from working with graphics processing unit (GPU) maker NVIDIA on a form of machine learning called “deep learning,” which enables what he described as a super-human capability to recognize emotions in pictures and videos.
‘Deep learning’ demonstrated
In manufacturing, deep learning is used along with cameras to check whether a worker is handling the correct object or not, Dengel said.
At the Hitachi booth at CeBIT, Phong Nguyen of the company’s media systems research department demonstrated this in action today, pulling different colored water bottles out of a briefcase. He was outfitted with eye-tracking goggles and a muscle-contraction-sensing arm band.
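The check the demo performs—classify the object in the camera frame, then compare the label against the part the current step calls for—can be sketched as follows. This is a minimal stand-in, not Hitachi’s system: a trivial nearest-color matcher takes the place of the actual deep network, and the bottle labels and reference colors are hypothetical.

```python
def classify(frame_rgb_mean, reference_colors):
    """Stand-in for a deep-learning classifier: label the object by
    the nearest reference color (squared Euclidean distance in RGB).
    A real system would run a trained network on the full frame."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_colors,
               key=lambda label: dist(frame_rgb_mean, reference_colors[label]))

def check_step(frame_rgb_mean, expected_label, reference_colors):
    """Return True if the object the worker is handling matches the
    part the current assembly step calls for."""
    return classify(frame_rgb_mean, reference_colors) == expected_label

# Hypothetical reference colors for the demo's water bottles.
refs = {"blue_bottle": (40, 60, 200), "red_bottle": (200, 40, 40)}
check_step((45, 70, 190), "blue_bottle", refs)   # worker holds the right part
check_step((195, 50, 45), "blue_bottle", refs)   # worker grabbed the wrong one
```

The surrounding pipeline stays the same whatever the classifier is: the expected label comes from the assembly plan, and a mismatch triggers a warning to the worker.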
Not everything that results from the number crunching is strictly task oriented, however: “With the AI that comes from these tools, you can measure fatigue of the workers and use that to advise them when to take a rest,” Nguyen said.