Collaboration in industry

Mixed teams of humans and robots are becoming increasingly common in everyday working environments thanks to new technical developments. Combining the strengths of both enables production that is flexible, efficient and ergonomic at the same time.

“Robots and humans are going to work more closely together in the future,” says Johann Hegel, head of assembly technology development at Audi, with conviction during an expert round table at the Automatica trade fair. “The robot will soon be as familiar to employees as a cordless screwdriver.” The reason for this convergence, he believes, is demographic change. “Employees in the factories are to be supported even more ergonomically and efficiently,” says Hegel.

Robots noticing humans

With closer cooperation between humans and robots, the strengths of both can be combined: human beings can grasp complex situations and have unsurpassed capabilities for reacting, adapting and improvising, while robots provide speed and power at a consistent level of quality. This also makes it economical to produce a wide range of product variants with shorter lifecycles. For this cooperation to work, however, human beings must not be put at risk by robots. Sensors therefore enable collaborative robots to feel and see, so risks to human colleagues are excluded: the robot simply stops, or reduces its movements to a non-hazardous speed, when it is touched or when a human gets too close.
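
Purely as an illustration, the following Python sketch shows how such a safety rule might look in code; the sensor inputs, distance threshold and speed values are invented for this example and do not reflect any particular manufacturer's safety system.

    # Illustrative sketch of a collaborative-robot safety rule: stop on
    # contact, slow down on proximity. All thresholds are invented values.

    REDUCED_SPEED = 0.25   # hypothetical non-hazardous speed, m/s
    NORMAL_SPEED = 1.0     # hypothetical full working speed, m/s
    SAFE_DISTANCE = 0.5    # hypothetical proximity threshold, m

    def safe_speed(contact_detected: bool, human_distance_m: float) -> float:
        """Return the allowed robot speed for the current sensor readings."""
        if contact_detected:
            return 0.0                    # stop immediately on touch
        if human_distance_m < SAFE_DISTANCE:
            return REDUCED_SPEED          # slow to a non-hazardous speed
        return NORMAL_SPEED               # no human nearby: full speed

    # Example: a person approaches to 0.3 m without touching the robot.
    print(safe_speed(contact_detected=False, human_distance_m=0.3))  # -> 0.25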

Systems already being used

The first market-ready examples already exist today. One of the most recent innovations is the YuMi, a collaborative robot from ABB. It is equipped with touch-sensitive sensor technology that enables it to stop within milliseconds if it comes into contact with a human. Another example is the sensitive LBR iiwa from Kuka: precise, resilient, flexible and equipped with mechanics and drive technology designed for industrial applications, it can automate sensitive and complex assembly tasks that robots could not previously handle. “Instead of safety fences and separate processes, the robot becomes the direct work assistant of human beings,” says Manfred Gundel, CEO of Kuka Roboter GmbH.

The robot can even function as the “third hand” of the operator, particularly when it is fitted with the upper-arm exoskeleton from the BioRobotics Institute of the Scuola Superiore Sant’Anna in Pisa, Italy: via a data connection, the wearer moves the lightweight robot like his own arm by means of the sensor-guided exoskeleton. The exoskeleton's motors feed back the forces created by the iiwa's interaction with its environment, so the wearer feels, for example, pressure applied to the robot arm. Possible uses include telepresence and rehabilitation applications.
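
The force-feedback principle can be sketched in a few lines of Python; the joint-torque readings, the gain and all names below are assumptions made for illustration, not the actual control software of the exoskeleton.

    # Illustrative sketch of the exoskeleton force-feedback loop: joint
    # torques measured on the robot arm are scaled and replayed on the
    # exoskeleton motors so the wearer feels the contact. All names,
    # units and the scaling factor are assumptions for this example.

    FEEDBACK_GAIN = 0.3  # hypothetical scaling between robot and human arm

    def feedback_torques(robot_joint_torques: list) -> list:
        """Map torques sensed at the robot's joints to exoskeleton motor
        commands, attenuated so forces stay comfortable for the wearer."""
        return [FEEDBACK_GAIN * tau for tau in robot_joint_torques]

    # Example: the robot presses against a surface with its last joint.
    sensed = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0]  # Nm, one per joint
    print(feedback_torques(sensed))  # the wearer feels 0.6 Nm on that joint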

Robots reading thoughts

A project of the German Aerospace Center (DLR) takes this a step further: here, robots are even expected to read the thoughts of their human colleagues. For this purpose, the operator wears a cap fitted with electrodes, enabling the system to measure brain activity via electroencephalography (EEG) and to interpret specific changes in brain waves. These changes permit conclusions, for example, about the processing status of presented information, the operator's intentions or his cognitive capacity. The interface thus receives important information for actively supporting humans in critical situations and for increasing the user-specific effectiveness of the control. If the operator has overlooked a warning sent by the robot, for example, the system notifies him again; if the user is cognitively overburdened, his workload is reduced. To evaluate the operator's intended actions and workload precisely, the researchers rely not only on EEG but also on electromyography (EMG), which measures muscle activity, and on eye tracking, which registers the direction of gaze. Together these create a comprehensive picture of the user's cognitive state. From these data and the subsequent actions, the interface learns which sequences in the brain waves signify a perception or an action. In this way the system can adapt in real time to changing states of the user, and even automatically to new users.
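
To make the decision logic of such a multimodal interface concrete, here is a deliberately simplified Python sketch; the signal features, thresholds and assistance actions are invented placeholders rather than the DLR system's actual design.

    # Deliberately simplified sketch of multimodal operator monitoring:
    # combine EEG-derived workload, EMG-derived muscle activity and gaze
    # to decide whether to repeat a warning or reduce the task load.
    # Every feature name and threshold here is an invented placeholder.
    from dataclasses import dataclass

    @dataclass
    class OperatorState:
        eeg_workload: float      # 0..1, estimated cognitive load from EEG
        emg_activity: float      # 0..1, normalised muscle activation
        gaze_on_warning: bool    # eye tracker: did the gaze hit the warning?

    def assist(state: OperatorState, warning_active: bool) -> str:
        """Pick a support action from the fused operator state."""
        if warning_active and not state.gaze_on_warning:
            return "repeat_warning"       # warning likely overlooked
        if state.eeg_workload > 0.8:
            return "reduce_task_load"     # operator cognitively overburdened
        return "no_action"

    print(assist(OperatorState(0.9, 0.4, True), warning_active=False))
    # -> "reduce_task_load"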

