Intelligent technologies are also driving innovation in medical engineering. From smart trousers to artificial eyes – all over the world, new systems are being developed to treat illnesses and compensate for disabilities.
Making coffee, doing the laundry, tying shoes: for someone who has lost an arm and relies on an artificial limb, every day is full of hurdles. In future, intelligent artificial limbs will help people with disabilities regain their natural movement sequences. Scientists at Leibniz University in Hanover are researching the basic principles for equipping these technical aids with a level of sensitivity previously seen only in humans. If an artificial limb detects an as yet unknown action, it automatically searches the cloud for a similar pattern and adapts it. An app stores all movements and enables the artificial limbs of all users to learn from each other digitally.
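To picture the idea of a shared movement library, here is a minimal Python sketch: an unknown movement is compared against stored patterns and matched to the closest one. Everything here – the pattern names, the joint-angle representation and the simple Euclidean distance – is a hypothetical illustration of the concept, not the Hanover team's actual system.

```python
import numpy as np

def closest_pattern(unknown, library):
    """Return the stored pattern most similar to the unknown movement.

    unknown: array of shape (timesteps, joints) of joint angles
    library: dict mapping pattern name -> array of the same shape
    """
    best_name, best_dist = None, float("inf")
    for name, pattern in library.items():
        dist = np.linalg.norm(unknown - pattern)  # simple Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name, best_dist

# Toy "cloud" library shared by all users: 50 timesteps, 3 joints each.
rng = np.random.default_rng(0)
library = {
    "grasp_cup": rng.normal(size=(50, 3)),
    "tie_knot": rng.normal(size=(50, 3)),
}

# A newly observed movement: a noisy variant of a known pattern.
observed = library["grasp_cup"] + 0.1 * rng.normal(size=(50, 3))
name, dist = closest_pattern(observed, library)
print(f"best match: {name} (distance {dist:.2f})")
```

In the scenario the article describes, the adapted pattern would then be uploaded again, so that every user's prosthesis benefits from the match.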
A sense of touch for artificial limbs
Feeling is important if an artificial hand or arm is to work as closely as possible to the “real thing”: by feeling, humans detect heat, surface textures, pressure and much more. The skin acts as a large, multi-modal sensor. Teams of researchers all over the world are therefore working to replicate skin artificially in order to give artificial limbs a level of sensitivity as close as possible to nature. The first artificial skin in the world was developed by a team led by Professor Kim Dae-Hyeong of Seoul National University in Korea: “The synthetic skin has the sense of feeling that exactly copies human skin. The skin can feel pressure, temperature, strain, humidity.” It is made up of multiple layers: the base is a soft, rubber-like material, covered with an ultra-thin layer of polyimide, followed by silicon. The deformation of integrated gold threads allows pressure and tension to be measured, for example, while capacitors detect moisture. The team of researchers fitted the skin to an artificial hand, which was then able to shake hands, use a keyboard and hold a ball. For the skin to be truly useful, however, the sensor data must be transferred to the brain of the wearer so that commands are carried out in real time. To this end, the Korean researchers managed to establish a connection between the artificial skin and the brains of test animals by applying an electrode array to a nerve cord: the electrical impulses from the sensors are fed into the wearer's nerve tracts. “I hope robotic limbs with this synthetic skin can be used by disabled people. And for industrial uses, it can be applied to various types of robots such as humanoid robots,” says Professor Kim.
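One way to picture the last step – turning sensor readings into nerve impulses – is rate coding, where a stronger stimulus produces a faster pulse train. The sketch below assumes this scheme purely for illustration; the sensor ranges, field names and the linear mapping are inventions, not the Seoul team's published parameters.

```python
from dataclasses import dataclass

@dataclass
class SkinReading:
    """One multi-modal sample from the artificial skin."""
    pressure_kpa: float      # from deformation of the gold threads
    temperature_c: float
    strain_pct: float
    humidity_pct: float      # from the capacitive sensors

def pressure_to_pulse_rate(reading: SkinReading,
                           max_pressure_kpa: float = 100.0,
                           max_rate_hz: float = 200.0) -> float:
    """Map pressure linearly onto an impulse frequency (rate coding).

    Only the pressure channel is mapped here for brevity; the other
    modalities would get their own encodings.
    """
    level = min(max(reading.pressure_kpa / max_pressure_kpa, 0.0), 1.0)
    return level * max_rate_hz

reading = SkinReading(pressure_kpa=35.0, temperature_c=24.0,
                      strain_pct=2.0, humidity_pct=40.0)
print(f"stimulate nerve at {pressure_to_pulse_rate(reading):.0f} Hz")
```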
The brain controls the artificial limb directly
The international MoreGrasp consortium is working on communication between artificial limbs and the brain. “Until now, artificial limbs have been controlled via shoulder movements. In future, we aim to make the whole process more intuitive,” explains Dr Rüdiger Rupp, Head of Experimental Neurorehabilitation at the Spinal Cord Injury Centre of Heidelberg University Hospital. “After all, our hand movements are controlled by the brain. Connections known as brain-computer interfaces are now available, which enable us to detect intended movements via electrodes on the head. The dream we are now trying to make a reality is to enable paralysed patients to carry out hand movements using thought alone.” The special advantage of this new neuroprosthesis would be that, for the first time, patients could control both hands at the same time. People with high-level spinal cord injuries – where elbow and shoulder function is impaired in addition to hand function – could also benefit from the new MoreGrasp system, as shoulder movements are no longer required to control it.
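A toy sketch can make the brain-computer-interface idea concrete: electrodes on the head record voltages, a feature is extracted, and a classifier guesses the intended movement. The band-power feature and the mu-rhythm threshold below are generic textbook ingredients assumed here for illustration; they are not the MoreGrasp pipeline.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of one EEG channel in the band [lo, hi] Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def classify_intent(channels, fs=250):
    """Toy rule: imagining a movement suppresses the 8-12 Hz (mu) rhythm
    over the motor cortex, so low mu power is read as 'grasp intended'.
    The threshold is illustrative, not a calibrated value."""
    mu = np.mean([band_power(ch, fs, 8, 12) for ch in channels])
    return "grasp" if mu < 1.0 else "rest"

rng = np.random.default_rng(1)
eeg = rng.normal(scale=0.05, size=(8, 500))  # 8 channels, 2 s at 250 Hz
print(classify_intent(eeg))
```

Real systems calibrate such a classifier per user, since brain signals vary strongly from person to person.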
Robert Greenberg, CEO of Second Sight Medical Products, and his team are also working on a direct connection between artificial limbs and the brain: they aim to develop their Argus II artificial retina further and to place an implant directly in the visual cortex of the brain. Until now, the bionic eye has consisted of an implant mounted on the damaged retina and a pair of glasses with a camera, which sends visual information to the electrode array in the eye via a handheld computer. To do this, the signals are converted into impulses and sent wirelessly, via a transmitter on the glasses, to the implant, which the wearer cannot feel. This enables the patient to perceive flashes of light, to distinguish between light and dark, and to recognise areas and movement.
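The camera-to-implant path lends itself to a short sketch: a frame is reduced to one brightness value per electrode before being sent to the array. The Argus II array has 60 electrodes in a 6 × 10 grid; the simple average-pooling below is an assumption chosen to illustrate how coarse the resulting “image” is, not Second Sight's actual processing.

```python
import numpy as np

def frame_to_electrode_pattern(frame, rows=6, cols=10):
    """Average-pool a grayscale frame (2-D array, values 0-255) down to
    one brightness value per electrode, normalised to [0, 1]."""
    h, w = frame.shape
    pattern = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            pattern[r, c] = block.mean()
    return pattern / 255.0  # stimulation strength per electrode

# Toy frame: dark left half, bright right half.
frame = np.zeros((120, 200))
frame[:, 100:] = 255
print(frame_to_electrode_pattern(frame).round(1))
```

Sixty brightness values explain why patients perceive flashes and coarse areas rather than detailed images.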
The helping trousers
But technology doesn’t always have to replace limbs – often, smart systems simply support people with a disability. In the USA and in Germany, research is being carried out into gloves that can detect and translate the characters used in sign language. The gloves developed at Magdeburg-Stendal University, for example, use sensors to detect the bending of the wearer’s fingers and show the corresponding letters on a monitor. This enables deaf people to make themselves understood to people who are not familiar with sign language. Exoskeletons are another example of “helping systems”: they support paraplegic patients by replicating the movement of the legs using motors. These exoskeletons are cumbersome and heavy, however. British researchers are therefore developing soft robotic clothing – smart trousers equipped with artificial muscles, designed to support the movements of people with disabilities or elderly people. The trousers will also detect when a person loses their balance while walking and actively counteract this to prevent falls. The researchers aim to finish their work in three years. Dr Rory O’Connor from the Faculty of Medicine and Health at the University of Leeds is the clinical expert on the team: “We will be using very sophisticated soft materials with actuators integrated into their fibres, which can be used to move and support parts of the body.” These “motor” fibres are connected to an intelligent control system, which must be able to detect the intentions of the user, as Dr Abbas Dehghani from the School of Mechanical Engineering at the University of Leeds explains: “The system has to be able to work out what the user is trying to do; it wouldn’t be good at all if the trousers tried to help a person walk when they actually want to sit down.”
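Dr Dehghani's point about intent detection can be sketched as a small decision rule: before the actuators engage, the controller guesses what the wearer is trying to do. The sensor inputs and thresholds below are invented for illustration; a real soft exosuit would fuse far more signals and use learned models rather than hand-written rules.

```python
def infer_intent(trunk_lean_deg: float, knee_flex_deg: float,
                 seat_pressure: float) -> str:
    """Crude rule-based intent classifier (all thresholds hypothetical)."""
    if seat_pressure > 0.5:                      # already seated
        return "sitting"
    if trunk_lean_deg > 15 and knee_flex_deg > 40:
        return "sitting_down"                    # do NOT drive the legs
    return "walking"                             # safe to assist steps

def assist(intent: str) -> str:
    """Only engage the artificial muscles when the wearer wants to walk."""
    return "engage leg actuators" if intent == "walking" else "hold back"

# Three toy sensor samples: steady walking, lowering into a chair, seated.
for sample in [(5, 10, 0.0), (20, 60, 0.1), (2, 5, 0.9)]:
    intent = infer_intent(*sample)
    print(intent, "->", assist(intent))
```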
(picture credits: Shutterstock)