Systems that capture users’ vital parameters represent a form of Human Machine Interface that is relevant not only in medicine. They can be used to monitor a person’s health and to ensure that they are fit to operate machinery, such as a car, safely.
Tired? Angry? Inattentive? Or even sick? A person’s state of mind or wellbeing can significantly affect their ability to operate machinery, and with it safety. This is why HMIs in a growing number of domains are being equipped with technology for monitoring a user’s vital parameters. These interfaces are not necessarily there to receive and execute control commands from the user. They are mainly used to monitor the user’s condition and to trigger an action when certain changes in their vital parameters are registered.
The healthcare sector is the natural home of such HMIs: from the tiny pulse meter clipped onto a finger to sophisticated artificial-intelligence systems, HMI technology is an essential part of the assessment, monitoring and treatment of patients. Typically, sensors from various devices are attached to the patient’s skin to measure brain waves, impedance, motion, blood oxygen and temperature data. A local processor system can create individual warning messages for the patient based on the data obtained and automatically alert a caregiver if it detects unusual changes in the patient’s condition.
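The alerting logic described above can be pictured as a simple check of each measured parameter against clinically defined limits. The following sketch illustrates the idea in Python; the parameter names, limits and notification path are purely illustrative assumptions, not taken from any particular device.

```python
# Minimal sketch of threshold-based patient monitoring. Parameter names and
# limits below are illustrative assumptions, not values from a real system.

ALERT_LIMITS = {
    "spo2": (90, 100),            # blood oxygen saturation, percent
    "temperature": (35.0, 38.5),  # skin temperature, degrees Celsius
    "heart_rate": (45, 130),      # beats per minute
}

def check_vitals(sample: dict) -> list[str]:
    """Return a warning message for every parameter outside its limits."""
    warnings = []
    for name, (low, high) in ALERT_LIMITS.items():
        value = sample.get(name)
        if value is None:
            continue
        if value < low or value > high:
            warnings.append(f"{name} out of range: {value} (expected {low}-{high})")
    return warnings

def notify_caregiver(messages: list[str]) -> None:
    # Placeholder: a real system would page or message the care team.
    for msg in messages:
        print("ALERT:", msg)

# Example usage with a single set of readings
sample = {"spo2": 87, "temperature": 36.8, "heart_rate": 72}
warnings = check_vitals(sample)
if warnings:
    notify_caregiver(warnings)
```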
Recognising cardiac activity
Thanks to the success of smartphones and smartwatches, many of these parameters can now also be measured reliably wherever the wearer happens to be. This includes cardiac activity, where two methods have prevailed. The simplest is the single-channel ECG: here, two electrodes are integrated into a smartwatch, for example. The electrode on the back of the device is in contact with the wearer’s arm, and the second electrode on the top of the watch is activated by touching it with a finger of the other hand. In the automotive sector, so-called multi-touch ECGs are used. Here, the ECG sensor technology is integrated into various positions such as the steering wheel, gear lever or armrests. The system automatically detects which electrodes are in contact with the user. ECG measurements can thus take place unnoticed in the background, while the user retains a high degree of freedom of movement.
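How a multi-touch system might decide which electrodes to measure from can be sketched as follows. This is a hypothetical illustration only: it assumes each electrode reports a contact-quality score (for instance from a lead-off or impedance check), and the electrode names are invented for the example.

```python
# Hypothetical channel selection for a multi-touch ECG. Assumes each electrode
# reports a contact-quality score between 0 (no contact) and 1 (good contact).

ELECTRODES = ["steering_wheel_left", "steering_wheel_right", "gear_lever", "armrest"]

def select_ecg_pair(contact_quality: dict[str, float], threshold: float = 0.7):
    """Return the two electrodes with the best contact, or None if fewer than
    two electrodes exceed the quality threshold."""
    in_contact = [e for e in ELECTRODES if contact_quality.get(e, 0.0) >= threshold]
    if len(in_contact) < 2:
        return None  # not enough contact points for a measurement
    in_contact.sort(key=lambda e: contact_quality[e], reverse=True)
    return in_contact[0], in_contact[1]

# Example: only the left side of the steering wheel and the armrest are touched
quality = {"steering_wheel_left": 0.9, "steering_wheel_right": 0.2,
           "gear_lever": 0.1, "armrest": 0.8}
print(select_ecg_pair(quality))  # ('steering_wheel_left', 'armrest')
```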
Analysing vital parameters through light
Photoplethysmography (PPG), which measures a person’s heart rate optically using infrared light, has an entirely different operating principle. It detects how much of the light emitted by the system is reflected by the skin. This amount depends on how much blood flows through the superficial capillaries. Since the blood volume in the capillaries increases with each heartbeat, more light is absorbed and less reflected at that moment. The system converts the amount of reflected light into a pulse wave, and the heart rate can then be determined by analysing this pulse wave. If RGB cameras are used to capture the light, the respiratory rate and oxygen saturation can also be determined contactlessly by analysing the red, green and blue components of the PPG signals. Recent studies have shown that the pulse wave signal can also be measured with a camera placed a few centimetres to several metres away from the skin.
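A common way to turn that pulse wave into a heart rate is to band-pass filter the reflected-light signal around plausible pulse frequencies and count the peaks. The sketch below shows the idea, assuming the intensity values have already been sampled into an array; the filter settings and window length are illustrative choices, not a specific product’s method.

```python
# Minimal sketch of pulse-rate estimation from a PPG trace.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def heart_rate_from_ppg(ppg: np.ndarray, fs: float) -> float:
    """Estimate the heart rate in beats per minute from a raw PPG trace sampled at fs Hz."""
    # Band-pass around plausible pulse frequencies (0.7-3.5 Hz ~ 42-210 bpm).
    b, a = butter(3, [0.7, 3.5], btype="band", fs=fs)
    filtered = filtfilt(b, a, ppg)
    # Each heartbeat produces one pulse-wave peak; count them over the window.
    peaks, _ = find_peaks(filtered, distance=fs * 0.3)
    duration_s = len(ppg) / fs
    return 60.0 * len(peaks) / duration_s

# Example with a synthetic signal oscillating at 1.25 Hz (75 beats per minute)
fs = 50.0
t = np.arange(0, 30, 1 / fs)
synthetic = np.sin(2 * np.pi * 1.25 * t) + 0.1 * np.random.randn(t.size)
print(round(heart_rate_from_ppg(synthetic, fs)))  # roughly 75
```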
Radar-based sensors
Radar-based sensors can even capture heart and breathing rates through clothing and over a distance of several metres. Electromagnetic waves with a frequency of, for example, 60 gigahertz are emitted and reflected by the body. From the reflected waves, the sensor detects the tiny vibration of the skin caused by the pulse wave. Such systems are already used to monitor the driver’s condition in trucks, trains or aeroplanes.
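In such systems, the skin vibration shows up as a small change in the phase of the reflected wave, which can be converted into a displacement and then split into breathing and pulse components by filtering. The sketch below assumes the sensor already delivers an unwrapped phase signal; the sampling rate and frequency bands are illustrative assumptions.

```python
# Illustrative separation of breathing and pulse from a radar phase signal.
import numpy as np
from scipy.signal import butter, filtfilt

C = 3e8           # speed of light, m/s
F_CARRIER = 60e9  # 60 GHz carrier, as in the example above
WAVELENGTH = C / F_CARRIER

def displacement_from_phase(phase: np.ndarray) -> np.ndarray:
    """Convert radar phase (radians) into skin displacement in metres."""
    return phase * WAVELENGTH / (4 * np.pi)

def band(signal: np.ndarray, fs: float, low: float, high: float) -> np.ndarray:
    """Band-pass filter the signal between low and high (Hz)."""
    b, a = butter(3, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)

def rate_per_minute(signal: np.ndarray, fs: float) -> float:
    """Dominant frequency of the signal, expressed in cycles per minute."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    return 60.0 * freqs[np.argmax(spectrum)]

# Example usage with a measured phase trace sampled at 100 Hz:
# displacement   = displacement_from_phase(phase)
# breathing_rate = rate_per_minute(band(displacement, 100.0, 0.1, 0.6), 100.0)  # breaths/min
# heart_rate     = rate_per_minute(band(displacement, 100.0, 0.8, 3.0), 100.0)  # beats/min
```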
Cameras read emotions
Camera systems also offer several possibilities for monitoring vital parameters. In addition to photoplethysmography, they can also recognise a person’s state of consciousness. Special CMOS cameras – usually with a resolution of one to two megapixels – capture 30 or 60 frames per second in the infrared spectrum, depending on the model. A downstream system evaluates the images and analyses, for example, the driver’s direction of gaze or the frequency of eyelid closure. From this, conclusions can be drawn about distraction or increasing fatigue, and if necessary an alarm can be triggered. State-of-the-art solutions are able to recognise – in part thanks to AI – the smallest changes in behaviour, sleepiness, negative emotions and the possible influence of alcohol or drugs. Ultimately, such a system can form a complete picture of a person’s physical and emotional state.
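The eyelid-closure part of such a pipeline is often reduced to a simple statistic: the fraction of recent frames in which the eyes were (nearly) closed. The sketch below illustrates this with a PERCLOS-style check; it assumes an upstream image-analysis stage already outputs an eye-openness value per frame, and the thresholds are illustrative, not taken from any specific product.

```python
# Hedged sketch of a fatigue check based on eyelid closure. Assumes an upstream
# camera pipeline outputs an eye-openness value per frame (0 = closed, 1 = open).
from collections import deque

class DrowsinessMonitor:
    """Tracks the fraction of recent frames with (nearly) closed eyes."""

    def __init__(self, fps: int = 30, window_s: int = 60,
                 closed_threshold: float = 0.2, alarm_fraction: float = 0.15):
        self.frames = deque(maxlen=fps * window_s)
        self.closed_threshold = closed_threshold
        self.alarm_fraction = alarm_fraction

    def update(self, eye_openness: float) -> bool:
        """Add one frame; return True if the closure fraction suggests fatigue."""
        self.frames.append(eye_openness < self.closed_threshold)
        if len(self.frames) < self.frames.maxlen:
            return False  # not enough history yet to judge
        closed_fraction = sum(self.frames) / len(self.frames)
        return closed_fraction > self.alarm_fraction

# Example usage with values from a hypothetical eye tracker
monitor = DrowsinessMonitor(fps=30)
for openness in [0.9, 0.85, 0.1, 0.05, 0.8]:  # stream of per-frame values
    if monitor.update(openness):
        print("Fatigue warning: trigger an alert")
```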