Human Machine Interfaces in the automotive sector

Driver fatigue and distraction are common causes of accidents, which is why modern Human Machine Interfaces in the automotive sector include systems that monitor the condition of the occupants. These systems are an integral part of Euro NCAP's testing protocols, the EU's new General Safety Regulation for vehicles, and various other regulations around the world.

By 2030, the EU aims to halve the number of traffic deaths and injuries. This ambitious endeavour encompasses everything from mandating state-of-the-art vehicle technologies to modernising infrastructure. However, one factor plays a particularly significant role: the human. More than 90 percent of all accidents are caused by human error. In addition to violations such as speeding and driving under the influence of alcohol, it also matters whether the driver is tired or distracted. According to the European Commission, 10 to 20 percent of accidents and near-accidents occur as a result of fatigue.

Mandatory warning systems

To address this problem, the European Commission published a regulation in August 2021 that, since July 2022, mandates the use of Driver Drowsiness and Attention Warning (DDAW) systems. They assess the driver’s vigilance by analysing other vehicle systems such as steering and lane keeping and warn the driver if necessary.
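Indirect DDAW approaches of this kind often track statistics derived from steering behaviour, such as the steering reversal rate, which tends to change as vigilance drops. The following is a minimal illustrative sketch of such a heuristic, not any manufacturer's actual algorithm; the sampling interval and thresholds are assumptions:

```python
# Illustrative DDAW-style heuristic: drowsy drivers tend to steer less
# smoothly, producing abrupt direction reversals of the steering wheel.
# The 2-degree dead band and the sampling interval are made-up examples.
def steering_reversal_rate(angles, dt=0.1, min_delta=2.0):
    """Reversals per minute: direction changes in steering-wheel movement
    larger than min_delta degrees, from angles sampled every dt seconds."""
    reversals = 0
    prev_dir = 0  # +1 turning one way, -1 the other, 0 unknown
    for a0, a1 in zip(angles, angles[1:]):
        delta = a1 - a0
        if abs(delta) < min_delta:
            continue  # ignore small corrections inside the dead band
        direction = 1 if delta > 0 else -1
        if prev_dir and direction != prev_dir:
            reversals += 1
        prev_dir = direction
    minutes = len(angles) * dt / 60
    return reversals / minutes if minutes else 0.0
```

In practice, drowsiness metrics like these are validated against subjective sleepiness scales before any warning threshold is fixed; a single cut-off on one signal would not satisfy the regulation on its own.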

Keeping an eye on the eyes

However, relying solely on data from other vehicle systems is not necessarily sufficient to assess a driver’s condition. Therefore, from mid-2024 in the EU, new vehicles must be equipped with an Advanced Driver Distraction Warning (ADDW) system. The first generation of ADDW solutions relies primarily on the driver’s eye movements: a camera with a CMOS image sensor monitors the driver using invisible infrared light. “The infrared light generates a reflection on the cornea of the eye, which is captured by the camera,” explains Martin Wittmann, marketing director for the sensor division at OSRAM Opto Semiconductors. “By tracking the direction of gaze, we can see whether the driver is looking at the road. The size of the pupil also indicates how awake the driver is. Finally, we can also recognise when the driver becomes tired by the movements of the eyelids.” When this is the case, the system warns the driver and redirects their attention to the road.
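One common way eyelid movements are turned into a drowsiness measure is PERCLOS, the percentage of time the eyes are (nearly) closed over an observation window. The sketch below is illustrative and not tied to any specific product; the openness and alert thresholds are assumptions:

```python
# Illustrative PERCLOS sketch: given per-frame eyelid openness values from an
# eye tracker (0.0 = fully closed, 1.0 = fully open), compute the fraction of
# frames with the eye effectively closed. Both thresholds are assumptions.
def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which eyelid openness falls below
    closed_threshold, i.e. the eye counts as closed."""
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

def is_drowsy(eye_openness, perclos_limit=0.15):
    """Flag the driver as drowsy when eyes are closed too often."""
    return perclos(eye_openness) > perclos_limit
```

A real system would combine such a metric with gaze direction and pupil size, as described above, rather than rely on eyelid closure alone.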

Supplementary health monitoring

Fatigue and lack of attention are highly complex states, which is why the latest generation of solutions captures additional parameters beyond eye movement. The company Smart Eye has integrated the capture of vital signs into its driver monitoring software.

Using AI methods, the new function analyses several physiological signals to accurately determine the driver’s heart and respiratory rate. Smart Eye, in particular, uses remote photoplethysmography (rPPG), a contactless, camera-based method that measures fluctuations in light reflection from the skin to estimate heart rate. Another method is micro-movement analysis, which allows the software to detect subtle changes in movements associated with breathing or pulse that are not visible to the human eye. “By integrating heart and respiration rate detection into the driver monitoring system software, we provide an even deeper layer of insight into driver state and health,” says Henrik Lind, Chief Research Officer at Smart Eye. This can be lifesaving if, for example, a driver suffers a heart attack or seizure.
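The rPPG principle can be illustrated with a simple frequency-domain sketch: the mean intensity of a skin region in the camera image varies slightly with each heartbeat, so the dominant spectral peak within a plausible pulse band yields a heart-rate estimate. This is a toy illustration of the general method, not Smart Eye's implementation; the band limits and region choice are assumptions:

```python
# Toy rPPG sketch: estimate heart rate from the mean green-channel intensity
# of a facial skin region over time. Blood volume changes modulate light
# reflection, so the pulse appears as a spectral peak in the 0.7-4 Hz band
# (42-240 bpm). This is a simplified illustration, not a product algorithm.
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate in bpm from a 1-D trace of mean green intensities."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)     # plausible human pulse range
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                          # Hz -> beats per minute

# Usage with a synthetic 1.2 Hz (72 bpm) pulse riding on a constant baseline:
fps = 30.0
t = np.arange(0, 10, 1.0 / fps)
trace = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
```

Real rPPG pipelines additionally have to suppress motion and illumination artefacts, which is where the AI methods mentioned above come in.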

Combining radar, camera and AI

A more accurate capture of the driver’s condition is enabled by multi-sensor systems, such as those being developed jointly by emotion3D, Chuhang Tech and SAT. The “human analysis” software from emotion3D derives information about the driver from camera images, while Chuhang Tech’s radar solutions analyse the driver’s vital parameters. These two measurement methods are combined with SAT’s algorithms for predicting sleep onset. Wogong Zhang, CTO and co-founder of Chuhang Tech, says: “We believe that our combined solution, which combines radar technology with advanced imaging algorithms, will revolutionise fatigue detection.”
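Multi-sensor setups of this kind typically combine each subsystem's output at a late stage. As a purely hypothetical sketch (the partners' actual fusion scheme is not described here), a weighted combination of per-sensor drowsiness estimates might look like this:

```python
# Hypothetical late-fusion sketch: each subsystem (camera analysis, radar
# vital signs, sleep-onset prediction) reports a drowsiness probability in
# [0, 1], and a weighted average yields the combined estimate. The weights
# are illustrative; a production system would learn them from data.
def fuse_scores(camera_p, radar_p, sleep_model_p, weights=(0.5, 0.2, 0.3)):
    """Weighted average of per-sensor drowsiness probabilities."""
    scores = (camera_p, radar_p, sleep_model_p)
    return sum(w * p for w, p in zip(weights, scores))
```

The advantage of fusing independent modalities is robustness: a camera occluded by sunglasses or a noisy radar reading degrades the estimate gracefully instead of silencing the system.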

Safety for automated driving

Driver monitoring systems are becoming increasingly important in view of the growing automation of driving. As a vehicle becomes more autonomous, better safety systems are needed – for example, to monitor whether a driver is ready to take over control of the car in a difficult situation. “Particularly well-functioning systems, especially in areas such as adaptive cruise control and lane keeping, tempt many road users to turn to tasks other than driving,” said Jann Fehlauer, Managing Director of DEKRA Automobil, at the presentation of the DEKRA Road Safety Report 2023. Several serious accidents have already been the result of such a misjudgment.

Driver monitoring still faces resistance

The majority of drivers are still sceptical about electronic monitoring of the driver's condition, known as driver monitoring. At least that is what a study by the insurance company Allianz shows. It reports that only 39 percent of those surveyed agree to camera or infrared scanning of the eyes, face or head, even where the technology works anonymously and detects only distraction. “We still have work to do in persuading drivers to accept driver monitoring,” says Christoph Lauterwasser, head of the Allianz Center for Technology. “It should not be about patronising, but about support. The latest vehicle and traffic technologies enable us to warn drivers when they are distracted. This feedback alone can contribute to a positive change in behaviour. We should use this to make road traffic safer for all of us.”

1.3 million people die each year in traffic accidents around the world, according to estimates by the World Health Organization (WHO).

In accidents in Germany in 2021 where distraction played a role, 8,233 people were injured and 117 died, which is just under five percent of all fatalities (2,562).

Source: Federal Statistical Office of Germany
