Artificial intelligence is a crucial technology for autonomous vehicles. Adaptive control systems make it possible to process the immense data sets delivered by the sensors monitoring the vehicle’s surroundings and then determine which actions to take.
For a vehicle to drive autonomously, it is not enough to simply equip it with a large number of sensors for detecting the immediate surroundings. It must also be able to handle the huge volumes of data, and to do so in real time. This overburdens conventional computer systems. The solution comes from electronics and software that provide the means for imitating the functions of the human brain. Artificial intelligence (AI), cognitive computing and machine learning are terms used to describe different aspects of these types of modern computer systems. “In essence, it is all about emulating, supporting and expanding human perception, intelligence and thinking using computers and special software,” says Dr Mathias Weber, IT Services Section Head at the German digital industry association Bitkom.
Nowadays, artificial intelligence is in everyday use; for instance, it is embedded in digital assistants like Siri, Cortana and Echo. The basic assumption behind AI is that human intelligence is the product of many different computations, which means artificial intelligence can likewise be built in different ways. There are now systems whose main purpose is to detect patterns and act on them. In addition, there are variants known as knowledge-based AI systems, which attempt to solve problems using the knowledge stored in a database. Other systems, in turn, use methods derived from probability theory to respond appropriately to given patterns. “An artificial-intelligence system continuously learns from experience and by its ability to discern and recognise its surroundings,” says Luca De Ambroggi, Principal Automotive and Semiconductor Analyst at IHS Technology. “It learns, as human beings do, from real sounds, images, and other sensory inputs. The system recognises the car’s environment and evaluates the contextual implications for the moving car.” In terms of AI systems built into infotainment and driver assistance systems alone, IHS expects sales to increase to 122 million units by 2025. By comparison, the 2015 figure was only 7 million.
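The contrast between the two families of systems described above can be sketched in a few lines of Python. This is a toy illustration, not production code: the rules, signals and likelihood values are entirely hypothetical.

```python
# Toy sketch of two AI approaches: knowledge-based lookup vs probabilistic scoring.
# All rules, signal names and likelihood values are illustrative assumptions.

# 1) Knowledge-based: try to solve the problem using knowledge stored in a database.
KNOWLEDGE_BASE = {
    ("red_light", "intersection"): "stop",
    ("green_light", "intersection"): "proceed",
}

def knowledge_based_action(observations):
    """Return a stored action if the rule base covers the situation, else None."""
    return KNOWLEDGE_BASE.get(tuple(observations))

# 2) Probabilistic: score candidate actions by how well they fit the observed pattern.
ACTION_LIKELIHOODS = {
    "stop":    {"red_light": 0.95, "green_light": 0.05},
    "proceed": {"red_light": 0.05, "green_light": 0.95},
}

def probabilistic_action(signal):
    """Pick the action with the highest likelihood given the observed signal."""
    return max(ACTION_LIKELIHOODS, key=lambda a: ACTION_LIKELIHOODS[a][signal])

print(knowledge_based_action(["red_light", "intersection"]))  # stop
print(probabilistic_action("green_light"))                    # proceed
```

The first approach fails silently on situations outside its rule base; the second always produces a best guess, which is why real systems typically combine several of these techniques.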
New Processors for Artificial Intelligence
The roll-out of artificial intelligence also has a direct impact on processor technology: conventional computing cores, CPUs, are being replaced with new architectures. Graphics processing units (GPUs) have been viewed as a crucial technology for AI for several years. CPU architectures perform tasks one after another, whereas GPUs, with their numerous small and efficient computing units, process tasks in parallel, making them much faster where large volumes of data are concerned. The control algorithms on the new chips already contain elements of neural networks, which are used in self-learning machines. A neural network of this type consists of artificial neurons and is modelled on the human brain in both structure and function. This structure is what enables a neural network to produce such lifelike results.
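The difference between the two execution models can be sketched as follows. This is a conceptual illustration only: Python threads do not deliver true GPU-style parallelism, and the workload is a hypothetical stand-in for per-sample sensor processing.

```python
# Conceptual sketch: a CPU-style sequential pass vs a GPU-style data-parallel
# pass over the same batch. The workload and batch contents are hypothetical.
from concurrent.futures import ThreadPoolExecutor

readings = list(range(8))  # stand-in for a large batch of sensor samples

def process(sample):
    """Stand-in for one per-sample computation (e.g. a filter step)."""
    return sample * sample

# CPU model: one core walks the batch item by item, in order.
sequential = [process(s) for s in readings]

# GPU model: many small units each handle one item concurrently.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(process, readings))

assert sequential == parallel  # same result, different execution model
```

On a real GPU, the per-item work is identical across thousands of hardware units, which is exactly the shape of workload that sensor-data and neural-network processing produce.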
Tyres with AI
In 2016, tyre manufacturer Goodyear introduced the Eagle 360 Urban, a concept for a spherical tyre featuring artificial intelligence. With the aid of a bionic “outer skin” containing a sensor network, along with a weather-reactive tread, the tyre can act directly on the information it collects and feed it into the driving experience. It connects and combines this information, processing it immediately via a neural network running self-learning algorithms. This allows the Eagle 360 Urban to make appropriate decisions in standard traffic situations. Its artificial intelligence helps it to learn from previous experience, enabling it to continuously optimise its performance. The tread, for example, adds grooves in wet conditions and firms up again when dry.
Adaptive Control Systems
Like human beings, cognitive computing systems can integrate information from their immediate surroundings – though rather than eyes, ears and other senses, they use sensors such as cameras, microphones or measuring instruments for this purpose. The new processor architectures give vehicles the ability to evaluate these huge data volumes, and to constantly improve and expand these evaluations. This machine learning is seen as a key technology on the road to artificial intelligence. Machine learning also includes deep learning, which interprets signals not by relying on fixed mathematical rules, but rather on knowledge gained from experience. Here, the software systems change their own behaviour through experimentation: the behaviour that leads most reliably to a desired result “wins”.
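The “winning behaviour” idea above can be sketched in a few lines. This is a minimal illustration with made-up outcome data, not a real learning system: the system records which result each tried action produced and increasingly prefers the action with the best track record.

```python
# Minimal sketch of learning from experience: the behaviour that most
# reliably led to the desired result "wins". Outcome data is hypothetical
# (1 = desired result achieved, 0 = not achieved).
from collections import defaultdict

OUTCOMES = {"brake": [1, 1, 0, 1], "swerve": [1, 0, 0, 0]}

scores = defaultdict(list)
for action, results in OUTCOMES.items():
    for r in results:             # "experiment" with each behaviour
        scores[action].append(r)  # remember the observed outcome

def best_action():
    """Return the behaviour with the highest observed success rate."""
    return max(scores, key=lambda a: sum(scores[a]) / len(scores[a]))

print(best_action())  # brake
```

Real deep-learning systems apply the same principle at vastly larger scale, adjusting millions of network weights instead of a simple success tally.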
Several automotive suppliers are now offering control systems pre-equipped with deep learning capabilities. Contemporary electronic control units (ECUs) in vehicles generally consist of various processing units, each of which controls a system or a specific function. The computing power of these units will no longer be adequate for autonomous driving. AI-based control units, on the other hand, centralise the control function. All information from the various data sources of an autonomous vehicle – including from infrastructure or from other road users – is gathered here and processed on a high-performance AI computing platform. In this way, the control system comes to “understand” the full 360-degree environment surrounding the vehicle in real time. It knows what is happening around the vehicle and can use this to deduce actions. Jensen Huang, CEO of Nvidia, whose company partners with various automotive manufacturers to develop control systems of this type, is certain of one thing: “Artificial intelligence is the essential tool for solving the incredibly demanding challenge of autonomous driving.”
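The centralised architecture described above can be sketched as a single unit that fuses every data source into one environment model and derives an action from the combined picture. All class names, sources and thresholds here are illustrative assumptions, not any supplier’s actual design.

```python
# Hedged sketch of a centralised AI control unit: all data sources feed one
# environment model, and actions are derived from the fused picture.
# Source names, detection fields and the 10 m threshold are hypothetical.
from dataclasses import dataclass, field

@dataclass
class EnvironmentModel:
    """360-degree picture assembled from all incoming data sources."""
    objects: list = field(default_factory=list)

    def ingest(self, source, detections):
        """Fuse detections from one source (sensor, infrastructure, road user)."""
        for d in detections:
            self.objects.append({"source": source, **d})

    def decide(self):
        """Toy policy: brake if any fused detection is closer than 10 m."""
        if any(o["distance_m"] < 10 for o in self.objects):
            return "brake"
        return "maintain_speed"

model = EnvironmentModel()
model.ingest("camera", [{"type": "pedestrian", "distance_m": 8}])
model.ingest("v2x",    [{"type": "vehicle",    "distance_m": 40}])
print(model.decide())  # brake
```

The key design point is that the decision is taken once, over the fused model, rather than separately in per-function ECUs that each see only their own sensor.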