Sensors, computing power and the ability to learn are the technological foundations of autonomous vehicles. The more functions that are taken over by technology, the higher the level of automation – right up to the completely driverless vehicle.
The roots of autonomous vehicles reach further back than is generally assumed: as early as the start of the 20th century, Elmer Sperry developed the first gyrocompass-controlled steering system, enabling ships to be kept on course automatically. Then, in 1928, the first automated aeroplane control system, developed by Johann Maria Boykow, was showcased at the International Air Exhibition in Berlin. However, true autonomous driving requires far more than simply keeping a vehicle on a set course: the vehicle must be able to reach a specified destination independently, without human control or detailed programming. In doing so, it must be able to respond to both obstacles and unforeseen events.
From assistance systems to self-driving vehicles
The path to a fully autonomous system is gradual, with developments on a sliding scale. A classification with six levels of automation is now recognised worldwide; it was defined for cars by, among others, SAE International (Society of Automotive Engineers), but is now also used for other vehicle segments. According to this scale, level 0 corresponds to a vehicle without any assistance system, where the driver is solely responsible for all functions. At level 1, the first assistance systems, such as cruise control, support the driver. Partly automated vehicles with parking and lane-guidance systems, which can already carry out automated steering manoeuvres, constitute level 2. At level 3, the vehicle controls itself for the most part, and the driver no longer has to oversee it at all times. The fully automated vehicles classed as level 4 can master even high-risk situations without human help, but are restricted to known sections of road. Only at level 5 do we find completely autonomous driving, in every environment and in all situations. Within limited areas such as agriculture, intralogistics, light-rail systems or mining, highly and fully automated vehicles at levels 3 and 4 have been in use for quite some time. On public roads, however, only cars up to level 3 are currently found. The first series-production cars that can cope without a driver in real road traffic, at least under specific conditions (level 4), are set to be on offer from 2020 onwards.
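To make the scale easier to grasp, the following minimal Python sketch encodes the six levels as an enumeration. The identifier names and the small helper function are purely illustrative and mirror the terminology used above rather than any official standard or API.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The six degrees of automation described above (illustrative naming)."""
    NO_AUTOMATION = 0      # driver is solely responsible for all functions
    ASSISTED = 1           # first assistance systems, e.g. cruise control
    PARTLY_AUTOMATED = 2   # automated steering manoeuvres, e.g. parking, lane guidance
    HIGHLY_AUTOMATED = 3   # vehicle largely drives itself; no constant supervision
    FULLY_AUTOMATED = 4    # copes with high-risk situations, but only on known roads
    AUTONOMOUS = 5         # driverless in every environment and situation

def driver_must_supervise(level: AutomationLevel) -> bool:
    """Up to level 2, the driver still has to monitor the vehicle at all times."""
    return level <= AutomationLevel.PARTLY_AUTOMATED

print(driver_must_supervise(AutomationLevel.HIGHLY_AUTOMATED))  # False
```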
Sensors pick up on the surroundings
In order for a vehicle to reach its destination autonomously, it requires various capabilities. First of all, it must perceive the environment through which it is moving; otherwise it would fail at the very first obstacle. To prevent this, autonomous vehicles are equipped with a wide range of sensors. Ultrasound sensors are required for automated driving particularly for detecting the close-up surroundings up to six metres away and at low speeds, for example when parking. Radar sensors provide important information on the environment at greater distances, through 360 degrees. The main task of a radar sensor is to detect objects and to measure their speed and position relative to the vehicle on which it is fitted. Lidar sensors are a relatively new addition: they “scan” the environment with invisible laser light and can generate a high-resolution 3D map of the surroundings. Video sensors, above all in stereo-video cameras, supply additional important visual information such as the colour of an object. Each of these sensor systems has its strengths and weaknesses. In order to obtain an image of the environment that is as accurate and reliable as possible, autonomous vehicles combine several of these sensors, depending on the application, and the data from them is “fused”, or merged together.
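What such “fusing” of sensor data can mean in its simplest form is sketched below in Python. The readings and standard deviations are invented for illustration; the sketch only demonstrates the general idea of inverse-variance weighting, in which more precise sensors contribute more to the combined estimate, not the fusion method of any particular vehicle.

```python
import math

# Hypothetical range readings (metres) and assumed standard deviations for
# each sensor type; the numbers are illustrative, not manufacturer figures.
readings = {
    "ultrasound": (4.80, 0.30),  # reliable only at close range
    "radar":      (5.10, 0.50),
    "lidar":      (4.95, 0.10),  # assumed to give the most precise range here
}

def fuse_ranges(readings):
    """Inverse-variance weighting: more precise sensors dominate the fused estimate."""
    weights = {name: 1.0 / sigma ** 2 for name, (_, sigma) in readings.items()}
    total = sum(weights.values())
    fused = sum(w * readings[name][0] for name, w in weights.items()) / total
    fused_sigma = math.sqrt(1.0 / total)
    return fused, fused_sigma

distance, uncertainty = fuse_ranges(readings)
print(f"fused distance: {distance:.2f} m ± {uncertainty:.2f} m")
```

In practice, far more elaborate techniques, such as Kalman filters operating on complete object tracks, are used rather than fusing single distance values.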
High-resolution maps via the cloud
As well as the ability to “see” its surroundings, an autonomous vehicle must also be able to navigate. Thanks to satellite navigation systems such as GPS, the vehicle knows where it is currently located and can calculate its route from this information. In doing so, it relies on high-resolution maps that are kept extremely up to date: these maps not only show the road topology but, wherever possible, also incorporate current events such as traffic jams dynamically. The maps can be stored locally in the vehicle or in the cloud. In the latter case, a high-performance communication link is essential so that the map data can be updated in real time. The 5G mobile telecommunications standard, for example, can form the basis for this: it enables a “tactile Internet” that, in addition to transmission rates in excess of ten gigabits per second, guarantees an ultrafast response with a delay of less than one millisecond. With this kind of networking, the almost unlimited resources of cloud computing can be called on for complex calculations such as analysing the driving situation or finding a route.
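Route finding on such a map can be pictured as a shortest-path search over a road graph. The toy graph and travel times below are made up for illustration; the sketch uses Dijkstra’s algorithm, a classic technique for this kind of calculation, rather than the specific method of any real navigation system.

```python
import heapq

# A toy road graph: node -> list of (neighbour, travel time in minutes).
# Nodes and times are invented purely for illustration.
road_graph = {
    "depot":       [("junction", 4), ("ring_road", 7)],
    "junction":    [("ring_road", 2), ("destination", 8)],
    "ring_road":   [("destination", 3)],
    "destination": [],
}

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: returns (total travel time, list of visited nodes)."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph[node]:
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None

print(shortest_route(road_graph, "depot", "destination"))
# -> (9, ['depot', 'junction', 'ring_road', 'destination'])
```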
Learning as a basis for the correct response
Analysing the huge data volumes generated by the vehicle’s sensor systems requires considerable computing capacity, as does interpreting the situations that arise. Technologies grouped under the heading of “artificial intelligence” are therefore becoming increasingly important. Machine learning in particular is an essential part of an autonomous system: only with it can vehicles act intelligently and independently of humans. Through machine learning, autonomous systems generate new knowledge from the data that is collected and provided to them, and are able to constantly extend their knowledge base. Without this independent learning, it would be almost impossible to specify in advance, in program code, an appropriate reaction to every theoretically possible situation.
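As a minimal illustration of what learning from collected data can mean, the following Python sketch trains a small decision-tree classifier (using the widely available scikit-learn library) to label detected objects from a handful of invented feature vectors. Real systems learn from vastly larger data sets and use far more sophisticated models, such as deep neural networks.

```python
from sklearn.tree import DecisionTreeClassifier

# Invented features per detected object: [length in m, speed in m/s, radar cross-section].
X = [
    [4.5, 13.0, 10.0],   # car
    [4.2, 20.0, 12.0],   # car
    [0.5,  1.5,  0.2],   # pedestrian
    [0.6,  2.0,  0.3],   # pedestrian
    [1.8,  6.0,  1.0],   # cyclist
    [1.7,  5.0,  0.8],   # cyclist
]
y = ["car", "car", "pedestrian", "pedestrian", "cyclist", "cyclist"]

# The model derives its own decision rules from the examples instead of
# having every possible case programmed in explicitly.
model = DecisionTreeClassifier().fit(X, y)

# The trained model can then label an object it has never seen before.
print(model.predict([[1.9, 5.5, 0.9]]))  # expected: ['cyclist']
```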