No one is more attentive in traffic than a driver assistance system. After all, its optical and radar-based sensor systems capture the environment far more accurately than even an experienced driver ever could. Using the data from the numerous camera, radar, lidar, and ultrasonic systems installed and networked in the car, algorithms determine control strategies in a fraction of a second in order to guide the vehicle optimally through a risky situation. And they do so with exceptional precision. It is therefore not surprising that driver assistance systems, known in the jargon as ADAS (Advanced Driver Assistance Systems), demonstrably reduce the risk of accidents on the road. With each additional ADAS function, automotive developers come one step closer to the vision of accident-free driving. But the journey there is every bit as difficult as one might imagine.
This is especially true for autonomous driving. With the help of agile development methods, engineers have made great strides, but they are still far from mastering all of the technical requirements. In pilot projects on public roads under known and clearly circumscribed conditions, self-driving vehicles already demonstrate an economical and safe driving style at low speeds. In contrast to driver assistance systems with their precisely defined tasks, however, an autonomous vehicle must be able to master all driving situations and completely replace the driver. Moreover, the conditions that are critical for ADAS and autonomous driving are not necessarily the same as those critical for human drivers, and they are not yet fully understood.
“It would be impossible to perform the necessary tests for ADAS on the road. That’s why we developed PEVATeC.” Frank Sayer, Senior Manager Virtual Vehicle Development
Autonomous driving still requires extensive testing. Scientists at the US think tank RAND Corporation estimate, for example, that fully autonomous vehicles would have to drive hundreds of millions and in some cases hundreds of billions of miles in order to test the individual systems and their interactions in a robust and meaningful way. They calculate, for instance, that some eleven billion miles would be needed to show that an autonomous vehicle reduces the risk of a fatal accident by 20 percent compared with a human driver. If 100 test vehicles were in use 24 hours a day, seven days a week, the test drives would take around 500 years at an average speed of 25 miles per hour and roughly 250 years at an average speed of 50 miles per hour. Such timeframes and costs are manifestly incompatible with product development.
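The arithmetic behind these figures is straightforward. A quick back-of-the-envelope check (a sketch of the fleet calculation above, not part of the RAND study itself):

```python
# Back-of-the-envelope check of the fleet-test timeframes cited above.
TEST_MILES = 11e9           # miles from the RAND estimate
FLEET_SIZE = 100            # test vehicles in continuous use
HOURS_PER_YEAR = 24 * 365   # around the clock, seven days a week

for avg_speed_mph in (25, 50):
    fleet_miles_per_year = FLEET_SIZE * avg_speed_mph * HOURS_PER_YEAR
    print(f"{avg_speed_mph} mph: ~{TEST_MILES / fleet_miles_per_year:.0f} years")

# Output:
# 25 mph: ~502 years
# 50 mph: ~251 years
```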
Even in the case of semi-autonomous driving functions, a host of engineers would have to test the ADAS systems over several years in order to validate every conceivable scenario. Frank Sayer is well aware that this would be neither economically justifiable nor feasible, not to mention extremely dangerous for other road users. “It would be impossible to do this on the road,” explains the Senior Manager Virtual Vehicle Development at Porsche Engineering. The idea, therefore, is to use digitalization and extensive computer simulations to transfer many of those test miles to the lab, namely to the Porsche Engineering Virtual ADAS Testing Center (PEVATeC). In the years to come, PEVATeC will create virtual worlds that encompass all relevant situations on the road and thus serve as test cases for the algorithms and sensors used in driver assistance systems.
Reproducing critical situations
Test drives in a simulated environment are not only cheaper, faster, and less demanding organizationally; they also make it possible to reproduce and modify critical situations from real road traffic. Furthermore, simulation can help to discover new critical scenarios that human drivers have not yet encountered or understood, but that are crucial for ensuring safety in every possible use case of sensor-based autonomous driving.
Beyond real-time capability, the virtual realities created must also produce physically realistic effects. Digitally reproduced objects such as roads, sidewalks, house walls, and vehicles must have exactly the same properties as their counterparts in actual road traffic; only then can they provide the camera, lidar, radar, and ultrasound systems with realistic input. The magic words are “physically based rendering”: conventional rendering methods simulate properties such as surface structure, color gradation, and light sources in a simplified, resource-saving way. Physically based rendering, by contrast, is a proven method for realistically imaging the reflection and refraction of light on three-dimensional objects. The main task here is to represent physically correct distribution patterns of light.
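To make the idea concrete, here is a minimal sketch of physically based shading (illustrative only; it is not PEVATeC’s renderer, and a production model would add microfacet distribution and geometry terms): an energy-conserving Lambertian diffuse term plus a Schlick Fresnel approximation determines how much light a surface point reflects toward the camera.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade(normal, light_dir, view_dir, albedo, f0=0.04, light_radiance=1.0):
    """Minimal physically based shading of one surface point (a sketch).

    Energy-conserving Lambertian diffuse (albedo / pi) plus a Schlick
    Fresnel term that shifts reflection toward grazing angles, as real
    materials do. A full microfacet BRDF would add normal-distribution
    and geometry terms; those are omitted here for brevity.
    """
    n, l, v = map(normalize, (normal, light_dir, view_dir))
    n_dot_l = max(np.dot(n, l), 0.0)   # incident cosine (Lambert's law)
    h = normalize(l + v)               # half vector between light and view
    fresnel = f0 + (1.0 - f0) * (1.0 - max(np.dot(h, v), 0.0)) ** 5
    diffuse = (albedo / np.pi) * (1.0 - fresnel)
    return (diffuse + fresnel) * light_radiance * n_dot_l

# A light-gray wall lit from 45 degrees above, viewed head-on:
print(shade(normal=np.array([0.0, 0.0, 1.0]),
            light_dir=np.array([0.0, 1.0, 1.0]),
            view_dir=np.array([0.0, 0.0, 1.0]),
            albedo=0.8))
```

The division by pi is what makes the diffuse term energy-conserving: integrated over the hemisphere, the surface never reflects more light than it receives, which is precisely the “physically correct distribution pattern” mentioned above.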
In order to minimize the differences between real and virtual driving tests, the engineers at PEVATeC are working intensively on material definitions that are as physically accurate as possible, as well as on algorithms that reproduce light as realistically as possible. This is important to prevent driver assistance systems from misjudging situations due to factors such as dirty camera lenses or multiple reflections of radar waves. For this reason, the effect of weather conditions and lighting on the camera-based sensors in a vehicle, for example, can be simulated at the touch of a button. “This also includes the effects of a low sun, a wet and reflective road surface, and a snow-covered road surface,” explains Sayer.
Including dynamic objects
In the future, even the road surface, with all its unevenness, will be calculated just as realistically as the consequences of a dirty camera lens. Conducting tests under such varied conditions on real roads would be difficult to achieve in practice. Developers also have numerous virtual objects such as trees and everyday items at their disposal to make the street environment as realistic as possible. After all, autonomous vehicles have to recognize potential risks even where the course of the road is confusing. This includes the ability to integrate dynamic objects into the simulation: people, cyclists, and other road users who should move naturally in the digital 3D world.
Simulink, ROS, and OpenDRIVE can be connected to PEVATeC via data interfaces.
Comparing individual scenarios between real and virtual driving tests allows conclusions to be drawn about the accuracy of the overall simulation. This creates an increasingly precise basis for optimizing the sensor systems in the vehicle through simulation, for example by virtually testing different installation locations for an ultrasonic sensor. This enables rapid validation and calibration of optical and radar-based sensors. Data interfaces to Simulink, ROS, and OpenDRIVE, for example, are available to all departments involved in the development process so that the results can later be integrated into the simulation of the entire vehicle.
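On the ROS side, such an interface could look like the following sketch (node and topic names are hypothetical assumptions, not PEVATeC’s actual interface; the sensor_msgs/Image message type is the ROS standard): the simulation publishes each synthetic camera frame so that the same perception code can consume real or simulated input unchanged.

```python
#!/usr/bin/env python
# Hypothetical sketch: feeding synthetic camera frames from a simulation
# into ROS. Node and topic names are illustrative assumptions.
import numpy as np
import rospy
from sensor_msgs.msg import Image

def to_image_msg(frame, stamp):
    """Wrap an HxWx3 uint8 RGB array in a standard sensor_msgs/Image."""
    msg = Image()
    msg.header.stamp = stamp
    msg.header.frame_id = "sim_camera"   # assumed frame name
    msg.height, msg.width = frame.shape[:2]
    msg.encoding = "rgb8"
    msg.step = msg.width * 3             # bytes per image row
    msg.data = frame.tobytes()
    return msg

def main():
    rospy.init_node("sim_camera_bridge")
    pub = rospy.Publisher("/sim/camera/image_raw", Image, queue_size=1)
    rate = rospy.Rate(30)                # a 30 fps camera
    while not rospy.is_shutdown():
        # Stand-in for a frame rendered by the simulation engine:
        frame = np.zeros((720, 1280, 3), dtype=np.uint8)
        pub.publish(to_image_msg(frame, rospy.Time.now()))
        rate.sleep()

if __name__ == "__main__":
    main()
```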
Another important task performed at PEVATeC is the classification of objects. The sensor intelligence must be programmed to recognize traffic signs, people, and situations even under the most difficult conditions. This requires training image recognition software, which is done using artificial intelligence and a combination of real and simulated image data. The system is shown countless variations of images or video sequences so that, with the aid of machine learning, it can be trained to correctly classify objects and situations. High-performance computers carry out this labeling process automatically: in the simulated scenario, unlike in reality, all objects and their positions in the game engine are known, so the objects in the image can be automatically identified, dimensioned, and characterized.
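As a rough illustration of why this labeling can be automated (an assumed pinhole camera model, not the actual PEVATeC pipeline): because the engine knows every object’s 3D bounding box, projecting its corners into the image yields a 2D label without any human annotation.

```python
import numpy as np

def project_points(points_cam, fx, fy, cx, cy):
    """Pinhole projection of Nx3 camera-space points to pixel coordinates."""
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    return np.stack([fx * x / z + cx, fy * y / z + cy], axis=1)

def auto_label(corners_world, world_to_cam, fx, fy, cx, cy, class_name):
    """Turn a known 3D bounding box (8x3 world corners) into a 2D image label."""
    homog = np.hstack([corners_world, np.ones((8, 1))])
    corners_cam = (world_to_cam @ homog.T).T[:, :3]   # 4x4 pose matrix
    if np.any(corners_cam[:, 2] <= 0):                # behind the camera
        return None
    px = project_points(corners_cam, fx, fy, cx, cy)
    u_min, v_min = px.min(axis=0)
    u_max, v_max = px.max(axis=0)
    return {"class": class_name, "bbox": (u_min, v_min, u_max, v_max)}
```

Because such a label comes straight from the engine’s ground truth, it is pixel-accurate and essentially free, which is what makes training on millions of simulated frames practical.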
ADAS test center: infrastructure of a high-performance computing center
Because the virtual testing, training, and validation of new vehicle functions require immense amounts of data to be processed in real time, the future infrastructure of the ADAS test center will resemble that of a high-performance computing center, with a large number of graphics processing units (GPUs) to handle the enormous volume of information. GPUs are particularly suitable for applications involving automated driving because they execute mathematical operations in parallel. They are therefore also an essential part of the PEVATeC concept. In addition, there is storage capacity for a pool of scenarios required for testing and validating different ADAS systems. The determination of valid data is an essential prerequisite for the development of algorithms that bring autonomous driving to the road efficiently and safely. That is exactly what PEVATeC is supposed to deliver: the findings from the simulations help the engineers optimally train the control algorithms of the driver assistance systems, so that the installed ADAS systems can independently master even the most difficult maneuvers and situations.
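As an illustration of that parallelism (a snippet using the open-source CuPy library, which is an assumption about tooling, not a statement about PEVATeC’s software stack): the same array expression that NumPy evaluates on the CPU runs across thousands of GPU cores simply by switching the backend.

```python
import numpy as np
import cupy as cp  # requires an NVIDIA GPU with CUDA

# One million simulated radar returns, processed as a single batch.
ranges = np.random.rand(1_000_000).astype(np.float32) * 200.0  # meters

# Received radar power falls off with the fourth power of range;
# on the GPU, this element-wise expression runs across thousands of cores.
ranges_gpu = cp.asarray(ranges)              # copy to device memory
power_gpu = 1.0 / (ranges_gpu ** 4 + 1e-6)   # small epsilon avoids division by zero
power = cp.asnumpy(power_gpu)                # copy the result back to the host
```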
In brief
Testing advanced driver assistance systems and autonomous driving functions requires billions of test miles. Road tests alone cannot cover them. That is why Porsche Engineering has developed PEVATeC: a computer system specialized in 3D simulations that generates synthetic data to serve as input for the vehicle sensors. The data is so realistic that it cannot be distinguished from reality. This makes it possible to shift many tests from the real world to the virtual one.
Info
Text: Andreas Burkert
Contributors: Dr. Clara Martina Martinez, Frank Sayer
Photos: Mihail Onaca
Text first published in the Porsche Engineering Magazine, Issue 2/2019