
Validating the Safety of the Next Generation of Autonomous Driving Systems

Truly autonomous driving systems completely remove the driver from the control loop. Companies developing these systems are in most cases using machine learning/deep learning software to respond intelligently to a potentially infinite variety of driving scenarios without being explicitly programmed for each one. After being trained to recognize a particular pattern, such as a pedestrian walking into the street, the software can generalize that capability to instances it has never encountered before, such as a small child wearing a baseball cap. A deep learning system's pattern recognition ability comes from training algorithms that tune the weights of a neural network, and it can far exceed what traditional, rules-based software can achieve. Machine learning/deep learning has also been applied to other challenges in developing autonomous driving systems, such as motion planning and motion execution.
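
To make the idea concrete, the following minimal sketch (in Python, using the PyTorch library) shows the train-then-generalize pattern described above. The tiny network, the random placeholder data and the pedestrian/no-pedestrian labels are illustrative assumptions, not any vendor's production code.

# A minimal sketch of training a small convolutional network on labelled
# images ("pedestrian" vs. "no pedestrian") and then querying it on an image
# it has never seen. The data here is random placeholder tensors; a real
# pipeline would load camera frames and ground-truth labels.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 2),            # two classes: pedestrian / no pedestrian
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder training data: 64 RGB images of size 64x64 with binary labels.
images = torch.randn(64, 3, 64, 64)
labels = torch.randint(0, 2, (64,))

for epoch in range(5):                      # tiny illustrative training loop
    logits = model(images)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The trained model can now be applied to a frame it was never trained on.
new_frame = torch.randn(1, 3, 64, 64)
prediction = model(new_frame).argmax(dim=1)  # 0 = no pedestrian, 1 = pedestrian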

Man vs. machine

Concerns arise from the fact that machine learning/deep learning's powerful pattern recognition process is not something humans can intuitively understand. Furthermore, machine learning/deep learning systems lack the documented requirements, architecture and detailed design against which conventional safety-critical software is validated. So how can we demonstrate that these systems achieve the extremely high levels of safety required to release them for general sale to the public? Clearly, road testing is part of the answer. But road testing consists mostly of routine situations that are not challenging for either human or automated drivers. By some estimates, it would take billions of miles of road testing to validate an autonomous driving system, and each time the code changed, the testing might well have to start over from scratch.
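
The back-of-envelope arithmetic behind such mileage estimates can be sketched in a few lines. The human fatality rate assumed here (roughly one per 100 million U.S. vehicle miles) and the "rule of three" confidence bound are illustrative assumptions, not figures from the companies mentioned in this article.

# Rough arithmetic behind the "hundreds of millions to billions of miles"
# estimates. Assumes fatal failures are rare, independent events; the human
# fatality rate below is an approximation for illustration.
human_fatality_rate = 1.0 / 100_000_000     # fatalities per mile driven

# "Rule of three": if a fleet drives m miles with zero fatalities, the 95%
# upper confidence bound on its true fatality rate is about 3/m. To claim the
# fleet is at least as safe as human drivers therefore requires roughly:
miles_for_parity = 3.0 / human_fatality_rate
print(f"~{miles_for_parity:,.0f} fatality-free miles for parity at 95% confidence")

# Demonstrating that the system is measurably better than human drivers, once
# some failures inevitably occur, requires far more exposure - which is where
# the billions-of-miles estimates come from. And a code change can invalidate
# the miles already accumulated.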

Closed-loop simulation

What’s necessary is a much faster and more efficient method of evaluating the performance of autonomous driving systems, particularly in difficult scenarios that tax their perception and other systems to the limit. These scenarios might include conditions that limit visibility, such as fog or rain; the presence of many other vehicles, pedestrians and other objects found in a crowded downtown streetscape; or situations that alter driving conditions, such as wet or icy pavement. Developers of autonomous driving systems are therefore taking a close look at closed-loop simulation, in which the real autonomous driving software drives a virtual vehicle through a realistic virtual world, so that the complete system can be exercised end to end.
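
The sketch below illustrates, in generic terms, what "closed loop" means: the driving software under test receives simulated sensor data, issues control commands, and those commands change the simulated world that produces the next sensor frame. Every class and number in it is an illustrative stand-in, not the architecture of any commercial tool.

# A generic closed-loop simulation skeleton: sensor model -> driving software
# -> vehicle dynamics -> back to the sensor model on the next time step.
from dataclasses import dataclass

@dataclass
class VehicleState:
    position: float = 0.0      # metres along a straight test track
    speed: float = 0.0         # m/s

def sensor_model(state: VehicleState, pedestrian_position: float) -> dict:
    # Idealised "camera": reports the distance to a pedestrian ahead, if visible.
    gap = pedestrian_position - state.position
    return {"pedestrian_distance": gap if 0.0 < gap < 60.0 else None}

def driving_software(observation: dict, state: VehicleState) -> float:
    # Placeholder for the real stack under test: brake if a pedestrian is
    # close, otherwise accelerate toward 15 m/s.
    distance = observation["pedestrian_distance"]
    if distance is not None and distance < 25.0:
        return -5.0                      # m/s^2, hard braking
    return 1.0 if state.speed < 15.0 else 0.0

def vehicle_dynamics(state: VehicleState, accel: float, dt: float) -> VehicleState:
    speed = max(0.0, state.speed + accel * dt)
    return VehicleState(state.position + speed * dt, speed)

# Closed-loop execution of one scenario: a pedestrian standing 120 m ahead.
state, pedestrian, dt = VehicleState(), 120.0, 0.1
for step in range(600):                  # 60 simulated seconds
    obs = sensor_model(state, pedestrian)
    accel = driving_software(obs, state)
    state = vehicle_dynamics(state, accel, dt)

print(f"Final gap to pedestrian: {pedestrian - state.position:.1f} m, speed {state.speed:.1f} m/s")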

ANSYS provides a closed-loop simulation platform that the company says can verify the safety of complex autonomous driving systems by integrating physics, electronics, embedded systems and software simulation. For example, ANSYS SPEOS software provides physics-based simulation of optical sensors, including cameras for visible and infrared detection, and lidars for a 360-degree 3D view of the driving environment. SPEOS takes optical lenses, mechanics, sensors, materials and light properties into account while merging images obtained by multiple cameras. “Users can arrange objects and events in the simulation, simulate sensor readings, determine whether perception properly detected the objects and events, and whether the autonomous driving system responded properly,” said Bernard Dion, Chief Technical Officer, Systems Business Unit at ANSYS.

Simulation of adaptive cruise control detecting a pedestrian. Image courtesy of Ansys.
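
One way to picture the evaluation step Dion describes is a simple comparison of the simulation's known ground truth against what perception reported. The matching rule and data layout below are hypothetical stand-ins for illustration, not the SPEOS interface.

# Score perception output against the objects the simulation actually placed
# in the scene. The one-metre range tolerance and the dictionary format are
# illustrative assumptions.
def score_perception(ground_truth: list, detections: list,
                     max_error_m: float = 1.0) -> dict:
    found = 0
    for obj in ground_truth:
        if any(det["label"] == obj["label"]
               and abs(det["range_m"] - obj["range_m"]) <= max_error_m
               for det in detections):
            found += 1
    return {"detected": found,
            "missed": len(ground_truth) - found,
            "false_alarms": max(0, len(detections) - found)}

# Example: the scenario placed one pedestrian and one cyclist; perception
# reported the pedestrian slightly off in range plus a phantom vehicle.
truth = [{"label": "pedestrian", "range_m": 32.0},
         {"label": "cyclist", "range_m": 55.0}]
reported = [{"label": "pedestrian", "range_m": 32.4},
            {"label": "vehicle", "range_m": 80.0}]
print(score_perception(truth, reported))   # {'detected': 1, 'missed': 1, 'false_alarms': 1}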

Virtual reality for the win

ANSYS’ VRXPERIENCE closed-loop simulation solution addresses the challenge of proving the safety of the perception function through large-scale exposure to difficult driving scenarios. VRXPERIENCE executes driving scenarios using reduced-order models derived from physics-based simulations of any combination of sensors. By replicating a real-world physical environment in 3D and creating a real-time, virtual-reality–based driving experience, VRXPERIENCE allows product developers to experience an autonomous vehicle under many daytime and nighttime driving scenarios with different road and weather conditions. “This approach makes it possible to drive an autonomous vehicle at an early stage of development on digital test tracks with realistic traffic conditions, including various weather conditions, oncoming vehicles, and pedestrian scenarios to anticipate the vehicle’s reaction to any critical situation,” Dion said.
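
The scenario-sweep idea can be sketched as a loop over combinations of lighting, weather, road condition and traffic actors. The condition names and the run_scenario() hook below are placeholders standing in for whatever scenario engine is actually used, not the VRXPERIENCE API.

# Enumerate every combination of driving conditions and run the software
# under test through each one, collecting the scenarios where it fails.
import itertools

lighting = ["day", "dusk", "night"]
weather  = ["clear", "rain", "fog", "snow"]
road     = ["dry", "wet", "icy"]
actors   = ["none", "oncoming_vehicle", "crossing_pedestrian"]

def run_scenario(config: dict) -> bool:
    # Placeholder: a real implementation would launch one closed-loop
    # simulation run and return whether the vehicle behaved safely
    # (no collision, kept its lane, respected speed limits, and so on).
    return True

failures = []
for combo in itertools.product(lighting, weather, road, actors):
    config = dict(zip(["lighting", "weather", "road", "actors"], combo))
    if not run_scenario(config):
        failures.append(config)

total = len(lighting) * len(weather) * len(road) * len(actors)
print(f"{len(failures)} failing scenarios out of {total} combinations")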

Safety first

The greatest remaining challenge in the large-scale deployment of autonomous driving systems is testing and debugging machine learning and deep learning algorithms that operate without the defined requirements and design documentation traditionally used to ensure robustness and safety. Today’s new generation of simulation solutions has the potential to contribute substantially to validating the safety and reliability of autonomous vehicles, helping to reduce time to market by cutting the amount of road testing required to demonstrate a vehicle’s safety. Most important, realistic physics-based simulation in real time enables automotive systems engineers to create the safest possible autonomous driving systems.