Ansys collaborates with Nvidia to improve sensor simulation for autonomous cars


Ansys is working with Nvidia to deliver better sensor simulation for self-driving cars and other autonomous vehicles.

Pittsburgh-based Ansys is a simulation software company that has integrated its Ansys AVxcelerate Sensors tool into Nvidia Drive Sim, a scenario-based autonomous vehicle (AV) simulator powered by Nvidia’s Omniverse. The integration aims to improve the development and validation of advanced driver assistance systems (ADAS) and AV perception systems through high-fidelity sensor simulation.

Ansys will show off the technology, which essentially handles the tough physics calculations in a scene involving multiple moving objects, at the CES 2024 tech trade show in Las Vegas next week.

The incorporation of Ansys AVxcelerate Sensors into Nvidia Drive Sim gives carmakers access to highly accurate sensor simulation outputs. These outputs are instrumental in training and validating ADAS/AV perception systems, helping ensure that autonomous vehicles can navigate roadways safely and reliably, even in critical edge cases.

Engineers working on AV technology face the monumental challenge of ensuring the safety and reliability of sensor suites and software in real-world driving scenarios. Ansys and Nvidia jointly address this challenge by bridging the gap between reality and simulation. The collaboration leverages Ansys’ physics solvers for camera, lidar, radar, and thermal camera sensors within Nvidia Drive Sim’s realistic and scalable 3D environments for scenario generation.

“Integrating Ansys AVxcelerate Sensors simulation with Drive Sim offers developers greater flexibility to develop, test, and validate their autonomous vehicle software,” said Zvi Greenstein, vice president of autonomous vehicle infrastructure at Nvidia, in a statement.

Nvidia said that an open ecosystem approach is foundational to the collaboration. Its Omniverse platform, a metaverse for engineers, enables users to develop OpenUSD-based 3D workflows. OpenUSD’s flexibility and modularity let developers build scalable simulations, and the format also acts as a data factory for AI model training: developers can harness Omniverse to build custom synthetic data generation pipelines and generate annotated data to train on-board computer vision models. The connection with Ansys AVxcelerate Sensors, built with application programming interfaces (APIs), facilitates a seamless integration of Ansys physics solvers into Nvidia’s 3D virtual world.
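The announcement does not include code, but for readers unfamiliar with OpenUSD, a minimal authoring sketch can make the “OpenUSD-based 3D workflows” idea concrete. The example below uses the open-source pxr Python bindings (the usd-core package) to build a tiny USD stage containing a placeholder vehicle and a camera prim standing in for a mounted sensor. The file name, prim paths, and numeric values are illustrative assumptions, and the sketch does not call the Drive Sim or AVxcelerate APIs, which are not shown here.

```python
# Minimal, illustrative OpenUSD authoring sketch (not the Drive Sim or AVxcelerate API).
# Requires the open-source USD Python bindings: pip install usd-core
from pxr import Usd, UsdGeom, Gf

# Create a new USD stage; "scene.usda" is a hypothetical file name.
stage = Usd.Stage.CreateNew("scene.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

# Root transform for the scenario.
world = UsdGeom.Xform.Define(stage, "/World")

# A placeholder "vehicle" represented as a simple cube prim.
vehicle = UsdGeom.Cube.Define(stage, "/World/Vehicle")
vehicle.GetSizeAttr().Set(2.0)
UsdGeom.XformCommonAPI(vehicle.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.0, 1.0))

# A camera prim standing in for a simulated front-facing sensor.
camera = UsdGeom.Camera.Define(stage, "/World/Vehicle/FrontCamera")
camera.GetFocalLengthAttr().Set(24.0)
UsdGeom.XformCommonAPI(camera.GetPrim()).SetTranslate(Gf.Vec3d(0.0, -5.0, 2.0))

# Write the stage to disk so other OpenUSD tools can consume it.
stage.GetRootLayer().Save()
```

In a real pipeline, stages like this would be far richer and would typically be generated and consumed by Drive Sim’s scenario tooling rather than authored by hand.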

“Perception is crucial for AV systems, and it requires validation through real-world data for the AI to make smart, safe decisions,” said Walt Hearn, senior vice president of worldwide sales and customer excellence at Ansys, in a statement. “Combining Ansys AVxcelerate Sensors with Nvidia Drive Sim, powered by Omniverse, provides a rich playground for developers to test and validate critical environmental interactions without limitations, paving the way for OEMs to accelerate AV technology development.”

Ansys has made simulation software for over 50 years to help companies bridge the gap between design and reality, facilitating advances in fields ranging from sustainable transportation to life-saving medical devices.

Originally appeared on: TheSpuzz
