Nvidia unveils Isaac Sim on Omniverse Cloud for better robot design

Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.

Nvidia has unveiled its Isaac Sim platform for its Omniverse Cloud, enabling engineers to work together remotely to simulate robots and finish designs more quickly.

Isaac Sim enables global teams to remotely collaborate to build, train, simulate, validate and deploy robots. Nvidia said that its Isaac robotics platform will now be available as Isaac Sim on the Omniverse cloud platform.

“The Isaac Robotics platform is end to end and it really expands our products from cloud to edge,” said Gerard Andrews, product marketing head for robotics at Nvidia, in a press briefing.

Nvidia also said at its GTC online event that Omniverse Cloud will be hosted on Microsoft Azure, broadening access for developers managing AI-based robots.



The company also said that a full lineup of Jetson Orin modules is now available, offering a performance leap for edge AI and robotics applications.

“The world’s largest industries make physical things, but they want to build them digitally,” said Nvidia CEO Jensen Huang during the GTC keynote. “Omniverse is a platform for industrial digitalization that bridges digital and physical.”

Isaac Sim on Omniverse Cloud for Virtual Simulations

Isaac Sim is now on Omniverse Cloud for easier robotics design and simulation.

Building robots in the real world requires creating datasets from scratch, a time-consuming and expensive process that slows deployments.

That’s why developers are turning to synthetic data generation (SDG), pretrained AI models, transfer learning and robotics simulation to drive down costs and accelerate deployment timelines. The Omniverse Cloud platform-as-a-service, which runs on Nvidia OVX servers, puts advanced capabilities into the hands of Azure developers wherever they are.

It enables enterprises to scale robotics simulation workloads, such as SDG, and provides continuous integration and continuous delivery (CI/CD), letting teams collaborate on code changes in a shared repository while working with Isaac Sim.

Isaac Sim is a robotics simulation application and SDG tool that drives photorealistic, physically accurate virtual environments.

Making Isaac Sim accessible in the cloud allows teams to work together more effectively with access to the latest robotics tools and software development kits. Omniverse Cloud on Azure gives enterprises another cloud option, alongside the existing ways of using Isaac Sim in the cloud: self-managed containers, virtual workstations, and fully managed services such as AWS RoboMaker.

And with access to Omniverse Replicator, an SDG engine in Isaac Sim, engineers can build production-quality synthetic datasets to train robust deep-learning perception models. The Replicator tool is included in Omniverse to assist with synthetic data generation, which Andrews said is a superpower for improving model performance across all kinds of robots.
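Setting Replicator's specifics aside, the core idea behind SDG is domain randomization: randomize a simulated scene, render it, and emit exact ground-truth labels with no manual annotation. The sketch below illustrates the principle in plain Python; the scene parameters, label format, and function names are hypothetical stand-ins, not the Replicator API, which does this with photorealistic rendering.

```python
import random

def randomize_scene():
    """Sample a random scene configuration (domain randomization)."""
    return {
        "light_intensity": random.uniform(0.2, 1.0),  # lighting variation
        "object_pose": [random.uniform(-1.0, 1.0) for _ in range(3)],
        "texture_id": random.randrange(100),          # surface appearance
    }

def render_with_labels(scene):
    """Stand-in for a renderer: returns an 'image' plus exact labels.

    Because the scene is synthetic, ground truth (poses, classes,
    segmentation) is known exactly -- no manual annotation needed.
    """
    image = f"render(light={scene['light_intensity']:.2f})"
    labels = {"object_pose": scene["object_pose"], "class": "amr"}
    return image, labels

def generate_dataset(n_samples, seed=0):
    """Produce a labeled synthetic dataset of n_samples scenes."""
    random.seed(seed)
    return [render_with_labels(randomize_scene()) for _ in range(n_samples)]

if __name__ == "__main__":
    dataset = generate_dataset(5)
    print(len(dataset), "labeled samples")
```

The payoff of this pattern is that every rendered sample arrives pre-labeled, which is why SDG drives down the dataset costs described above.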

Amazon uses Omniverse to automate, optimize and plan its autonomous warehouses with digital twin simulations before deployment into the real world. With Isaac Sim, Amazon Robotics is also improving the capabilities of Proteus, its latest autonomous mobile robot (AMR).

This helps the online retail giant fulfill thousands of orders in a cost- and time-efficient manner.

Working with automation company Idealworks, BMW Group uses Isaac Sim in Omniverse to generate synthetic data and run scenarios for testing and training AMRs and factory robots. Nvidia is developing across the AI tools spectrum, from cloud simulation with Isaac Sim to edge computing with the Jetson platform, accelerating robotics adoption across industries.

When it comes to robots, a computer in the robot itself acts as its brain, and that computer is based on Nvidia’s Jetson hardware platform. A cloud computer, in turn, trains the models and creates the synthetic data. Generating synthetic data can take hours because of the large datasets the models require.
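The division of labor described here can be sketched in a few lines: heavy training runs in the cloud, and only the resulting model artifact is shipped to the robot's onboard computer for cheap inference. The "model" below is a trivial threshold classifier, and all names are hypothetical illustrations, not Nvidia APIs.

```python
def train_in_cloud(samples):
    """Cloud side: fit a model on a (synthetic) labeled dataset.

    samples is a list of (sensor_reading, label) pairs; the 'training'
    here just averages the positive readings into a decision boundary.
    """
    positives = [x for x, label in samples if label == 1]
    threshold = sum(positives) / len(positives)
    return {"threshold": threshold}  # exported model artifact

def edge_inference(model, reading):
    """Edge side (robot brain): lightweight inference with the artifact."""
    return 1 if reading >= model["threshold"] else 0

if __name__ == "__main__":
    # Hypothetical training data: (sensor_reading, label)
    data = [(0.9, 1), (0.8, 1), (0.2, 0), (0.1, 0)]
    model = train_in_cloud(data)       # expensive step, done once, off-robot
    print(edge_inference(model, 0.95))  # cheap step, runs on the robot
```

The design point is that only the small artifact crosses the cloud-to-edge boundary, not the training data or the training compute.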

“When we say cloud to the edge, we really make a complete platform from the robot brain and the software that runs on that local robot as well as a big brain in the sky,” Andrews said.

Nvidia has moved its Omniverse platform, a metaverse for engineers, into the cloud so anyone can access it from any device, like an iPad. By putting Isaac Sim on the cloud, large numbers of engineers can work together to design a robot and simulate it so that it functions properly.

“If the computer is in the sky that is actually in the cloud that’s doing the simulation, then you can access the simulation results on any compute device,” Andrews said.

In addition to its own Omniverse Cloud, Nvidia also offers a cloud option through Amazon Web Services.

Nvidia Jetson Orin Nano Developer Kit

Nvidia Orin Nano Developer Kit.

Nvidia also announced a new Nvidia Jetson Orin Nano Developer Kit, which developers can use to design and deploy entry-level AI-powered robots, smart drones and intelligent vision systems.

The kit simplifies getting started with the Nvidia Jetson Orin Nano series hardware. It delivers up to 40 trillion operations per second (TOPS) and consists of a Jetson Orin Nano 8 GB module and a reference carrier board that can accommodate all Nvidia Jetson Orin Nano and Nvidia Jetson Orin NX modules.

As such, it’s a platform for prototyping next-generation edge AI products. The Jetson Orin Nano 8 GB module features an Nvidia Ampere architecture GPU with 1024 CUDA cores, 32 third-generation Tensor Cores, and a 6-core Arm CPU, enabling multiple concurrent AI application pipelines and high-performance inference.

The developer kit carrier board boasts a wide array of connectors, including two MIPI CSI connectors supporting camera modules with up to four lanes, enabling higher resolution and frame rates than before.

The prior-generation Jetson Nano Developer Kit made AI accessible to everyone. The new Jetson Orin Nano Developer Kit raises the bar for entry-level AI development with 80 times the performance, enabling developers to run any kind of modern AI model, including transformer and advanced robotics models.

On top of that, Jetson Orin Nano provides 5.4 times the CUDA compute, 6.6 times the CPU performance, and 50 times the performance per watt. It supports workloads such as detecting people in a video scene, and it is targeted at edge applications such as smart retail, smart city intersections, and industrial automation.

Nvidia also said that Nvidia Isaac ROS, a collection of hardware-accelerated packages, makes it easier for ROS 2 developers to build high-performance solutions on the Jetson Orin Nano Developer Kit.

The Jetson Orin Nano Developer Kit is now available for preorder at $499 and will start shipping in April. Multiple variations are available, and all are shipping shortly. At the top of the lineup, the Jetson AGX Orin delivers 275 TOPS for advanced autonomous machines.

More than 1 million developers and over 6,000 customers have chosen the Nvidia Jetson platform, including Amazon Web Services, Canon, Cisco, Hyundai Robotics, JD.com, John Deere, Komatsu, Medtronic, Meituan, Microsoft Azure, Teradyne and TK Elevator.

Companies adopting the new Orin-based modules include Hyundai Doosan Infracore, Robotis, Seyeon Tech, Skydio, Trimble, Verdant and Zipline. More than 70 Jetson ecosystem partners are offering Orin-based solutions, with a wide range of support from hardware, AI software and application design services to sensors, connectivity and developer tools.

Metropolis, Nvidia’s platform for connecting a city’s infrastructure, provides visual data to robots that are operating in an environment.

Originally appeared on: TheSpuzz