Nvidia CEO Jensen Huang said today that the Omniverse virtual simulation and its tools will be available in the cloud so that developers can use it on just about any computer.
Huang made the announcement during the virtual Nvidia GTC event today. And in an interview with VentureBeat, Omniverse platform vice president Richard Kerris said that the Omniverse ecosystem has expanded 10 times in terms of the companies and creators participating in it.
Kerris said that more than 150,000 individuals have downloaded Nvidia Omniverse as a tool to design real-time 3D simulated worlds. Those simulations are being used in everything from games to industrial “digital twins,” where designers test a concept in a virtual design before committing to physical designs. BMW made a digital twin of a car factory before building it in the real world.
The Omniverse is Nvidia’s leading tool for building the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. Kerris said that the metaverse is the network for the next generation of the web.
“And we’re focused on the business and industrial side of virtual worlds, which we’re seeing tremendous feedback already from our customers, and use cases that are applicable today,” he said. “We’re focused on those things like visualization, simulation, digital twins, and collaboration. But there are other people using the platform to create content for virtual worlds in other areas, whether it’s entertainment and games.”
The power of digital twins
The big deal about the digital twins is the feedback loop, Kerris said. The instruments and sensors in the real factory can collect data and feed it back into the Omniverse virtual factory simulation, which can then become more accurate.
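The feedback loop Kerris describes can be sketched in a few lines. This is an illustrative toy, not Nvidia's API: a hypothetical `DigitalTwin` class compares its simulated prediction with a real sensor reading and corrects its internal model, so the simulation drifts closer to the physical factory over time.

```python
# Toy illustration of a digital-twin feedback loop (hypothetical, not the Omniverse API).
# The twin predicts a value (e.g., conveyor throughput), compares it with the
# real sensor reading, and corrects its model so future predictions improve.

class DigitalTwin:
    def __init__(self, model_param: float):
        self.model_param = model_param  # simplified stand-in for a simulation model

    def predict(self) -> float:
        return self.model_param  # the twin's simulated throughput

    def ingest_sensor_reading(self, reading: float, learning_rate: float = 0.5) -> None:
        # Feedback loop: nudge the model toward the observed reality.
        error = reading - self.predict()
        self.model_param += learning_rate * error

twin = DigitalTwin(model_param=100.0)   # initial guess: 100 units/hour
for reading in [120.0, 118.0, 122.0]:   # readings from the physical factory
    twin.ingest_sensor_reading(reading)

print(round(twin.predict(), 1))  # prediction has moved toward the sensor data
```

In a real deployment, the "model" would be a full physics simulation and the corrections would come from calibration or machine learning, but the shape of the loop is the same: measure, compare, update.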
“More than ever we believe that virtual worlds are required for the next era of AI,” Kerris said. “So whether it’s training robots using synthetic data generation, autonomous driving, or digital twins of factories, cities, and even the grand project around Earth 2, Omniverse is a tool that goes from the creation to the operation of these virtual worlds.”
As for new examples, Kerris said that Amazon is showing off some amazing robotics, and Pepsico is showing off its warehouse management. There are also updates for Toy Jensen, the avatar of Huang that shows off ways to put human characters in the Omniverse.
“The idea of digital twins is for us such a big part of the next generation of the Industrial Revolution, and it holds true for things like products to factories to cities to the entire Earth,” Kerris said.
Kerris said that lots of developers have offered feedback that they wanted to use the cloud to enable them to use Omniverse on a wide variety of hardware, such as low-end laptops, smartphones, or normal desktops rather than high-end workstations. So Nvidia used the same tech it uses to provide its GeForce Now online gaming service to make Omniverse available in the cloud. Kerris said that makes Omniverse even more accessible to creators, developers, designers, engineers, and researchers worldwide.
“By having all of Omniverse in the cloud, it becomes available to anybody, no matter what kind of platform you’re on, whether you’re on a Chromebook, a Mac, or a tablet,” Kerris said. “You’ll be able to tap in and stream Omniverse right from GeForce Now.”
The first part of the Omniverse journey is design, collaboration, and content creation, whether that means content for use in virtual worlds or the worlds themselves, such as factories, robots, cars, and more, Kerris said.
“And then the second part of that journey comes to the digital twin. Once the first part is complete, the next stage of its life begins. So whether you’re building a building, once you’re done with the building, then the digital twin life begins, monitoring all the things that are happening in the building using sensors. And that holds true in so many other areas, like robots in factories, retail, digital humans, things like that. Our customers give us that feedback.”
Kerris said there has been strong demand for Omniverse from customers that are using non-RTX systems, whether they’re Mac customers or others.
“They want to get their hands on Omniverse, and for the first time they will be able to do that,” Kerris said. “This will be in a beta and early access for a while, but we wanted to reveal this because it is going to be such a game changer.”
Right now there are nine regions around the world where Omniverse Cloud will be available.
Nvidia Omniverse Enterprise is helping leading companies enhance their pipelines and creative workflows. New Omniverse Enterprise customers include Amazon, DB Netze, DNEG, Kroger, Lowe’s, and more. There are more than 700 enterprise companies using Omniverse.
Siemens is using Nvidia’s Omniverse and Modulus to create digital twins for its wind farms. Siemens will simulate its wind farms with physics-informed machine learning and run them 4,000 times faster with the latest Nvidia hardware.
Virtual representations of Siemens Gamesa’s wind farms will be built using Omniverse and Modulus, which together comprise Nvidia’s digital twin platform for scientific computing. The platform will help Siemens Gamesa achieve quicker calculations to optimize wind farm layouts, which is expected to lead to farms capable of producing up to 20 percent more power than previous designs.
Kerris said the excitement around the Omniverse is helping to push the pro workstation business to new heights.
Nvidia GTC 2022 takes place virtually from March 21 to March 24, featuring 900 sessions and 1,600 speakers on a variety of technology topics including deep learning, Omniverse, data science, robotics, networking, and graphics.
Leaders from hundreds of organizations will present, including Amazon, Autodesk, Bloomberg, Cisco, DeepMind, Epic Games, Flipkart, Google Brain, Lockheed Martin, Mercedes-Benz, Microsoft, NASA, NFL, Pfizer, Snap, Sony, Stanford University, U.S. Air Force, U.S. Congress, Visa, VMware, Walt Disney, and Zoom.
Kerris said the ecosystem had grown over ten times since the fall, with a lot of growth in virtual simulations of digital twins, robotics, designs, and content creation software. Systems integrators and rendering companies are now supporting the Omniverse platform.
“Adobe has just recently announced an update to its own connections to Omniverse for Substance 3D Painter and the materials library,” he said.
The ten-fold number refers to connections that have been built to the Omniverse, and the Adobe example is just one such connection.
“With sensor models, we have uncovered hundreds of opportunities for connections into the platform, whether it’s cameras, microphones, sensors, LiDAR, all kinds of things,” Kerris said. “And we’re starting to see that that area grow tremendously as well. And now with our asset and material libraries, we have hundreds of thousands of assets available in Omniverse, right out of the gate.”
Nvidia has been releasing a lot of updates for Omniverse, such as its interactive viewer, annotation, markup, and presentation tools. Companies such as Deloitte are building teams to create add-ons and customizations for Omniverse and its enterprise marketplace.
“We’ve been doing other things to help democratize the complexity of programming,” Kerris said. The Omniverse XR version will be available as a beta in early April.
Kerris said the size of his own team has doubled, and that’s an indicator of Nvidia’s growing investment in the Omniverse.
“We’re seeing major customers making purchases and we have an incredible pipeline of things coming,” Kerris said. “Jensen’s view is it is time to double down even more.”