How Rgo Robotics aims to improve the vision of mobile robots


Rgo Robotics announced that it has exited stealth mode with $20 million in funding. The startup intends to enable mobile robots to operate autonomously by understanding their environment through artificial-perception technology, relieving robot manufacturers of the burden of developing this complex technology themselves.

Rgo Robotics says that it has developed an AI-powered perception engine that allows mobile robots to understand complex surroundings by achieving purportedly human-level perception. Rgo says it has tested its technology in “challenging” indoor and outdoor field trials.

The startup reports that it has achieved design wins worth more than $10 million with leading global robot OEMs across multiple verticals. It envisions applications in logistics, manufacturing, last-mile delivery, service, agriculture and consumer robotics.

Rgo’s vision for mobile robotics

“Most mobile robots today are still blind and unable to navigate intelligently in dynamic and complex environments, and we see firsthand how hard it is for machine and robot manufacturers to develop basic visual perception on their own,” said Amir Bousani, CEO and cofounder of Rgo Robotics. “Our technology changes this, leveraging the most advanced AI and vision technologies.”

The goal of the perception engine, Bousani says, is to allow mobile machines to understand the world around them so they can move autonomously, safely and intelligently in any environment. “We call this intelligent autonomy.”

Rgo Robotics exited stealth mode in January with a Series A round of more than $20 million, which it aims to use to expand its R&D and commercial teams. The startup was founded in 2018. Rgo further said it was awarded Robotics Business Review’s RBR50 Robotics Innovation Award.

Inside the perception engine

Rgo’s perception engine consists of both software and hardware components. The hardware is designed as a low-cost, low-power reference design, while the software is available as an SDK. The data for the perception engine, through which the robot learns its environment, is provided over an API. The robot control system takes care of path planning and autonomous behaviors, according to the company.
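Rgo has not published its SDK or API, so the details are not public. As a rough illustration of the division of labor described above, here is a minimal sketch in which a perception layer supplies pose estimates over an API and the control system uses them for path planning; the names `Pose`, `PerceptionClient` and `distance_to_goal` are assumptions, not Rgo’s actual interfaces.

```python
# Hypothetical sketch: a perception engine exposes pose estimates over an
# API, and the robot control system consumes them for path planning.
# All class and function names here are illustrative, not Rgo's real SDK.
from dataclasses import dataclass


@dataclass
class Pose:
    """Estimated robot position (meters) and heading (radians)."""
    x: float
    y: float
    theta: float


class PerceptionClient:
    """Stand-in for a perception-engine API that serves pose estimates."""

    def __init__(self) -> None:
        # A real client would connect to the perception engine; this stub
        # simply holds a fixed estimate at the map origin.
        self._pose = Pose(0.0, 0.0, 0.0)

    def get_pose(self) -> Pose:
        """Return the latest pose estimate from the perception layer."""
        return self._pose


def distance_to_goal(pose: Pose, goal: tuple) -> float:
    """Path-planning input: Euclidean distance from current pose to goal."""
    return ((goal[0] - pose.x) ** 2 + (goal[1] - pose.y) ** 2) ** 0.5


if __name__ == "__main__":
    client = PerceptionClient()
    # Control loop would poll the perception API each cycle, e.g.:
    print(round(distance_to_goal(client.get_pose(), (3.0, 4.0)), 2))  # 5.0
```

The point of the split is that the perception layer answers “where am I and what is around me,” while planning and autonomous behaviors stay in the robot maker’s own control stack.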

Originally appeared on: TheSpuzz