Unity launches ‘metaverse’ platforms to simulate robot deployments
by Ben Wodecki
For virtual testing and analysis
Video game development software specialist Unity has launched Unity Simulation Pro and Unity SystemGraph – two new products that enable users to simulate AI and robotics deployments.
Unity Simulation Pro renders project simulations, either locally or in the private cloud, while SystemGraph is a node-based editor designed to emulate robotics systems and photosensors, such as lidar and cameras.
The toolkit “advances our commitment to enable organizations to create, deploy and unlock the value of AI,” said Dave Rhodes, general manager of Digital Twins at Unity.
“With Unity SystemGraph, engineers can much more easily mimic sensors, cameras, and even physical robots in a complex system,” he said.
“Then they can test and train these systems at faster-than-real-time rates with Unity Simulation Pro, achieving optimized performance levels at tremendous cost and time savings.”
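Unity has not published SystemGraph’s API in this announcement, but the core idea of a node-based editor – sensors and processing stages as connected nodes, each pulling data from the nodes upstream of it – can be illustrated with a small toy sketch. The `Node` class and the lidar/clip pipeline below are purely illustrative, not Unity SystemGraph’s actual API:

```python
# Toy illustration of a node-based sensor pipeline: each node evaluates
# its upstream inputs, then applies its own processing function.
# This is an illustrative sketch only, not Unity SystemGraph's API.

class Node:
    def __init__(self, name, fn, inputs=()):
        self.name = name
        self.fn = fn                # this node's processing function
        self.inputs = list(inputs)  # upstream nodes feeding this one

    def evaluate(self):
        # Pull values from upstream nodes, then apply this node's function.
        upstream = [node.evaluate() for node in self.inputs]
        return self.fn(*upstream)

# Wire a mock lidar into a range filter, as a node editor might.
lidar = Node("lidar", lambda: [1.2, 55.0, 3.4, 80.1])        # mock ranges (m)
clip = Node("clip", lambda pts: [p for p in pts if p < 50],  # drop far returns
            inputs=[lidar])

print(clip.evaluate())  # [1.2, 3.4]
```

The appeal of this structure is that a sensor model (the lidar node) can be swapped or retuned without touching the downstream processing graph.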
Enter the metaverse simulation
San Francisco-based Unity started out by developing video game engines. In recent years, it’s pivoted to 3D tools for enterprise applications, including those for making digital twins.
At the recent AI Summit & IoT World Silicon Valley 2021, Danny Lange, SVP of AI at Unity, argued that Facebook’s metaverse rebrand should have looked beyond gaming and video conferencing to industrial applications.
He outlined the idea of an ‘industrial metaverse’ – applying related technologies, for example, to design manufacturing sites before committing to physical deployments.
Unity appears to have followed the direction outlined by Lange – formerly Uber’s head of machine learning – announcing the new simulation tools late last week.
Simulation Pro is “purpose-built” for simulating autonomous systems, Rhodes said in the announcement.
“With this product, we are enabling a future where we’ll see more developers create and evolve autonomous systems, across different industries, at a quicker, safer, and more cost-effective rate.”
Roboticists and engineers can use the platform to test and analyze projects and make optimal design decisions without requiring access to the actual hardware.
The Allen Institute for AI and Carnegie Mellon University were given early access to Unity Simulation Pro as part of a trial program involving testing and training robots on navigation and manipulation tasks.
Using Simulation Pro, training was accelerated from 200 fps on a single GPU to more than 5,000 fps across 32 GPUs.
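The reported figures can be sanity-checked with a quick calculation: 5,000 fps on 32 GPUs is a 25x speedup over the single-GPU baseline, or roughly 78% of ideal linear scaling (the figures below are taken from the announcement; the efficiency metric is this article’s own arithmetic, not Unity’s):

```python
# Sanity check of the reported multi-GPU scaling
# (200 fps on 1 GPU vs. 5,000+ fps on 32 GPUs).

single_gpu_fps = 200
multi_gpu_fps = 5_000
gpu_count = 32

speedup = multi_gpu_fps / single_gpu_fps      # 25x faster than one GPU
ideal_speedup = gpu_count                     # 32x if scaling were linear
scaling_efficiency = speedup / ideal_speedup  # ~0.78 of ideal

print(f"Speedup: {speedup:.0f}x, efficiency: {scaling_efficiency:.0%}")
```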
The partners used the software to create a version of AI2-THOR, a simulation environment with a diverse repository of indoor scenes.
“The experiments that used to take weeks to finish can now finish in just a few days,” said Abhinav Gupta, associate professor at Carnegie Mellon University.
Meanwhile, Volvo has been using SystemGraph in beta to perform high-fidelity sensor modeling for testing its autonomous driving perception software.
“SystemGraph is a flexible and convenient development tool that fits well into our simulation work and boosts our software testing,” said Joachim de Verdier, head of safe vehicle automation at Volvo Cars.
Both of the new tools will be displayed at Unity’s AI Summit later this week.
Unity wasn’t the only company to unveil metaverse-focused products in the past week: rival Nvidia showed off Omniverse Avatar, a platform for generating interactive AI-based representations of people.