The chipmaker further expands its virtual collaboration platform

Ben Wodecki, Jr. Editor

August 9, 2022

3 Min Read

Nvidia has unveiled Omniverse Avatar Cloud Engine (ACE): a suite of cloud-native AI models and services designed to make it easier to build and customize virtual assistants.

ACE lets businesses instantly access the computing power needed to create and deploy assistants and avatars that understand multiple languages, respond to speech prompts and interact with the environment around them.

“Our industry has been on a decades-long journey teaching computers to communicate and carry out complex tasks with ease that humans take for granted,” said Rev Lebaredian, vice president of Omniverse and simulation technology at Nvidia, in a statement.

“NVIDIA ACE brings this within reach. ACE combines many sophisticated AI technologies, allowing developers to create digital assistants that are on a path to pass the Turing test.” The test appraises a computer's ability to exhibit intelligence indistinguishable from a human.

Omniverse: Transforming interactions

Omniverse is Nvidia’s virtual environment platform. It was initially titled Holodeck, named after the VR environment room from Star Trek: The Next Generation. Omniverse can be used to create 3D environments for production teams to work together without the need for in-person meetings or sizable file exchanges.

The tech giant expanded Omniverse into an enterprise service last April, allowing industrial companies to create digital twins of the machines and tools used in real-world facilities, among other applications.

Further expansions to Omniverse saw Nvidia showcase Project Tokkio late last year. Essentially, Tokkio allows users to be replicated as avatars for applications like customer support. The unveiling back in November saw Jensen Huang, co-founder and CEO of Nvidia, recreated as a toy-like replica of himself.

And around the same time, the company released Riva Custom Voice, an AI software platform that can create human-like voices at speed.

ACE, Nvidia’s latest Omniverse unveiling, is built atop the company’s Unified Compute Framework, which provides access to software tools and APIs like Riva and Tokkio, as well as the Maxine SDK for video conferencing.

“The assistants and avatars ACE enables will transform interactions in gaming, entertainment, banking, transportation and hospitality,” according to Nvidia.

Toolkits, 3D cinematics and facial animations

Alongside ACE, Nvidia also published a new range of developer frameworks, tools, apps and plugins for Omniverse.

The expansion of Omniverse includes several AI-powered tools and features so developers can build virtual worlds and connect with 3D applications like Unity, PTC Creo and the Siemens Xcelerator platform.

Notable updates include a toolkit for building native Omniverse extensions, Audio2Face, an AI tool that creates facial animations directly from an audio file, and Machinima, an app designed to build 3D cinematics.
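
For context, native Omniverse extensions are typically written in Python against the Kit SDK's omni.ext interface. The following is a minimal sketch of what such an extension looks like; the class name, extension identifier and printed messages are illustrative, not taken from Nvidia's announcement.

import omni.ext


class HelloOmniverseExtension(omni.ext.IExt):
    """A bare-bones Omniverse Kit extension (illustrative only)."""

    def on_startup(self, ext_id):
        # Called by Kit when the extension is enabled.
        print(f"[hello.omniverse] Extension started: {ext_id}")

    def on_shutdown(self):
        # Called by Kit when the extension is disabled or the app closes.
        print("[hello.omniverse] Extension shut down")

In practice, an extension is packaged alongside a small config file (extension.toml in Kit's convention) describing its metadata and dependencies, which the extension manager reads when loading it.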

Also unveiled was Omniverse DeepSearch, designed to help teams use AI to intuitively and accurately search through massive, untagged databases of 3D visual assets using natural language.

“The metaverse is a multitrillion-dollar opportunity that organizations know they can’t ignore, but many struggle to see a clear path forward for how to engage with it,” said Lebaredian.

 “Nvidia Omniverse closes the gap between the physical and virtual worlds, and these new tools, technologies and collaborations make it possible to leap into the 3D Internet today.”

Some 700 companies are already using Omniverse. Among them is computer hardware manufacturer Supermicro.

AI Business spoke with Supermicro's senior solutions manager, Alok K Srivastav, at the recent AI Summit London to discuss technology's role in Industry 4.0.

An open USD initiative

The company also unveiled a broad initiative around Universal Scene Description (USD), the open source and extensible language of 3D worlds.

The initiative was made in partnership with animation company Pixar, as well as Adobe, Autodesk and Siemens, among others.

The initiative will see Nvidia pursue a multi-year roadmap to expand USD's capabilities beyond visual effects toward more industrial metaverse applications.

The company also announced the development of an open USD Compatibility Testing and Certification Suite that developers can freely use to test their USD builds and certify that they produce an expected result.
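
For readers unfamiliar with USD, the following is a brief sketch of what authoring a simple scene with Pixar's open source USD Python API looks like; the file name and prim paths are illustrative.

from pxr import Usd, UsdGeom

# Create a new USD stage (the scene file) -- path is illustrative.
stage = Usd.Stage.CreateNew("example_scene.usda")

# Define a root transform and a simple cube primitive beneath it.
UsdGeom.Xform.Define(stage, "/World")
cube = UsdGeom.Cube.Define(stage, "/World/Cube")
cube.GetSizeAttr().Set(2.0)

# Save the layer so other USD-aware tools can open the scene.
stage.GetRootLayer().Save()

Because the saved .usda file is a plain, open format, any USD-aware application can open and extend the same scene, which is the interoperability the initiative aims to certify.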

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
