
Unity’s Head of Create: AI Will Let Gamers Talk Naturally With Characters

Marc Whitten also addressed a user concern over copyright: Unity is focused on sourcing content ethically

Deborah Yao

June 30, 2023


Unity Software, the market share leader in game development, is making a big push toward incorporating AI-enabled capabilities throughout its products. The company just announced a closed beta of two AI platforms: Unity Muse and Unity Sentis.

AI Business caught up with Marc Whitten, president of Create at Unity, to discuss the company’s big plans for AI and also address a concern voiced by several users about potential copyright issues arising from the use of generative AI.

What follows is an edited version of that conversation.

AI Business: You’re a big name in games but perhaps not everyone knows what your company does. What does Unity do?

Marc Whitten: Unity builds the services and tools that are behind a huge percentage of the games in the world, but also a bunch of other real-time 3D experiences. So it's both games and non-game stuff. If you're a manufacturing company trying to think about how you need real-time interactive use of your 3D assets, you use tools like Unity. We build tools for cars, but also all the games you know − well over 50% of games in the world are based on the Unity engine and our toolchain.

AI Business: What exactly do you provide?

Whitten: There's a couple of really critical pieces. First, we provide the editor − the tool that you actually spend time in when you're building a game or other real time 3D experience. Then on top of that, we provide what's known as the Unity Runtime, or the code that is delivered with your game that goes to each individual device. … We have a common runtime so that you can work on the fun part of the game and we'll take care of the platform details to make it easy for you to reach your players. And then we also have lots of services to help you be successful with your players, to help you find players, to keep them engaged and things like that.

AI Business: Tell us more about the two AI platforms that you've announced.

Whitten: It's very clear to us that pretty much every piece of the game is going to be touched by AI technology – from when it's created to … when it's played, what we would call Create time and Runtime. As we thought about … how to make sure we can provide value there, it's been clear to us that we actually needed to develop an AI platform for each of those areas.

We've created this platform known as Unity Muse, which is our AI platform for Creation time tools − all the work we do in the editor and making it easy for teams to build using AI technology − and then Unity Sentis, which is all about when the player is touching the screen or using the game pad, and you want to do AI or inference at runtime in every frame of the game on device.

AI Business: Walk me through Muse, if you will.

Whitten: Many different tools are being used by people to generate content and assets using AI. Today, what we are focused on is how do you integrate AI fundamentally in the workflow of building games and other real time 3D experiences? What we see with that is people need to be able to ask questions and understand where they can get help for such things as code snippets but also as they're creating content and assets, which is a huge part of building games.

They need to move from broad generation – like, ‘Hey, give me some variations of this idea for an asset’ − to fine-grained controls, so that they can tailor it specifically to the game. And they need the assets to be at the right quality level to implement in a game, which is not just the texture but also things like a normal map, at the resolution they need for a particular material that's going to be used inside of their game, etc.

We're taking generative tools and putting them in (the Muse) platform. It's available both on the web and inside the editor, so that people can use it at whatever stage they are in building their game. Because it always generates game-ready assets, it's easy for you to just drop it into the game and keep moving as you're building and iterating. …
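Whitten's point about game-ready assets needing "not just the texture but also a normal map" can be made concrete. Below is a minimal numpy sketch of one common way to derive a tangent-space normal map from a grayscale height map via finite differences − this is a generic technique for illustration, not Unity's or Muse's actual pipeline:

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D height map (H x W, values in [0, 1]) into a
    tangent-space normal map (H x W x 3, values in [0, 1])."""
    # Finite-difference gradients of the height field.
    dy, dx = np.gradient(height.astype(np.float64))
    # The surface normal tilts against the slope; strength scales relief.
    nx, ny = -dx * strength, -dy * strength
    nz = np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    normals = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Remap from [-1, 1] to [0, 1] so it can be stored as an RGB texture.
    return normals * 0.5 + 0.5

# A perfectly flat height map yields the neutral normal (0.5, 0.5, 1.0),
# the familiar uniform light-blue of an "empty" normal map.
flat = np.zeros((4, 4))
nm = height_to_normal_map(flat)
```

A generative texture tool that also emits this companion map (at the resolution the target material expects) is what makes an asset droppable into an engine rather than merely a picture.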


I personally believe that AI is just a continuum on a set of tools. As an example, obviously you can open up a blank canvas in a tool like Photoshop and start drawing … to create something. People actually create many assets that way. But then also, there's a really large number of procedural tools (that already automate creation at Unity) such as Speedtree.

If you've seen vegetation in a video game, there’s a very high likelihood you’ve seen actual Speedtree. Now with Speedtree, you don’t draw a tree; you tell it the type of biome you're in, and it will procedurally generate vegetation that matches that and then give you controls around it. It’s not AI, but it's procedural generation, where you infer intent and it helps you.
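The "tell it the biome, get matching vegetation" idea Whitten describes can be sketched in a few lines. The biome presets and numbers below are purely illustrative assumptions, not Speedtree's actual parameters − the point is only that the tool infers placement and variation from a high-level intent:

```python
import random

# Hypothetical biome presets: spawn density per terrain cell and a
# plausible trunk-height range in meters. Illustrative values only.
BIOMES = {
    "temperate_forest": {"density": 0.6,  "height_m": (8.0, 25.0)},
    "desert":           {"density": 0.05, "height_m": (1.0, 4.0)},
}

def scatter_vegetation(biome, grid=10, seed=42):
    """Scatter tree instances over a grid x grid terrain patch so the
    result matches the biome's density and size range."""
    preset = BIOMES[biome]
    rng = random.Random(seed)  # seeded, so results are reproducible
    instances = []
    for x in range(grid):
        for y in range(grid):
            if rng.random() < preset["density"]:
                lo, hi = preset["height_m"]
                instances.append((x, y, rng.uniform(lo, hi)))
    return instances

forest = scatter_vegetation("temperate_forest")
desert = scatter_vegetation("desert")
```

The creator states intent ("temperate forest") and then adjusts exposed controls (density, height range), which is the procedural middle ground Whitten places between hand-drawing and fully natural-language generation.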

And then you can keep moving back towards AI where you have more and more natural input and language capabilities to describe the type of asset that you're doing … to take that asset a bit forward. We look at all of those and say, ‘How do we continue to make sure there's a really great workflow for an individual or team working together to create assets, (develop) greater variation on those assets, and to be able to use them directly in their games as easily as possible?’

AI Business: The logical question after that is where do you get your sources and are you protected from any kind of copyright issues that might come up?

Whitten: (This issue is) one that we're really focused on. … We have what we call Muse chat: You can ask questions about Unity, and it is trained on some large language models but the actual information that it delivers is based on our proprietary information − that's what makes it better. It knows all of our release notes, our documentation, the differences among versions of Unity, our source code, all that sort of stuff. So when you ask it, it can give you a much more targeted piece of content.
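What Whitten describes − a general language model whose answers are grounded in Unity's release notes, documentation, and source code − matches the retrieval-grounding pattern. The sketch below is an assumption about how such a system could work in miniature (the documents and the word-overlap scoring are stand-ins; a production system would use embedding search):

```python
# Hypothetical proprietary corpus; real entries would be Unity's
# release notes, docs and source, as Whitten describes.
DOCS = [
    "Unity 2022 LTS release notes: Sentis supports on-device inference.",
    "Editor documentation: the Muse panel lives in the Window menu.",
    "Upgrade guide: differences between Unity 2021 and 2022 APIs.",
]

def retrieve(question, docs=DOCS, top_k=1):
    """Rank documents by word overlap with the question. A real
    system would use embeddings, but the grounding idea is the same:
    retrieved passages, not the model's memorized training data,
    supply the facts handed to the language model as context."""
    q = set(question.lower().split())
    scored = sorted(docs, key=lambda d: -len(q & set(d.lower().split())))
    return scored[:top_k]

hits = retrieve("what are the differences between Unity versions?")
```

Feeding `hits` into the prompt is what makes the answer "targeted": the model composes prose, while the proprietary corpus supplies the version-specific facts.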

On the generative side, we are focused on making sure that we have ethically sourced assets. … For us as part of being game-ready is you feeling confident that … it's content that you can use in the game that you're actually shipping. We're very focused on using our own datasets and working with the community to make sure that those pieces work.

AI Business: How do you exactly do that? OpenAI still can't figure it out.

Whitten: The generative parts, like texture creation and those elements, we haven't released those. When these come out, we'll talk through them, but they're based on our own data that we're training on … so that we can be sure of the provenance of the content that's created. …

When those pieces come out in the next several weeks, we'll have some more details to share about it − but it's a core focus for us. First off, it's the right thing to do. Secondly, it won’t be a valuable workflow if you don't feel like you can actually use it.

AI Business: Some places, like Steam, are banning AI-generated assets. How do you get around that?

Whitten: The core focus goes back to the point we made around the source of data. We are building systems that you can feel confident are commercially licensable, and if that's the case, you're going to be able to ship those games anywhere. Now, the world is evolving quickly and we'll obviously continue to adapt. But our focus, especially on these generative issues, is to make sure that we're ethically sourcing content, that creators are in control and that you are getting assets that you have the legal right to ship as part of the game.

AI Business: Let's say you get sued or a creator gets sued for copyright issues. Who takes responsibility?

Whitten: I don't think there's anything different here than what’s been standard across the industry. You use Unity to use a bunch of different assets and tools in a bunch of other places. … What we're focused on is making sure that we can build tools that work with data (in which) we know where that data came from so that you as a creator can use that to build assets that will work in your game.

... So you know, we're (already) training on our own data. One of the (tools) that we're showing a sneak peek of is one that actually allows you to work with animation data. A hard problem is getting a character to perform a certain animation in a game. Today, you have to take the character and pose it moment by moment … and what we want is (for the creator to prompt) ‘I want to get this character to jump rope,’ and the character will jump rope, just by using standard generative AI techniques based on training data.

We actually have a very large corpus of extremely high quality animation data that we're using to build our own model … that we can then use to build upon those. Tool by tool, we're thinking about an approach to the data that we can use so that the output is based on ethically sourced content − either our content or content where we understand the provenance.

Obviously, this entire space is an extraordinarily fast evolving one. There's lots of zigs and zags ahead of us. We're going to keep paying attention and make sure that we're learning as fast as the rest of the industry as we go through this, but we're clear that what we want is to create AI that makes it easy for there to be more creators in the world.

AI Business: Can you talk about the second AI platform, Sentis?

Whitten: We've built Sentis to enable people to embed neural networks directly in their game. Now, here's the really key part of that. We don't mean talk to a neural network in the cloud, because obviously you can do that today. But the problem with that is it can be quite expensive. If you suddenly had a million players on your game, and they were all doing an inference in every single frame of the game, that would be quite expensive.

What we do is we make it possible to take that neural network and embed it in your game to target any platform that Unity supports (and avoid potentially high cloud costs). ... What that unlocks is amazing. … Your world comes alive: an NPC character can suddenly respond to you in a natural way. … You get instantaneous inputs. That means that game developers have new ways to give more creativity to players − to let them do different things or create effects in their games they could not have otherwise.
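The economics Whitten describes − per-frame inference on every player's device instead of a cloud round-trip per frame − can be illustrated with a toy model. This is a plain-Python/numpy sketch of the pattern, not Sentis itself (Sentis runs inside Unity's C# runtime, typically on networks exported to ONNX); the network weights here are random placeholders standing in for a trained model:

```python
import numpy as np

# A tiny fixed two-layer MLP standing in for an exported network.
# Random weights are placeholders; a real game would load trained ones.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)

def infer(obs):
    """One forward pass: 8 observation features -> 4 action scores."""
    h = np.maximum(obs @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2

def game_loop(frames=60):
    """Run inference once per frame, entirely on-device: no network
    round-trip and no per-player cloud bill, however many players
    are in the game simultaneously."""
    actions = []
    for _ in range(frames):
        obs = rng.normal(size=8)  # stand-in for per-frame game state
        actions.append(int(np.argmax(infer(obs))))
    return actions

actions = game_loop()  # one chosen action per simulated frame
```

At 60 frames per second, a million players would otherwise generate 60 million cloud inferences per second; shipping the network inside the game moves that cost to hardware the player already owns.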

… That world is alive and every NPC (nonplayer character) talks to you. You can ask it, ‘What is your favorite breakfast?’ And even if a developer never thought about (programming) that, it'll answer. I think the part that's going to be amazing is when game designers (realize) they can really do any type of inference that they can imagine. Creativity will unlock as people come up with new types of games that just can't be done today. That is very fundamental to the future of how AI will impact all games, and we're really excited about what we're doing with Unity Sentis.

AI Business: When will the platforms be widely available?

Whitten: We have a target but obviously, part of the goal of the beta is to get feedback from customers and make sure that it's well tuned. As soon as we can get through the beta feedback we'll be ready to go from closed to open beta to GA (general availability).


About the Author

Deborah Yao


Deborah Yao is a Stanford grad who has worked at Amazon, the Wharton School and The Associated Press.
