November 8, 2023
Leslie Shannon, head of trend scouting at Nokia, talks about AI, the metaverse and the future of VR on the AI Business podcast. She explains why she thinks the metaverse is not dead and dives into some of the major announcements from Meta Connect 2023.
Listen to the conversation or read the edited transcript.
Give us a brief overview. What is it that you do in trend scouting for Nokia?
I'm a little different from other trend scouts. Usually, trend scouting is linked to either acquisitions or partnering. For me, it's neither. I just scout ideas. I'm looking for companies outside the world of telecommunications that are coming up with innovations that are going to be calling on the network.
I’m particularly interested in things that are going to be calling on the network in ways that don't currently exist. That’s how I got into augmented reality (AR) and virtual reality (VR): purely from the connectivity aspect. There are a lot of things with AR and consumer-grade AR eyeglasses where connectivity is going to have to be there in a big way, and so we need to start building those networks now.
You spoke about AR, VR and the metaverse at this year’s Mobile World Congress. That was in March. What’s changed in that time?
One of the biggest shifts is that generative AI has stolen all the headlines. One of the most common questions that I get when I talk about AR, VR, or even the metaverse is people say, ‘Well, isn't that dead?’
I say no. If you look at the industry itself, it’s thriving - it's just a little bit quiet in the shadows right now. Ori Inbar, who leads the Augmented World Expo conferences, said it best at the end of May - 'Extended reality (XR) is the interface to AI.’ I think that encapsulates it perfectly.
There's a lot of attention on AI. But AI and XR are not in two separate buckets. They interact with each other. And I think Meta demonstrated that well at Meta Connect where they showed that they were bringing AI into their Ray-Ban glasses, which I am personally excited about.
Let’s talk Meta Connect. You attended in VR, how was that experience?
I loved it. I absolutely loved it. I've been to the Meta campus, and I could see that Mark Zuckerberg was standing on a stage that was in the center of the Meta campus. But what they had done in VR was take that big visual of the stage, which was 3D, and then integrate it into something completely different. The surroundings you had in VR were forests; it was nothing like the Meta campus.
They highlighted a few of the Horizon Worlds areas for users to try. I was trying the fishing game, but the controls were not intuitive, and I was a little horrified that they were highlighting this. I thought, 'This is not the experience you want to be giving people if you're trying to sell Horizon Worlds.'
The event itself and the use of VR were, I thought, more successful than any of the announcements they made.
AI can be used to generate 3D assets for virtual environments. What have you seen in this space?
I saw just recently a quote from a training company, Talespin, that AI has changed everything: a training scenario that used to take five days to build can now be done in 30 minutes using AI.
With AI, and particularly generative AI, you can use natural language to describe what you want, and then voila, you get what you asked for. For me, this recalls when I was in high school and pocket calculators became affordable for the first time. There was a huge debate in my calculus, chemistry and physics classes about whether or not students should be allowed to use pocket calculators, because the idea was that if they weren't doing the math on paper, then they weren't going to get a feel for the underlying mechanics. Fast forward, and the TI-84 is standard issue for all high school students.
It’s the realization that you're freeing up time to consider higher concepts, and we're going to be seeing a huge amount of that in this advent of AI. It's a tool that helps wipe away the grunt-level work at the bottom that is essential, but not necessarily massively value-adding. And if everybody can access that foundation with AI, then we free up our brain space to build the wonderful value-added stuff on top.
One metaverse-focused company using AI to build assets is Roblox. What’s your view on that?
One of the powerful things about Roblox has been its willingness to read the room and its ability to give creative authorship to its very young user base, teaching them from early days, that they can go out there and flex their creative muscles and benefit personally from it.
Overall, the success of Roblox is a spectacular model for what happens when you build something good, and then give the tools to people to then do more with it, and the element of reward as well. All those things are sewn together so beautifully in the Roblox platform.
You expressed interest in the new Meta Ray-Ban smart glasses. Why are you so excited by them?
The ideal iteration of the metaverse is not the fully immersive world, it's augmented reality, where we remain in our beautiful physical world with the wonderful people whom we love. And the amazing enhancements to our life that we get from computing are integrated into that physical world through digital representations.
But how do we get there? There needs to be a product that people will be willing to start using to enable that scenario. What is the service or the product that is going to get people to put computer-enhanced eyeglasses on their heads? It's a big ask, especially if you're talking to people who don't wear glasses at all, so there needs to be some kind of service that is attractive enough and instantly beneficial enough that people can intuitively understand from the start.
One of the things that I've been excited about is the possibility of real-time subtitles, whether translated or not. I think that has huge potential for ending a quite predatory hearing aid market. And hands-free AI that is available for you to just chat with as you go about your day, something that's a good deal smarter than a voice assistant, could also be the way in.
Didn’t we already have that with Google Glass some 10 years ago?
Part of the problem with Google Glass was that nobody knew what it was for. It was very minimal in terms of the facial structure with this ginormous camera that was like a little cannon pointed at people.
But that was 10 years ago. That was a time when TikTok didn't exist. Now, nobody questions having a camera facing outward all the time. That whole context has shifted.
Another early direction I see is heads-up displays in cars. Minimal heads-up displays becoming integrated with our visual feed is going to be a powerful place to start. No, it's not going to be snazzy graphics; it's going to be a little arrow that soon you will not be able to live without. It’s going to be minimal, but effective and problem-solving. And that'll be the thing that gets us started wearing all these things.
Let’s talk about the other Meta Connect announcements, anything else that excited you?
I got super excited about the Xbox Game Pass for a technical reason. We’re monitoring AR very closely, and right now the computing functionality resides on the glasses themselves. But if we're going to have glasses that look slim and fashionable, in an eyeglasses form factor that sits on our face, the processing must come off the device and go somewhere else. And that computing, especially if you're dealing with generative AI, is going to need to go into the network, where it can sit on bigger servers. That’s a significant change to the network architecture, and my industry needs to start building that architecture.
But here's the thing: for latency reasons, you have to have the computing relatively close to the end users. Telco companies, though, are not going to start sticking servers in their networks close to end users just for AR, because how do you monetize that? The AR stuff that we're talking about is deep in the future.
Cloud gaming has the same architecture: a server sitting in the network not too far away from you, for latency reasons. And guess what? Cloud gaming is monetizable now. So bringing cloud gaming in and integrating it into the whole XR world is actually the thing that will provide the impetus for the phone companies and the other technology companies to start putting servers out there, close enough to the end users, that ultimately will power AR experiences. It's a massively wonderful development; it's something that we needed.
What other tech trends are you excited about?
One of the things that I'm seeing for the next decade is implanted connectivity to augment us as human beings. One early example is from the University of California at San Francisco; they've been working with patients who are unable to speak. They’re doing invasive surgery, placing electrodes on the surface of the brain, to try to give non-vocal people a voice. The team is measuring the brain activity and then extrapolating from that to give the person a voice via an avatar, with the avatar able to speak for them.
This concept is something that’s done with a wire plugged into the person's head and making that wireless will need very low radiation transmission so that it can be safely worn by humans. This goes well beyond AR and XR, but I think there will be some kind of integration with this. Because if eyeglasses are the form factor for human-computer interface going forward, I wonder if we will be going beyond that and getting straight into brain-sensing, on a commercial level and a personal level.
Finally, what else are you excited about? You said the metaverse isn't dead. Give us a brief summary of what we talked about and your view on tech trends to come.
One of the fundamental questions is why AR? Why metaverse? Why digital anything? And for me, the answer is that right now, to access computing, our gaze dead-ends on a two-dimensional screen. If you're sitting at your desk and looking at your laptop, that's a functional interface. We don't need to mess with that. But when you're out and about in the world, or even just moving through your own home, it's ridiculous that we're all looking at our smartphones all the time, instead of at each other. It's a problem, the fact that we have to divorce ourselves from our physical world to access information and entertainment on these two-dimensional screens.
For me, the main driver is to put your gaze back on the actual physical world and the people in it, where it belongs, hands-free. That's why the inclusion of Meta AI in glasses is so important: it helps demonstrate that. That's why it's like the heads-up display that we get in cars; people find that essential. Once we start going down these paths, we get some of the most functional aspects of computing integrated into our lives in a much more intuitive, context-driven, hands-free way. I don't think we're going to be able to come back from it. Once we start getting information visually incorporated into our world, we're not going to want to go back to carrying a little screen and looking at that all the time, and missing the wonderful things that are happening around us in the physical world. That, for me, is the purpose and the driver.