Zuckerberg might have just infused life into the Metaverse, and by 'life' we mean it literally. He recently appeared on a podcast with Lex Fridman, and guess what? It was the first interview to take place in the Metaverse.
The most fascinating thing about the podcast was that the avatars of Fridman and Zuckerberg were not cartoons but photorealistic. Gone are the days of cartoon avatars in the Metaverse.
“Mark and I are hundreds of miles apart in physical space, but it feels like we’re in the same room because we appear to each other as photorealistic,” exclaimed Fridman.
One might wonder how this was made possible. These photorealistic avatars were created using Codec Avatars technology, which Meta started working on in 2019 as part of a Reality Labs (RL) Research project.
These are created by capturing the user’s facial expressions and body movements using cameras and sensors. This data is then compressed and transmitted to another device, where it is decoded and used to create a digital avatar of the user. The avatars can be used to interact with other users in VR or AR environments.
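The capture-compress-transmit-decode flow described above can be sketched in miniature. This is a hypothetical illustration only: real Codec Avatars use learned neural encoders and decoders over dense sensor data, whereas here a small dictionary of tracked facial parameters stands in for the captured signal, and generic compression stands in for the learned codec.

```python
import json
import zlib

def capture_expression():
    """Stand-in for camera/sensor capture: a few tracked facial parameters.
    (The parameter names here are illustrative, not Meta's actual schema.)"""
    return {"eye_openness": 0.82, "brow_raise": 0.15, "jaw_open": 0.05}

def encode(expression):
    """Serialize and compress the captured parameters for transmission."""
    return zlib.compress(json.dumps(expression).encode("utf-8"))

def decode(payload):
    """On the receiving device: decompress and recover the parameters
    that drive the remote user's avatar."""
    return json.loads(zlib.decompress(payload).decode("utf-8"))

# One round trip: capture on the sender's headset, decode on the receiver's.
captured = capture_expression()
packet = encode(captured)        # compact bytes sent over the network
received = decode(packet)        # drives the photorealistic avatar remotely
assert received == captured
```

The point of the round trip is that only a compact encoded packet crosses the network, and the receiving device reconstructs the expression locally, which is what keeps the avatars responsive in real time.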
Fridman was at a loss for words to describe the 3D Codec Avatars. "It's hard to put into words how awesome this was for someone like me who values the intimacy of in-person conversation."
"With spatial audio, this technology is incredible. I think it's the future of how human beings connect to each other in a deeply meaningful way on the internet," he added.
"These avatars can capture many of the nuances of facial expressions," beamed Fridman. The idea of the Metaverse was always to connect people living far away from each other. The problem was that, with cartoon avatars, users couldn't really express their emotions.
However, these new avatars are capable of capturing human emotions by projecting small nuances, from the movement of the eyes to the twitching of the eyebrows.
“Eyes are a huge part of it; I mean, there are all the studies that show most communication happens through non-verbal cues, such as expressions. So, we try to capture that with the classical expressive avatar system,” said Zuckerberg.
Furthermore, Zuckerberg added that Meta is also working on a quick and easy way to generate an avatar using a mobile phone, which he calls Instant Codec Avatars.
Codec avatars have the potential to change the way humans interact. First, they can provide people with the opportunity to socialize with others in a virtual environment, even if they are not physically close to each other. This can be especially beneficial for people who live in isolated areas or who have difficulty leaving their homes.
Furthermore, it can help people to feel more connected to others by allowing them to see and hear each other in real-time. This can create a more personal and engaging experience than traditional text-based or audio-based communication. “My family lives across five countries, and I can’t wait for this technology to be widely available,” said Martin Harbech, group director, Meta.
Additionally, Codec avatars can be used to create virtual worlds that are tailored to the individual’s interests. This can provide people with a place to go where they can feel comfortable and accepted, and where they can connect with others who share their interests.
Imagine a world where the digital and the real come together. With Meta AI characters in the Metaverse, it is very much possible to create a shared environment where AI-generated avatars and humans occupy the same space. This idea aligns with Meta's recent introduction of new AI experiences across its family of apps, including Instagram and WhatsApp, featuring 28 AI characters with unique interests and personalities.
Zuckerberg anticipates that in the near future, business meetings will be conducted in the Metaverse alongside virtual AI characters. This seems to be the beginning of a new, simulated world.
“We will have meetings in the future where you’re basically sitting there physically, and then you have a couple of other people who are holograms. Then you have someone like Bob, the AI engineer on your team, who’s helping with things and can now be embodied as a realistic avatar and just join the meeting. I think that’s going to be pretty compelling.”
Jim Fan, Senior AI Scientist at NVIDIA, was thoroughly impressed with the podcast and aligned himself with the idea of the Metaverse. "The ultimate vision is to realize the scenes in Matrix: full-body, real-time avatars of both humans and AI, sharing the same virtual space, interacting with objects in physically realistic ways, receiving rich multimodal feedback, and forgetting that the world is but a simulation," he posted on X.
These Codec Avatars might completely change the fate of the Metaverse. Many internet users claim that the Metaverse is here to stay, and this time, it's real. The fact that Zuckerberg didn't abandon the Metaverse speaks volumes about his firm belief in his pet project. With Quest 3, an AR/VR headset, on the horizon, Meta might change the way we interact with others.
“I’m a big believer in the metaverse and always have been,” said Aleksa Gordic, formerly with Microsoft and Google DeepMind, who worked on this technology back in 2018-19 as part of the Microsoft HoloLens project.