The metaverse has become the talk of the town lately. Last month, Facebook CEO Mark Zuckerberg announced that the social media giant plans to become a metaverse company. Recently, alongside Blender and Adobe, NVIDIA revealed a major expansion of NVIDIA Omniverse, which the company bills as the world's first simulation and collaboration platform delivering the foundation of the metaverse.
What is the metaverse?
The metaverse is a shared virtual 3D world, or set of worlds, that is interactive, immersive, and collaborative. Online social games, like battle royale juggernaut Fortnite, and user-created virtual worlds like Roblox and Minecraft, reflect some elements of the idea. At the same time, video-conferencing tools, which have linked remote colleagues amid the global pandemic, offer another hint at what's to come.
Rev Lebaredian, vice president of simulation technology at NVIDIA, said that the metaverse would become a platform not tied to any one app or any single place, digital or real. Virtual places will be persistent, and so will the objects and identities of those moving through them, allowing digital goods and identities to move from one virtual world to another, and even into our world via augmented reality. “Ultimately, we are talking about creating another reality, another world that’s as rich as the real world,” said Lebaredian.
Enter NVIDIA Omniverse
Those ideas are already in the works thanks to NVIDIA Omniverse, which, simply put, is a platform for connecting 3D worlds into a shared virtual universe. It is used across industries for projects that involve collaboration and the creation of ‘digital twins’: simulations of real-world buildings and factories. For example, BMW Group uses Omniverse to plan a factory of the future, a complete digital twin designed entirely in digital form and simulated in NVIDIA Omniverse.
Richard Kerris, vice president of the Omniverse development platform at NVIDIA, said that Omniverse connects worlds by enabling the vision of the metaverse to become a reality. He said they are advancing this revolutionary platform with inputs from developers, partners, and customers. As a result, everyone from individuals to large enterprises can work with others to build unique virtual worlds that look, feel, and behave just like the physical world.
Lebaredian said that just as Hypertext Markup Language, or HTML, is the common language of the web, USD will emerge as the common language to advance and support the metaverse.
In 2016, Pixar open-sourced Universal Scene Description (USD), which provides a rich, common language for defining, packaging, assembling and editing 3D data for a growing array of industries and applications, spanning media and entertainment, gaming, architecture, engineering and construction, manufacturing, telecommunications, infrastructure, and automotive.
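For a sense of what that common language looks like, USD scenes can be written in a human-readable text form (.usda). The following minimal sketch uses made-up object names purely for illustration:

```usda
#usda 1.0
(
    doc = "Minimal illustrative stage: a transform containing one mesh"
)

def Xform "Factory"
{
    def Mesh "ConveyorBelt"
    {
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because files like this layer and compose, different tools and teams can each contribute their own edits to a shared scene, which is what makes USD attractive as an interchange format.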
NVIDIA Omniverse can be divided into three areas.
- The first is Omniverse Nucleus. It is a database engine that connects users and enables the interchange of 3D assets and scene descriptions. Once connected, designers working on shading, animation, lighting, modelling, layout and special effects can collaborate to create a scene. Multiple users can connect to Nucleus.
- The second aspect of Omniverse is the composition, rendering, and animation engine: the virtual world simulation.
- The third part is the NVIDIA CloudXR. It includes client and server software for streaming extended reality content from OpenVR applications to Android and Windows devices, allowing users to dive into and out of Omniverse.
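The Nucleus idea, many clients publishing edits to a shared scene and all of them being notified of changes, can be sketched in a few lines. This is a hypothetical toy model, not the real Nucleus protocol or API:

```python
# Toy model of a shared scene database: clients publish edits to named
# scene objects, and every connected client's callback is notified.
# (Hypothetical sketch -- the real Nucleus service is far richer.)

class SceneHub:
    def __init__(self):
        self.scene = {}        # object name -> attribute dict
        self.subscribers = []  # callbacks invoked on every edit

    def connect(self, on_change):
        """Register a client callback to receive change notifications."""
        self.subscribers.append(on_change)

    def publish(self, obj, **attrs):
        """Merge an edit into the shared scene and broadcast it."""
        self.scene.setdefault(obj, {}).update(attrs)
        for callback in self.subscribers:
            callback(obj, attrs)

hub = SceneHub()
log = []
hub.connect(lambda obj, attrs: log.append((obj, attrs)))

# Two collaborators edit different aspects of the same object.
hub.publish("Robot", material="steel")    # a shading artist
hub.publish("Robot", position=(1, 0, 2))  # a layout artist

print(hub.scene["Robot"])
```

The point of the sketch is that edits from different disciplines merge into one live scene rather than diverging into separate files.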
Simulation of the virtual world
The Omniverse has been built from scratch to mimic the real world. Thanks to NVIDIA RTX graphics technologies, it is fully path traced, simulating how each ray of light bounces around a virtual world in real time. It simulates physics with NVIDIA PhysX and simulates materials with NVIDIA MDL, or material definition language.
Omniverse is fully integrated with NVIDIA AI, which is core to advancing robotics, self-driving vehicles, and more. Omniverse is cloud-native, scales across multiple GPUs, runs on any RTX platform, and streams remotely to any device. “You can teleport into the Omniverse with virtual reality, and AI can teleport out of the Omniverse with augmented reality,” according to NVIDIA.
One of the essential features of the Omniverse is that it obeys the laws of physics. As a result, it can simulate particles and fluids, materials and even machines, right down to their springs and cables.
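As a rough illustration of the kind of simulation described above, not NVIDIA PhysX itself but a toy stand-in, a single mass on a damped spring can be stepped forward with semi-implicit Euler integration:

```python
# Toy spring-mass simulation (illustrative only; real engines such as
# PhysX use far more sophisticated solvers, constraints and collisions).

def step(pos, vel, dt, k=10.0, damping=0.5, rest=0.0, mass=1.0):
    """Advance one semi-implicit Euler step for a 1-D mass on a spring."""
    force = -k * (pos - rest) - damping * vel  # Hooke's law plus damping
    vel += (force / mass) * dt                 # update velocity first
    pos += vel * dt                            # then position
    return pos, vel

# Release the mass one unit from rest; damping makes it settle toward 0.
pos, vel = 1.0, 0.0
for _ in range(1000):
    pos, vel = step(pos, vel, dt=0.01)
print(pos)
```

A physics engine does essentially this, integrating forces over small timesteps, but for thousands of bodies at once, which is why springs, cables and fluids can all be handled by the same machinery.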
In its blog post, NVIDIA said that modelling the natural world in a virtual one is a fundamental capability for robotics. It allows users to create a virtual world where robots can train. “Once the minds of these robots are trained in the Omniverse, roboticists can load those brains onto an NVIDIA Jetson and connect it to a real robot,” explained the NVIDIA team. “In the future, a factory will be a robot, orchestrating many robots inside, building cars that are robots themselves.”
NVIDIA’s Lebaredian said humans have been exploiting how we perceive the world for millennia, and we have been hacking our senses to construct virtual realities through music, art and literature. “Next, add interactivity and the ability to collaborate,” he added.
Most importantly, better screens, head-mounted displays or VR headsets like Oculus Quest, and mixed reality devices like Microsoft’s HoloLens are a step towards a fully immersive experience. “The metaverse is the means through which we can distribute those experiences more evenly,” said Lebaredian.