
Everything You Need To Know About NVIDIA Omniverse

Omniverse is cloud-native, scales across multiple GPUs, runs on any RTX platform, and streams remotely on any device.


The metaverse has become the talk of the town lately. Last month, Facebook CEO Mark Zuckerberg announced that the social media giant plans to become a metaverse company. Recently, alongside Blender and Adobe, NVIDIA revealed a major expansion of NVIDIA Omniverse, the world’s first simulation and collaboration platform that delivers the foundation of the metaverse.

What is the metaverse?

The metaverse is a shared virtual 3D world, or worlds, that are interactive, immersive, and collaborative. Online social games, like battle royale juggernaut Fortnite, and user-created virtual worlds like Roblox and Minecraft, reflect some elements of the idea. At the same time, video-conferencing tools, which have linked remote colleagues together amid the global pandemic, are another hint at what’s to come.

Or, as depicted in the movie Ready Player One and the 1992 sci-fi novel Snow Crash, it could go well beyond video-conferencing and online games.

Rev Lebaredian, vice president of simulation technology at NVIDIA, said that the metaverse would become a platform not tied to any one app or any single place — digital or real. As virtual places will be persistent, so will the objects and identities of those moving through them, allowing digital goods and identities to move from one virtual world to another, even into our world, with augmented reality. “Ultimately, we are talking about creating another reality, another world that’s as rich as the real world,” said Lebaredian.

Enter NVIDIA Omniverse

Those ideas are already in the works thanks to NVIDIA Omniverse, which, simply put, is a platform for connecting 3D worlds into a shared virtual universe. It is used across industries for projects that involve collaboration and the creation of ‘digital twins’, simulations of real-world buildings and factories. For example, BMW Group uses Omniverse to plan a future factory as a complete digital twin, designed entirely in digital form and simulated in NVIDIA Omniverse.

Richard Kerris, vice president of the Omniverse development platform at NVIDIA, said that Omniverse connects worlds by enabling the vision of the metaverse to become a reality. He said they are advancing this revolutionary platform with inputs from developers, partners, and customers. As a result, everyone from individuals to large enterprises can work with others to build unique virtual worlds that look, feel, and behave just like the physical world.  

Lebaredian said that just as hypertext markup language, or HTML, is the common language of the web, USD will emerge as the common language to advance and support the metaverse.

Inside Omniverse 

NVIDIA Omniverse has integrated the Universal Scene Description (USD) interchange framework, whose development Pixar began in 2012, with technologies for modelling physics, materials, and real-time path tracing.

In 2016, Pixar released USD as open-source software. It provides a rich, common language for defining, packaging, assembling, and editing 3D data for a growing array of industries and applications: media and entertainment, gaming, architecture, engineering and construction, manufacturing, telecommunications, infrastructure, and automotive.
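One reason USD can serve as a common interchange language is that its text form, USDA, is human-readable. The snippet below writes a minimal hand-authored scene file; the prim names and values are illustrative examples, not drawn from any real Omniverse project.

```python
# Illustrative only: a minimal USD scene authored by hand in the
# human-readable .usda text format. Prim names and values are examples.
minimal_scene = """#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 0.5
        float3 xformOp:translate = (0, 1, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
"""

# Write the scene to disk; any USD-aware tool (or an Omniverse
# Connector) could then open and edit this same file.
with open("scene.usda", "w") as f:
    f.write(minimal_scene)
```

Because every participating application reads and writes the same scene description, edits made in one tool can appear in another without lossy format conversion.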

Currently, NVIDIA has 12 Omniverse Connectors to major design tools, with another 40 on the way. The Omniverse Connector SDK sample code is available for download now.


NVIDIA Omniverse can be divided into three areas. 

  • The first is Omniverse Nucleus, a database engine that connects users and enables the interchange of 3D assets and scene descriptions. Once connected, designers working on shading, animation, lighting, modelling, layout, and special effects can collaborate to create a scene, and multiple users can connect to Nucleus at once.
  • The second aspect of Omniverse is the composition, rendering, and animation engine: the virtual world simulation.
  • The third part is the NVIDIA CloudXR. It includes client and server software for streaming extended reality content from OpenVR applications to Android and Windows devices, allowing users to dive into and out of Omniverse.
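The collaboration model Nucleus enables can be pictured as a publish/subscribe database: connected clients push edits to a shared scene, and every other subscriber is notified. The sketch below is purely conceptual, a toy illustration of that model rather than the Nucleus API; all class and method names are invented.

```python
# Conceptual sketch of a shared, publish/subscribe scene database,
# illustrating the collaboration model described above.
# This is NOT the Nucleus API; all names here are hypothetical.
class SceneDatabase:
    def __init__(self):
        self.assets = {}       # asset name -> property dict
        self.subscribers = []  # callbacks of connected clients

    def connect(self, callback):
        """Register a client to be notified of every scene edit."""
        self.subscribers.append(callback)

    def update(self, asset, **properties):
        """Apply an edit to the shared scene and broadcast it."""
        self.assets.setdefault(asset, {}).update(properties)
        for notify in self.subscribers:
            notify(asset, properties)

# Two 'clients' (e.g. a modelling tool and a lighting tool) share a scene.
nucleus = SceneDatabase()
seen = []
nucleus.connect(lambda asset, props: seen.append((asset, props)))
nucleus.update("Ball", material="rubber")
```

The key design point is that the scene lives in one shared store, so each tool edits the same data rather than exchanging exported copies.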

Simulation of the virtual world

Omniverse has been built from scratch to mimic the real world. Thanks to NVIDIA RTX graphics technologies, it is fully path traced, simulating in real time how each ray of light bounces around a virtual world. It simulates physics with NVIDIA PhysX and materials with NVIDIA MDL, the material definition language.
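At its core, a rigid-body physics engine such as PhysX repeatedly advances object states by small time steps, applying forces and resolving collisions each frame. The toy example below sketches that loop from scratch for a single bouncing ball; it is an illustration of the idea, not PhysX or Omniverse code.

```python
# Toy illustration of the per-frame update a rigid-body physics engine
# performs. This is a from-scratch sketch, not PhysX code.
GRAVITY = -9.81      # m/s^2
DT = 1.0 / 60.0      # 60 Hz simulation step
RESTITUTION = 0.5    # fraction of speed kept after a bounce

def step(y, vy):
    """One semi-implicit Euler step; bounce off the ground plane at y=0."""
    vy += GRAVITY * DT   # integrate acceleration into velocity
    y += vy * DT         # integrate velocity into position
    if y < 0.0:          # collision with the ground plane
        y = 0.0
        vy = -vy * RESTITUTION
    return y, vy

# Drop a ball from 2 m and simulate 10 seconds of motion.
y, vy = 2.0, 0.0
for _ in range(600):
    y, vy = step(y, vy)
```

After enough simulated time, the ball's bounces decay and it settles near the ground, which is the kind of physically plausible behaviour a simulated world needs before robots or vehicles can be trained in it.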

Omniverse is fully integrated with NVIDIA AI, which is core to advancing robotics, self-driving vehicles, etc. Omniverse is cloud-native, scales across multiple GPUs, runs on any RTX platform, and streams remotely on any device. “You can teleport into the Omniverse with virtual reality, and AI can teleport out of the Omniverse with augmented reality,” as per NVIDIA. 


Training robots 

One of the essential features of the Omniverse is that it obeys the laws of physics. As a result, it can simulate particles and fluids, materials and even machines, right down to their springs and cables. 

In its blog post, NVIDIA said that modelling the natural world in a virtual one is a fundamental capability for robotics. It allows users to create a virtual world where robots can train. “Once the minds of these robots are trained in the Omniverse, roboticists can load those brains onto an NVIDIA Jetson and connect it to a real robot,” explained the NVIDIA team. “In the future, a factory will be a robot, orchestrating many robots inside, building cars that are robots themselves.”

Wrapping up

NVIDIA’s Lebaredian said humans have been exploiting how we perceive the world for millennia, and we have been hacking our senses to construct virtual realities through music, art and literature. “Next, add interactivity and the ability to collaborate,” he added. 

Most importantly, better screens, head-mounted displays or VR headsets like Oculus Quest, and mixed reality devices like Microsoft’s HoloLens are a step towards a fully immersive experience. “The metaverse is the means through which we can distribute those experiences more evenly,” said Lebaredian.

Amit Raja Naik

Amit Raja Naik is a seasoned technology journalist who covers everything from data science to machine learning and artificial intelligence for Analytics India Magazine, where he examines the trends, challenges, ideas, and transformations across the industry.