
Microsoft Mesh vs NVIDIA Omniverse: A tale of two metaverses

Microsoft Mesh enables developers to build immersive, multi-user, cross-platform mixed reality apps.

“As the digital and physical worlds come together, we are creating an entirely new platform layer, which is the metaverse. The metaverse enables us to embed computing into the real world and to embed the real world into computing, bringing real presence to any digital space. Mesh for Microsoft Teams will allow you to connect with presence and have a shared immersive experience directly in Teams,” said Satya Nadella during Microsoft Ignite 2021.

Challenges in building multi-user MR apps

Simon Skaria, partner director of product, Microsoft Mesh, said mixed reality is primed to go mainstream. He outlined the challenges in building MR apps:

  • It is hard to bring high-fidelity 3D models into mixed reality in the file formats customers already have.
  • Representing people in mixed reality with realism requires a lot of time and resources.
  • It is hard to keep a hologram stable in one location across time and devices.
  • Synchronising actions and expressions in a geographically distributed session is complex.

What is Mesh, exactly?

Microsoft Mesh, built on Microsoft Azure, helps developers build immersive, multi-user, cross-platform mixed reality apps. Customers can use Mesh for virtual meetings, remote assistance, virtual learning and more.


How it works

  • Holographic rendering: Mesh lets developers choose seamlessly between local, standalone rendering and cloud-connected remote rendering within an app, for each scene and model. Holographic rendering also supports most 3D file formats, rendering them natively in Mesh-enabled apps.
  • Multi-user sync: Mesh gives every participant in a collaborative session a common perspective on a hologram. Microsoft claims that motions, expressions and holographic transformations are synchronised within 100 milliseconds of latency, even when participants are not in the same physical location (a sketch of this kind of synchronisation follows below).
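
Mesh handles this synchronisation as a managed Azure service, so the sketch below is purely illustrative rather than Mesh's actual API: a hypothetical in-memory relay (the PoseUpdate, SessionRelay and participant names are all invented for this example) that fans each participant's pose updates out to the others and checks the observed delay against the 100 ms figure quoted above.

```python
# Hypothetical sketch only: Mesh's real sync layer is a managed Azure service.
# This just illustrates relaying pose updates to every other participant and
# checking them against the 100 ms latency figure Microsoft cites.
import asyncio
import time
from dataclasses import dataclass

LATENCY_BUDGET_MS = 100


@dataclass
class PoseUpdate:
    participant_id: str
    position: tuple       # (x, y, z) in the shared scene's coordinate frame
    rotation: tuple       # quaternion (x, y, z, w)
    sent_at_ms: float     # sender timestamp used to measure end-to-end delay


class SessionRelay:
    """In-memory stand-in for a session server that fans out updates."""

    def __init__(self):
        self.queues = {}  # participant_id -> asyncio.Queue

    def join(self, participant_id):
        self.queues[participant_id] = asyncio.Queue()
        return self.queues[participant_id]

    async def publish(self, update):
        for pid, queue in self.queues.items():
            if pid != update.participant_id:   # don't echo back to the sender
                await queue.put(update)


async def sender(relay, pid, n_updates=5):
    for i in range(n_updates):
        pose = PoseUpdate(pid, (0.1 * i, 1.6, 0.0), (0, 0, 0, 1),
                          time.monotonic() * 1000)
        await relay.publish(pose)
        await asyncio.sleep(0.03)              # roughly 30 ms between updates


async def receiver(queue, pid, expected):
    for _ in range(expected):
        update = await queue.get()
        delay_ms = time.monotonic() * 1000 - update.sent_at_ms
        verdict = "within budget" if delay_ms <= LATENCY_BUDGET_MS else "late"
        print(f"{pid} saw {update.participant_id} at {update.position} "
              f"after {delay_ms:.1f} ms ({verdict})")


async def main():
    relay = SessionRelay()
    alice_inbox, bob_inbox = relay.join("alice"), relay.join("bob")
    await asyncio.gather(
        sender(relay, "alice"), sender(relay, "bob"),
        receiver(alice_inbox, "alice", 5), receiver(bob_inbox, "bob", 5),
    )


asyncio.run(main())
```

A production service would run the relay close to the participants and interpolate between updates to hide network jitter, but the fan-out pattern is the same.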

NVIDIA Omniverse

In late 2020, NVIDIA announced an open beta for Omniverse. With Omniverse, users can create 3D models of the physical world. Omniverse Avatar will help developers generate, animate, simulate and render state-of-the-art interactive avatars, CEO Jensen Huang said at GTC 2021.

What is Omniverse?

NVIDIA describes Omniverse as a scalable, multi-GPU, real-time reference development platform for 3D simulation. It is based on Pixar's Universal Scene Description (USD) and NVIDIA RTX technology.
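
Because Omniverse's native scene format is USD, the assets that flow through the platform can be authored with Pixar's open-source Python bindings. The following is a minimal sketch, assuming the usd-core package is installed; the file name and prim paths are arbitrary examples.

```python
# Author a tiny USD stage of the kind Omniverse apps exchange via Nucleus.
# Requires Pixar's USD Python bindings (e.g. `pip install usd-core`).
from pxr import Gf, Usd, UsdGeom

stage = Usd.Stage.CreateNew("hello_omniverse.usda")      # arbitrary file name
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.y)

# A transformable root prim with a cube underneath it.
root = UsdGeom.Xform.Define(stage, "/World")
cube = UsdGeom.Cube.Define(stage, "/World/Cube")
cube.GetSizeAttr().Set(2.0)

# Author a translation on the root so collaborators see a non-trivial edit.
UsdGeom.XformCommonAPI(root).SetTranslate(Gf.Vec3d(0.0, 1.0, 0.0))

stage.GetRootLayer().Save()
print(stage.GetRootLayer().ExportToString())             # the .usda text form
```

In an Omniverse workflow the same layer would typically live on a Nucleus server (an omniverse:// URL) rather than local disk, with Connectors keeping it live-synced with the connected client applications.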

Omniverse consists of five essential parts.

  • Omniverse Nucleus is the database and collaboration engine of the Omniverse platform. 
  • Omniverse Connectors are plugins that let client applications connect to Nucleus and publish and subscribe to individual assets. NVIDIA aims to provide the highest-fidelity connections to Omniverse, enabling a live-sync workflow between client applications and Omniverse Apps.
  • Omniverse Kit is a toolkit for developers to build their own extensions, apps, microservices or plugins for the Omniverse ecosystem. As per NVIDIA, the SDK can run headless as a microservice or with a UI (a minimal extension skeleton follows this list).
  • Omniverse Simulation gives developers access to NVIDIA's scalable, physically accurate world simulation, powered by NVIDIA's core physics simulation technologies such as PhysX.
  • The RTX Renderer is an advanced, multi-GPU scalable renderer accelerated by NVIDIA RTX technology; Omniverse also natively supports renderers compliant with Pixar's Hydra architecture.
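
To make the Kit bullet concrete, here is a minimal sketch of the standard Python extension pattern Kit uses; the class name, log messages and extension ID are placeholders, and the code only runs inside a Kit-based app (Create, Code, etc.) that discovers it through an extension.toml manifest.

```python
# Minimal Omniverse Kit extension skeleton; runs only inside a Kit-based app,
# which locates this class through the extension's extension.toml manifest.
import omni.ext


class HelloOmniverseExtension(omni.ext.IExt):
    """Hypothetical extension that just logs its lifecycle events."""

    def on_startup(self, ext_id: str):
        # Called when Kit enables the extension; ext_id is its versioned name.
        print(f"[hello.omniverse] startup: {ext_id}")

    def on_shutdown(self):
        # Called when the extension is disabled or the host app shuts down.
        print("[hello.omniverse] shutdown")
```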

Omniverse Cloud

Like GTC 2021, this year's GTC featured a slew of announcements around Omniverse, including Omniverse Cloud, a suite of cloud services that gives artists, creators, designers and developers instant access to the NVIDIA Omniverse platform for 3D design collaboration and simulation from billions of devices.

NVIDIA said Nucleus Cloud is a one-click-to-collaborate sharing tool that helps artists edit large 3D scenes from anywhere without transferring massive datasets. Omniverse Create lets technical designers, artists and creators build 3D worlds in real time. Omniverse View lets non-technical users review Omniverse scenes, streaming full simulation and rendering capabilities.
