“As the digital and physical worlds come together, we are creating an entirely new platform layer, which is the metaverse. The metaverse enables us to embed computing into the real world and to embed the real world into computing, bringing real presence to any digital space. Mesh for Microsoft Teams will allow you to connect with presence and have a shared immersive experience directly in Teams,” said Satya Nadella during Microsoft Ignite 2021.
Challenges in building multi-user MR apps
Simon Skaria, partner director of product for Microsoft Mesh, said mixed reality is primed to go mainstream. He outlined the challenges in building MR apps:
- It is hard to bring high-fidelity 3D models into mixed reality in the file formats customers already have.
- Representing people in mixed reality with realism requires a lot of time and resources.
- It is hard to keep a hologram stable in a location across time and devices.
- Synchronising actions and expressions in a geographically distributed session is complex.
What is Mesh exactly?
Microsoft Mesh, built on Microsoft Azure, helps developers build immersive, multi-user, cross-platform mixed reality apps. Customers can use Mesh for virtual meetings, remote assistance, virtual learning, etc.
How it works
- Mesh supports multiple devices, including Microsoft HoloLens, HP Reverb G2 and Oculus Quest 2 for 3D volumetric experiences, as well as phones and tablets (both iOS and Android).
- Holographic rendering: Mesh lets you choose seamlessly between local stand-alone rendering and cloud-connected remote rendering within your app, for each scene and model. Holographic rendering also supports most 3D file formats for native rendering in Mesh-enabled apps.
- The multi-user sync feature in Mesh gives users a common perspective on holograms in a collaborative session. Microsoft claims that all motions, expressions and holographic transformations are synchronised within 100 milliseconds of latency, even when participants are not in the same physical location.
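Microsoft has not published how Mesh's sync protocol works internally. As a rough illustration of the problem it solves, here is a toy last-writer-wins scheme in Python: each update to a shared hologram carries a timestamp, and late-arriving (stale) updates are discarded so a slow link cannot roll the scene back. All names here are hypothetical, not Mesh APIs.

```python
from dataclasses import dataclass


@dataclass
class HologramState:
    """Position and rotation of a shared hologram, plus when it was set."""
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)
    timestamp: float = 0.0


class SyncSession:
    """Toy last-writer-wins sync for one shared hologram (illustrative only)."""

    def __init__(self):
        self.state = HologramState()

    def apply_update(self, position, rotation, timestamp):
        # Accept an update only if it is newer than the current state,
        # so out-of-order messages from distant participants are ignored.
        if timestamp > self.state.timestamp:
            self.state = HologramState(position, rotation, timestamp)
            return True
        return False


# Two geographically distributed participants share one logical session state.
session = SyncSession()
session.apply_update((1.0, 0.0, 0.0), (0.0, 90.0, 0.0), timestamp=1.0)
stale = session.apply_update((9.0, 9.0, 9.0), (0.0, 0.0, 0.0), timestamp=0.5)
print(session.state.position)  # the stale update was rejected
```

A real system would also need interpolation and conflict handling, but the core idea — every participant converging on the same ordered state — is the same.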
In late 2020, NVIDIA announced an open beta for Omniverse, with which users can create 3D models of the physical world. The Omniverse Avatar will help developers generate, animate, simulate, and render state-of-the-art interactive avatars, said CEO Jensen Huang at GTC 2021.
What is Omniverse?
NVIDIA describes Omniverse as a scalable, multi-GPU real-time reference development platform for 3D simulation. It is based on Pixar’s Universal Scene Description (USD) and NVIDIA RTX technology.
Omniverse consists of five essential parts.
- Omniverse Nucleus is the database and collaboration engine of the Omniverse platform.
- Omniverse Connectors are plugins that let client applications connect to Nucleus and publish and subscribe to individual assets. NVIDIA aims to provide the highest-fidelity connections to Omniverse, enabling a live-sync workflow between client applications and Omniverse Apps.
- Omniverse Kit is a toolkit for developers to build their own extensions, apps, microservices, or plugins for their ecosystem. As per NVIDIA, the SDK can run headless as a microservice, or with a UI.
- The Omniverse platform gives developers access to NVIDIA’s scalable, physically accurate world simulation, powered by NVIDIA’s core physics simulation technologies.
- RTX Renderer natively supports renderers compliant with Pixar’s Hydra architecture. Omniverse also has an advanced, multi-GPU scalable renderer accelerated by RTX technology.
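The asset format the parts above exchange is Pixar's USD. As a flavour of what Connectors publish to Nucleus, here is a minimal USD ASCII (.usda) scene describing a single sphere; the prim names and values are invented for illustration:

```usda
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
        color3f[] primvars:displayColor = [(0.8, 0.1, 0.1)]
    }
}
```

Because every connected application reads and writes this same scene description, edits made in one tool can appear live in another, which is what enables Omniverse's collaborative workflow.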
Like GTC 2021, this year’s GTC had a lot of announcements around Omniverse, including Omniverse Cloud, a suite of cloud services that gives artists, creators, designers and developers instant access to the NVIDIA Omniverse platform for 3D design collaboration and simulation from billions of devices.
NVIDIA said Nucleus Cloud is a one-click-to-collaborate sharing tool that helps artists edit large 3D scenes from anywhere without transferring massive datasets. Omniverse Create is built for technical designers, artists and creators to build 3D worlds in real time. Omniverse View lets non-technical users review Omniverse scenes, with full simulation and rendering capabilities streamed to them.