New ARCore Capabilities Announced At Google I/O 2021

Google’s augmented reality platform for developers, ARCore, has enabled compelling experiences across gaming, navigation, e-commerce and social media since its launch in 2018.

ARCore, used to develop AR experiences for Android, powers more than 850 million smartphones and is installed on over one billion devices globally. To enable seamless integration of the virtual and the real, Google has been building out the ARCore skill tree, empowering developers with foundational capabilities around realism, perception and asynchronous interaction.

On the second day of Google I/O 2021, Google announced new capabilities in ARCore. The session was hosted by Rajat Paharia, Product Lead of ARCore, and Jared Finder, Engineering Manager of ARCore.

Realism in AR 

The realism branch of the ARCore skill tree includes foundational AR capabilities around motion tracking, lighting, and depth.

  • Realism capabilities allow placing digital content in the physical world so convincingly that users cannot tell the difference between what’s real and what is not. 
  • Motion tracking enables virtual objects to stay in place even when the camera is moving. 

“In the last year, we made ARCore’s motion tracking even more robust, with an up to 36 percent decrease in CPU usage and a dramatic reduction in the number of tracking resets, in many cases eliminating them completely,” Rajat said. 

  • Lighting: ARCore’s environmental HDR understands the lighting in the scene so that virtual objects can look and behave exactly like real ones. 
  • Depth API: It uses a single standard smartphone camera to provide a depth map, making experiences more immersive. “Your virtual objects can now interact with the real world, rather than being stickers on the screen,” Rajat said. 

With the Depth API, ARCore supports particle effects, interactivity, occlusion and lighting effects.
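
The snippet below is a minimal Kotlin sketch of how the Depth API is typically enabled and read on Android, assuming an already-created ARCore Session and a per-frame update loop; the function names enableDepth and readDepth are illustrative, not SDK names.

```kotlin
// Minimal sketch, not a complete app: assumes a com.google.ar.core.Session
// already exists and a render loop supplies the current Frame each tick.
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

fun enableDepth(session: Session) {
    val config = session.config
    // Depth is only available on supported devices, so check before enabling.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.setDepthMode(Config.DepthMode.AUTOMATIC)
    }
    session.configure(config)
}

fun readDepth(frame: Frame) {
    try {
        // DEPTH16 image: each pixel encodes a distance from the camera in millimetres.
        frame.acquireDepthImage().use { depthImage ->
            val width = depthImage.width
            val height = depthImage.height
            // ...sample the buffer here to drive occlusion, physics or lighting effects...
        }
    } catch (e: NotYetAvailableException) {
        // Depth takes a few frames to become available after the session starts.
    }
}
```

Occlusion and similar effects are then driven by comparing this depth map against the depth of the virtual content at each pixel, so virtual objects appear behind real ones rather than pasted on top of them.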

Perception 

Perception enables an app to detect objects in a scene and helps the user augment them. It includes capabilities such as Instant Placement, Augmented Images, ML models, Augmented Faces, and ARCore’s latest capability, the Raw Depth API.

  • Instant Placement enables users to place virtual content without first having to pan and scan to find a plane. “With Instant Placement, we have seen a 16 percent increase in placement success, and 26 percent decrease in placement time,” Jared said. (A minimal hit-test sketch follows this list.)
  • The Augmented Images API tracks moving images, keeping the attached content locked to them.
  • For users who want to create face effects, ARCore offers the Augmented Faces API. It provides a high-quality, 468-point 3D mesh for adding effects, and is available on both Android and iOS.
  • ML models: ARCore allows developers to integrate their own ML models into AR experiences.
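
As a rough illustration of the Instant Placement flow above, the Kotlin sketch below enables the mode and runs a hit test at a tapped screen position. The 2-metre approximate distance and the helper names are assumptions for the example, not values from the announcement.

```kotlin
// Illustrative sketch only: assumes a running ARCore Session, a current Frame,
// and tap coordinates in pixels. Helper names are not SDK names.
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.InstantPlacementPoint
import com.google.ar.core.Session

fun enableInstantPlacement(session: Session) {
    val config = session.config
    config.setInstantPlacementMode(Config.InstantPlacementMode.LOCAL_Y_UP)
    session.configure(config)
}

fun placeImmediately(frame: Frame, tapX: Float, tapY: Float) {
    // The approximate distance (2 m here, an assumed value) seeds the initial
    // pose; ARCore refines it once real depth and plane data arrive.
    val hits = frame.hitTestInstantPlacement(tapX, tapY, 2.0f)
    val point = hits.firstOrNull()?.trackable as? InstantPlacementPoint ?: return
    // Attach virtual content to this anchor; its pose improves as tracking matures.
    val anchor = point.createAnchor(point.pose)
}
```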

Asynchronous interaction 

This branch enables developers to build AR experiences where users can interact with places and with each other, across space and time. The capabilities include Cloud Anchors, and Recording and Playback. 

  • Cloud Anchors allow users to annotate the world with AR content, building their own constantly evolving layers on top of the real world and creating location-based experiences. These annotations can be experienced on both Android and iOS. 
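
A minimal Kotlin sketch of the host-and-resolve flow behind Cloud Anchors, assuming the app is set up to use the ARCore Cloud Anchor service and already holds a tracked local Anchor; the helper functions are illustrative.

```kotlin
// Sketch only: error handling and UI omitted. Assumes Cloud Anchors are
// enabled for this app and a tracked `localAnchor` already exists.
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

fun enableCloudAnchors(session: Session) {
    val config = session.config
    config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED)
    session.configure(config)
}

// Hosting device: upload the local anchor, then share the returned cloud
// anchor ID with other users out of band (e.g. via your own backend).
fun host(session: Session, localAnchor: Anchor): Anchor =
    session.hostCloudAnchor(localAnchor)

// Resolving device: recreate the same anchor in its own session from the shared ID.
fun resolve(session: Session, cloudAnchorId: String): Anchor =
    session.resolveCloudAnchor(cloudAnchorId)

// Both calls return immediately; poll Anchor.getCloudAnchorState() each frame
// until it reports SUCCESS, then read getCloudAnchorId() on the hosting side.
```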

New releases 

Google announced the launch of the Raw Depth API, which enables developers to build more accurate ‘measurement, reconstruction and interaction apps’ by giving them access to a more detailed point-cloud representation than the standard Depth API. 
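
A hedged Kotlin sketch of what reading raw depth looks like, assuming ARCore 1.24 or later on a depth-capable device; the function names are illustrative.

```kotlin
// Sketch only: assumes a Session/Frame loop as in the standard depth example.
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

fun enableRawDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.RAW_DEPTH_ONLY)) {
        config.setDepthMode(Config.DepthMode.RAW_DEPTH_ONLY)
    }
    session.configure(config)
}

fun readRawDepth(frame: Frame) {
    try {
        frame.acquireRawDepthImage().use { depth ->
            frame.acquireRawDepthConfidenceImage().use { confidence ->
                // Raw depth is sparser and unsmoothed compared with the standard
                // depth map; the per-pixel confidence image (0-255) lets an app
                // keep only reliable points when building a point cloud or mesh.
            }
        }
    } catch (e: NotYetAvailableException) {
        // Raw depth is only delivered on frames where new sensor data is available.
    }
}
```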

The tech giant also announced the launch of the ARCore Recording and Playback API. “For your developer velocity, you can now record a video and then play it back through ARCore,” Jared said. 

He said users or businesses building an experience for a shopping mall need not go to the mall every time they want to test a change; they can record a visit once and test from the comfort of their own desk. Enterprises can also enable post-capture AR experiences for users, he added.
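
The Kotlin sketch below shows that record-once, replay-later flow, assuming a writable MP4 path (the "mall-visit.mp4" file name is hypothetical) and that the playback dataset is set while the session is paused, as the API requires.

```kotlin
// Sketch only: exception handling omitted.
import com.google.ar.core.RecordingConfig
import com.google.ar.core.Session
import java.io.File

fun startRecording(session: Session, outputDir: File) {
    val recordingConfig = RecordingConfig(session)
        .setMp4DatasetFilePath(File(outputDir, "mall-visit.mp4").absolutePath)
        .setAutoStopOnPause(true) // stop the recording cleanly when the session pauses
    session.startRecording(recordingConfig)
}

fun replay(session: Session, mp4Path: String) {
    // Must be set while the session is paused; after resume(), ARCore feeds
    // camera and sensor data from the file instead of the live hardware, so the
    // same AR code runs unchanged against the recorded visit.
    session.setPlaybackDataset(mp4Path)
    session.resume()
}
```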

Debolina Biswas
