Google has just unveiled the schedule for its upcoming I/O developer conference, promising sessions that dive deeper into the much-anticipated Android XR operating system. From what we can see, however, Google isn’t putting Android XR in the spotlight just yet.
Since its introduction last December alongside Samsung’s ‘Project Moohan’ mixed reality headset, Android XR has kept a low profile. Both are expected to launch publicly later this year, though no specific dates have been given.
Google has already teased a few features of Android XR, such as support for passthrough camera access, and has given developers access to the Android XR SDK. Yet the big question remains how it will compare with more established XR ecosystems like Meta’s Horizon OS and Apple’s visionOS.
The Google I/O event will feature several livestreamed keynotes on May 20th and 21st. However, only two developer talks specifically focused on Android XR are planned, and neither will be livestreamed. A ‘What’s New in Android’ livestream is nonetheless set to touch on Android XR.
Even without real-time broadcasted information, the developer-focused sessions indicate Google’s strategy to engage developers in integrating XR more deeply into the Android ecosystem, albeit quietly. Here’s a peek into what these talks will entail:
As Android XR’s public release approaches this year, Google is readying a new XR toolchain that merges Jetpack SceneCore and ARCore into the XR-tailored Jetpack XR. Now in developer preview, Jetpack XR is designed for developers building spatial layouts with 3D models and immersive environments on both mobile and large screens. Folding in ARCore signals Google’s move to streamline its spatial computing tools into a single, cohesive avenue for crafting both AR and VR experiences.
The talks will also cover how to bring XR features to existing apps, with elements like 3D models, hand-tracking, and stereoscopic video. This approach signifies Google’s ambition to extend Android XR beyond game development, syncing it with the wider Android app ecosystem.
Moreover, Google is set to bring Jetpack Compose, its fast-evolving UI toolkit, into the world of XR. This suggests an effort to standardize UI design across devices, from phones to tablets and XR, simplifying the adaptation of UIs for immersive environments.
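To give a sense of what that standardization could look like in practice, here is a minimal sketch of a spatialized UI in Jetpack Compose for XR. It is modeled on the developer-preview `androidx.xr.compose` APIs (`Subspace`, `SpatialPanel`, `SubspaceModifier`); these names and signatures are assumptions drawn from the preview and may well change before the public release, so treat this as illustrative rather than definitive.

```kotlin
// Sketch only: based on the Android XR developer preview; APIs may change.
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialGreeting() {
    // Subspace opens a 3D volume in which spatial composables can be placed.
    Subspace {
        // SpatialPanel hosts ordinary 2D Compose content on a floating panel,
        // so existing phone/tablet UI code can be reused largely as-is.
        SpatialPanel(
            modifier = SubspaceModifier.width(640.dp).height(480.dp)
        ) {
            Text("Hello from Android XR")
        }
    }
}
```

The appeal of this model is exactly what the talks seem aimed at: the `Text` content inside the panel is plain Jetpack Compose, so a large-screen app could, in principle, gain a spatial presentation by wrapping its existing screens rather than rewriting them.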
Interestingly, a second talk discusses upcoming AI features embedded in Android XR, hinting at future capabilities like real-time object recognition, scene understanding, or perhaps even AI-generated environments.
The fact that these sessions aren’t livestreamed may suggest Google is still holding back from making much noise about Android XR. Meanwhile, we’re keeping our ears open for updates on Samsung’s upcoming ‘Project Moohan’ headset, projected to be the first device to support Android XR.
Either way, we’re ready to catch the livestreams and share any exciting news that comes out of the technical talks.
Building differentiated apps for Android XR with 3D content
Dereck Bridié and Patrick Fuentes, both Developer Relations Engineers, will lead a session that introduces developers to Jetpack SceneCore and ARCore for Jetpack XR. They’ll demonstrate how to enhance existing apps with 3D models, stereoscopic video, and hand-tracking. This will provide developers with insight into new features available in the Android XR SDK developer preview, preparing them for Android XR’s public release later this year.
The future is now, with Compose and AI on Android XR
Cecilia Abadie, Senior Product Manager, and Jan Kleinert, Developer Relations Engineer, will explore the future of immersive tech with Android XR. They’ll reveal updates to the Android XR SDK Beta, launched at I/O, including advancements to Jetpack Compose for XR and new AI features. Attendees will learn to leverage their existing large-screen development work to extend into the exciting realm of Android XR.