The Android Show | XR Edition

Google’s Vision for a Unified XR Ecosystem

December 10, 2025


Google presented one of its most ambitious proposals for the future of spatial computing this week during The Android Show | XR Edition. In collaboration with Samsung and Qualcomm, the company outlined a clear path toward a cohesive, scalable XR ecosystem centred on deep integration between hardware, software and artificial intelligence.

For INVELON, where we work daily on immersive solutions for companies and public institutions, this announcement reinforces a clear message: extended reality is entering a phase of maturity in which widespread adoption is finally becoming viable.

Android XR: the new extended reality platform

The main news of the event centred on Android XR's new features and cross-platform approach, consolidating it as Google's unified operating system for extended reality devices. The goal is to ensure that headsets, smart glasses and intermediate form factors operate under a single framework, simplifying development, compatibility and user experience.

According to Google’s vision, the XR ecosystem will be divided into several categories:

  1. Headsets designed for immersive entertainment and advanced productivity, with high-quality image rendering and robust computational capabilities.
  2. Connected AR glasses that bring immersive experiences into any environment, offering increased portability through collaborations with companies like XREAL.
  3. Wireless AR glasses incorporating displays and maximising ergonomics, following examples such as Meta’s recent partnership with Ray-Ban.
  4. AI glasses for everyday use, lightweight and designed in collaboration with leading optical brands.

For enterprises and developers, this represents a decisive step toward a standard capable of consolidating spatial computing across multiple industries.

Galaxy XR: Samsung's first headset built for Android XR

Samsung announced new features for its first headset, Galaxy XR, a device designed from the ground up to integrate with Android XR and to leverage Gemini as its main assistant. The headset includes broad access to Google Play applications and introduces capabilities aimed at productivity and entertainment.

Highlighted features include:

PC Connect

Allows users to bring windows from their computer directly into the XR environment, opening new possibilities for work that requires extensive virtual screen space.

Likeness

A system that creates a hyper-realistic representation of the user for XR videoconferencing, adding expressiveness and natural presence beyond stylised avatars.

Travel Mode

Generates a stable XR space during travel — particularly useful on flights — enabling comfortable use even in situations with vibration or limited movement.

Auto-Spatialization (coming soon)

Real-time transformation of any 2D content into 3D. This could greatly accelerate spatial content adoption, as it removes the need for app-specific development.

Project Aura: lightweight XR for work anywhere

One of the most compelling demonstrations of the event was Project Aura, XREAL’s lightweight XR device combining optical transparency, gesture-based interaction and a highly portable form factor. It connects via cable to a compact external module containing the battery and computing unit, keeping the glasses light without compromising performance.

Project Aura enables:

  • Viewing multiple floating windows (browser, editing tools, messaging apps) with a 70° field of view.
  • Integrating Gemini as a contextual assistant within any visible application.
  • Extending a laptop screen into multiple spatial windows at the same time.

This approach opens the door to everyday XR use outside the home or office — an essential factor for large-scale adoption.

AI Glasses: the leap toward truly intelligent eyewear

Google also showcased the progress of its AI Glasses, developed in partnership with Warby Parker and Gentle Monster. Two variants are planned:

  • AI Glasses with camera, microphones and speakers, designed as a hands-free assistant.
  • Display AI Glasses, incorporating a microdisplay that surfaces private information in real time.

Key features demonstrated included:

  • Real-time conversation translation.
  • Object identification and contextual visual search.
  • On-the-fly image creation and editing using the Nano Banana model.
  • Photo capture and action execution via natural voice commands.

These capabilities position smart glasses as practical daily devices, distancing them from the “gadget” perception and bringing them closer to mainstream use.

A more mature SDK for XR developers

Google also introduced Developer Preview 3 of the Android XR SDK, which includes major improvements for building XR experiences and ensuring compatibility with AI glasses and hybrid devices.

Notable updates include:

ARCore Expansion

New geospatial capabilities, dynamic orientation and motion tracking to enable experiences that respond to user movement and the real environment.
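As a rough illustration of what these geospatial capabilities involve, the sketch below enables Geospatial mode and reads the camera's geospatial pose using the existing ARCore API (com.google.ar.core). The surface exposed through the Android XR SDK's ARCore integration may differ, so the exact names here are assumptions rather than the announced API.

```kotlin
// Minimal sketch: enable ARCore's Geospatial mode and read the camera's
// geospatial pose. Based on the classic com.google.ar.core API; the
// Android XR SDK's ARCore layer may expose this differently.
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun enableGeospatial(session: Session) {
    val config = session.config
    config.geospatialMode = Config.GeospatialMode.ENABLED
    session.configure(config)
}

fun logCameraGeospatialPose(session: Session) {
    val earth = session.earth ?: return                  // null until Geospatial mode is active
    if (earth.trackingState != TrackingState.TRACKING) return
    val pose = earth.cameraGeospatialPose                // latitude, longitude, altitude + accuracies
    println("lat=${pose.latitude}, lon=${pose.longitude}, ±${pose.horizontalAccuracy} m")
}
```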

Jetpack Glimmer

A new UI library specifically designed for transparent-display glasses, offering clear, unobtrusive visual components.

Jetpack Projected

Allows projecting a mobile app directly onto AI glasses while retaining access to native features like camera, audio and system controls.

XR Emulator for Android Studio

Includes configurable fields of view and XR scenarios for testing applications without physical hardware.
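To give a feel for what developing against the Android XR SDK looks like, here is a minimal sketch of a floating spatial panel using Jetpack Compose for XR, based on the androidx.xr.compose surface of the earlier developer previews; class and modifier names may have shifted in Developer Preview 3 and should be read as assumptions.

```kotlin
// Minimal sketch of a spatial window with Jetpack Compose for XR, following
// the androidx.xr.compose surface of earlier developer previews; names may
// differ in Developer Preview 3.
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun FloatingPanel() {
    // Subspace gives the app a 3D volume; SpatialPanel hosts ordinary 2D Compose UI inside it.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
                .movable()      // user can reposition the panel in space
                .resizable()    // user can resize it
        ) {
            Text("Hello, Android XR", modifier = Modifier.padding(24.dp))
        }
    }
}
```

Because the panel hosts ordinary Compose UI, the same content can also run as a conventional 2D window on non-spatial devices, which is the kind of cross-form-factor reuse the platform's multiplatform pitch is built on.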

Uber also demonstrated a practical use case to improve passenger pickup experiences: contextual directions in airports, quick access to the vehicle’s license plate and direct communication through the glasses.

What does this mean for the XR industry?

At INVELON, we identify three immediate impacts:
  1. XR ecosystem standardisation. Android XR could become the reference model for developers and organisations, mirroring the role of Android in mobile.
  2. Deep integration of AI into workflows. Gemini acts as a universal interface linking computer vision, natural conversation and action.
  3. New business opportunities. From spatial productivity to assisted navigation, remote support, enhanced tourism and immersive training.

The core message of the event is clear: XR is no longer a collection of experimental technologies but a new interaction layer that connects the digital and physical worlds.

Google, Samsung and Qualcomm are aligned in a shared vision: an XR ecosystem designed for everyday use, with mature development tools and AI at its core.

Contact us

Do you need more information about any of our solutions, or would you like us to advise you on how to integrate Industry 4.0 technologies into your company?