Meta has finally unveiled the long-awaited Passthrough Camera API for the Quest, offering developers direct access to the headset’s RGB passthrough cameras. This move promises a significant leap forward, enabling a new wave of immersive mixed reality experiences.
Previously, the Quest’s passthrough cameras were mostly off-limits, with developers having to make do with Meta’s standard features. Back in September at the Connect event, Meta hinted at releasing the Quest’s Passthrough Camera API, but details about the timeline remained hazy—until now.
In the Meta XR Core SDK v74, the company rolled out the Passthrough Camera API as a Public Experimental API. This provides developers access to the forward-facing RGB cameras on the Quest 3 and Quest 3S.
With this passthrough camera access, developers can enhance lighting and effects in their mixed reality applications. Moreover, they can now integrate machine learning and computer vision for tasks like detailed object recognition, making mixed reality content more intuitive and context-aware.
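Meta's announcement doesn't include sample code, but a small sketch can illustrate the kind of processing that raw camera frames enable. The example below is purely hypothetical and not part of Meta's API: it assumes an RGB frame has already been captured and handed over as a NumPy array, then estimates overall scene brightness, the sort of signal an app might use to match virtual lighting to the physical room.

```python
import numpy as np

def estimate_brightness(frame: np.ndarray) -> float:
    """Return the mean perceived brightness of an RGB frame in [0, 1].

    `frame` is an (H, W, 3) uint8 array -- a stand-in for a decoded
    passthrough camera frame; the capture step itself is out of scope.
    """
    # Rec. 709 luma weights for the R, G, B channels.
    weights = np.array([0.2126, 0.7152, 0.0722])
    luma = frame.astype(np.float32) @ weights  # per-pixel luminance, 0-255
    return float(luma.mean() / 255.0)

# A mid-gray frame yields a brightness near 0.5.
gray = np.full((480, 640, 3), 128, dtype=np.uint8)
print(round(estimate_brightness(gray), 2))  # 0.5
```

A real application would feed a value like this into its render pipeline to dim or tint virtual objects so they sit convincingly in the room; more ambitious uses, such as object recognition, would pass the same frames to a vision model instead.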
Mark Rabkin, a former Meta VP of VR/AR, highlighted last year that the Quest’s Passthrough API would pave the way for groundbreaking mixed reality experiences. He envisioned capabilities like tracked objects, advanced AI applications, eye-catching overlays, and nuanced scene understanding.
This release marks the debut of the API to the public. Prior to this, Meta had been testing early versions with key partners like Niantic Labs, Creature, and Resolution Games. These partners are currently showcasing their work at GDC 2025 during a Meta presentation titled ‘Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development’.
As it stands, the feature remains experimental, so developers can't yet publish applications built with the Passthrough Camera API. Meta appears to be refining the API through gradual releases before opening it up fully.
The SDK v74 update also introduces other notable features. Microgestures, which enable thumb-based interactions like taps and swipes, have been added. There's also an Immersive Debugger, which lets developers inspect the scene hierarchy directly within the headset, along with new tools for friends matchmaking and local matchmaking.