Meta has finally launched the long-awaited Passthrough Camera API for Quest, which gives developers direct access to the headset’s passthrough RGB cameras for better scene understanding, kickstarting the next generation of more immersive mixed reality experiences on Quest.
Until now, Quest’s passthrough cameras have been mostly locked down, limiting what developers could do beyond Meta’s built-in functionality. The company said back at Connect in September that it would eventually release Quest’s Passthrough Camera API, although it wasn’t certain when.
Now, with the Meta XR Core SDK v74, Meta has released the Passthrough Camera API as a Public Experimental API, providing access to the forward-facing RGB cameras on Quest 3 and Quest 3S.
Passthrough camera access will not only help developers improve lighting and effects in their mixed reality apps, but also let them apply machine learning and computer vision to the camera feed for things like detailed object recognition, making mixed reality content less of a guessing game about what’s in a user’s environment.
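For a rough sense of what that looks like in practice, here is a minimal Kotlin sketch assuming the passthrough cameras are surfaced through Android’s standard Camera2 stack; the camera selection, frame resolution, and permission handling below are illustrative assumptions, not confirmed details of Meta’s API:

```kotlin
// Illustrative sketch only: opening a headset RGB camera through Android's
// Camera2 API and streaming frames into an ImageReader for CPU-side
// processing (e.g. an object-recognition model).
import android.content.Context
import android.graphics.ImageFormat
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.media.ImageReader

fun openPassthroughCamera(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager

    // Assumption: once the app holds the required headset camera permission,
    // the passthrough RGB cameras appear in the regular Camera2 ID list.
    val cameraId = manager.cameraIdList.firstOrNull() ?: return

    // An ImageReader exposes raw frames so they can be handed to a computer
    // vision pipeline; 1280x960 is a placeholder resolution.
    val reader = ImageReader.newInstance(1280, 960, ImageFormat.YUV_420_888, 2)
    reader.setOnImageAvailableListener({ r ->
        r.acquireLatestImage()?.use { image ->
            // Pass the YUV planes to an ML model here.
        }
    }, null)

    // Requires the camera permission to have been granted beforehand.
    manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
        override fun onOpened(camera: CameraDevice) {
            // Next step: create a capture session targeting reader.surface
            // and issue a repeating capture request to start the stream.
        }
        override fun onDisconnected(camera: CameraDevice) = camera.close()
        override fun onError(camera: CameraDevice, error: Int) = camera.close()
    }, null)
}
```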
When it was announced last year, former Meta VP of VR/AR Mark Rabkin said the release of Quest’s Passthrough API would enable “all sorts of cutting-edge MR experiences,” including things like tracked objects, AI applications, “fancy” overlays, and scene understanding.
This marks the first time the API has been publicly available, although Meta previously released early builds to select partners, including Niantic Labs, Creature, and Resolution Games, which are presenting today at GDC 2025 in a Meta talk entitled ‘Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development’.
Granted, as an experimental feature, developers can’t publish apps built using the Passthrough Camera API just yet, though it’s likely Meta is again taking an iterative approach on the way to a full release.
The v74 release also includes Microgestures for intuitive thumb-based input (e.g., thumb taps and swipes), an Immersive Debugger that lets developers view and inspect the Scene Hierarchy directly within the headset, and new building blocks, such as friends matchmaking and local matchmaking.