At Meta’s flagship conference this week, the unveiling of the Orion prototype AR glasses stole the show; the company has reportedly spent nearly five years developing them. The announcement is a significant milestone because Meta intends to turn the prototype into a commercially viable product, which makes it worth paying attention to for reasons well beyond its compact size.
Beyond the high-level overview of the Orion headset, today we’re digging deeper with Norman Chan from Tested, who sat down with Meta CTO Andrew “Boz” Bosworth to explore the Orion project and its ambitious goals, and who goes deep on the headset’s technical details. Watch his full video below, or scroll further down for a summary of the technical details Chan gleaned from his demo and discussion:
Although Orion is not headed for mass production, Meta plans to build approximately 1,000 units for internal testing and development. At a reported cost of around $10,000 per prototype, that amounts to some $10 million worth of hardware.
The Orion glasses weigh just 98 grams, comfortably under the 100-gram threshold Meta says is needed for glasses to genuinely look and feel reasonably lightweight. For context, standard Ray-Ban Aviator sunglasses weigh approximately 30 grams, while Meta’s own Ray-Ban smartglasses come in around 50 grams. So while Orion passes as a pair of glasses, it still looks decidedly chunky.
And while 100 grams may not sound especially light, consider that Orion’s feature set approaches that of Meta’s Quest 3 headset, which weighs a hefty 515 grams.
With novel silicon carbide lenses and microLED projectors, Orion achieves a 70° diagonal field-of-view that is remarkable for its size. According to Meta, the displays emit thousands of nits of peak brightness at the source. They need to start that bright because the light travels through a complex and lossy optical path, and much of its intensity is lost along the way. By the time the light reaches your eyes, you’re seeing something closer to 300 to 400 nits.
While that’s slightly brighter than typical VR headsets, it still isn’t bright enough for outdoor use on a sunny day, which would likely require something on the order of 3,000 nits. To get there, Meta will probably need a more efficient optical path, brighter (and more power-hungry) light sources, or both.
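To put those numbers in perspective, here’s a rough back-of-the-envelope sketch in Python. The optical efficiency figure is purely an assumption for illustration; Meta hasn’t published how much light Orion’s optics actually lose between the projectors and the eye.

```python
# Rough illustration of brightness loss through an AR optical path.
# The efficiency figure below is an assumption for illustration only;
# Meta has not published Orion's actual optical efficiency.

source_nits = 5000        # assumed microLED projector output ("thousands of nits")
optical_efficiency = 0.07 # assumed fraction of light that survives the optical path

nits_at_eye = source_nits * optical_efficiency
print(f"Brightness at the eye: ~{nits_at_eye:.0f} nits")  # roughly 350 nits

# For comfortable outdoor use you'd want roughly 3,000 nits at the eye,
# which at the same assumed efficiency would demand a much brighter source:
target_outdoor_nits = 3000
required_source = target_outdoor_nits / optical_efficiency
print(f"Source brightness needed for outdoor use: ~{required_source:.0f} nits")
```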
According to Chan, the main Orion demo runs at an angular resolution of just 13 pixels per degree (PPD), a surprisingly low figure. AR glasses generally have a narrower field-of-view than their VR counterparts, which normally works in their favor: the available pixels are spread across a smaller area, yielding higher pixel density. Yet despite its 70° field-of-view, Orion’s pixels per degree comes in at roughly half that of Quest 3.
That said, Meta also showed Chan an Orion prototype running at 26 PPD, albeit at the cost of reduced image brightness. The company told Chan it is targeting 30 PPD by the time Orion ships as an actual product. Although that’s still well short of a ‘retina’ resolution of 60 PPD, it should be enough to make the headset useful for text-heavy tasks.
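Pixels per degree is simply the number of pixels spread across each degree of the field-of-view, so it’s easy to see what those targets imply for a 70° display. The figures below are illustrative arithmetic only, not Meta’s published panel specs:

```python
# What different pixels-per-degree (PPD) figures imply across a 70-degree field-of-view.
# Illustrative numbers only; Orion's actual panel resolution hasn't been disclosed.

fov_degrees = 70

for ppd in (13, 26, 30, 60):
    pixels_across = ppd * fov_degrees
    print(f"{ppd} PPD over {fov_degrees} degrees needs ~{pixels_across} pixels horizontally")
```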
One of the most fascinating details from Chan’s interview was how the Orion glasses handle eye-tracking.
As with most headsets, the approach involves shining an array of infrared LEDs at the eye, then using a camera to capture the reflections of those IR lights on the eye and working backwards from that image to figure out where the user is looking. But instead of placing the IR LEDs in a ring around the lens, Meta put tiny LEDs directly in the user’s field-of-view, right on the lens itself.
The wiring that powers the LEDs is intentionally randomized, looking something like a stray strand of hair on the lens, which makes it virtually undetectable to the wearer.
While a random pattern might sound like it would be more noticeable, a clearly defined pattern is actually far more likely to catch your eye, much as many optical illusions rely on strong, regular contrasts to grab attention. Between the randomized pattern, the incredibly fine wires, and how close they sit to the eye, Chan said everything was effectively invisible when looking through the lens.
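For the curious, here’s a minimal sketch of the general idea behind glint-based eye tracking: the camera sees both the pupil and the reflections (“glints”) of the IR LEDs on the cornea, and the offset between them shifts predictably as the eye rotates. This is a simplified, hypothetical illustration of that approach, not Meta’s actual algorithm, and the calibration numbers are made up:

```python
# Simplified pupil-to-glint gaze estimation, for illustration only.

import numpy as np

def gaze_from_glints(pupil_xy, glint_xys, calibration_matrix):
    """Estimate a gaze direction from one camera frame.

    pupil_xy           -- (x, y) pupil center in camera pixels
    glint_xys          -- list of (x, y) glint positions from the IR LEDs
    calibration_matrix -- 2x2 mapping from pixel offset to gaze angles,
                          learned during a per-user calibration step
    """
    glint_centroid = np.mean(np.asarray(glint_xys), axis=0)
    offset = np.asarray(pupil_xy) - glint_centroid   # pupil-to-glint vector
    yaw, pitch = calibration_matrix @ offset          # roughly linear near the center
    return yaw, pitch

# Example with made-up numbers:
calib = np.array([[0.12, 0.0],
                  [0.0, 0.12]])  # hypothetical degrees-per-pixel calibration
print(gaze_from_glints((330, 244), [(310, 236), (318, 252), (326, 238)], calib))
```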
The “compute puck,” which offloads some of the processing from the glasses, uses a custom Wi-Fi 6 protocol to communicate at distances of up to roughly 10 feet.
The custom protocol reportedly reduces both heat and power consumption by bursting data from the puck rather than streaming it continuously: outgoing data is aggregated over a short window and then transmitted all at once.
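As an illustration of that burst-versus-stream idea, here’s a small hypothetical sketch of buffering outgoing data for a fixed window and sending it in one go, so the radio spends less time active. The BurstSender class, its 10 ms window, and the stand-in send function are assumptions for demonstration, not Meta’s protocol:

```python
# Illustrative sketch of batching data into bursts instead of streaming continuously.

import time

class BurstSender:
    def __init__(self, send_fn, window_s=0.010):
        self.send_fn = send_fn        # whatever actually pushes bytes over the link
        self.window_s = window_s      # aggregation window (hypothetical 10 ms)
        self.buffer = bytearray()
        self.window_start = time.monotonic()

    def queue(self, data: bytes):
        """Buffer data; flush it as a single burst once the window elapses."""
        self.buffer += data
        if time.monotonic() - self.window_start >= self.window_s:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send_fn(bytes(self.buffer))  # one radio wake-up per window
            self.buffer.clear()
        self.window_start = time.monotonic()

# Example usage with a stand-in send function:
sender = BurstSender(send_fn=lambda payload: print(f"burst of {len(payload)} bytes"))
for frame in range(5):
    sender.queue(b"x" * 100)
    time.sleep(0.004)
sender.flush()
```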
While the puck reportedly offers “all day” battery life, the glasses themselves currently run for up to three hours, similar to what you’d expect from a standalone VR headset.
Unlike the research prototypes Meta has shown in the past, Orion isn’t just meant to offer a glimpse of the experience the company eventually hopes to deliver; it’s an early look at a product Meta says it is actively working to ship.
The company says it will work to shrink the glasses, improve the resolution, and bring the cost down. Meta expects the consumer version of Orion to become available by 2030, with a projected price tag of approximately $1,500.
There are plenty more details in Chan’s video that we haven’t covered here, so it’s well worth watching the whole thing.