Meta is developing its Aria Gen 2 smart glasses, which come packed with sensors and AI features
The smart glasses can track your gaze, movement, and even heart rate to gauge what's happening around you and your feelings about it
The smart glasses are currently being used to help researchers train robots and build better AI systems that could be incorporated into consumer smart glasses
The Ray-Ban Meta smart glasses are still relatively new, but Meta is already ramping up work with its new Aria Gen 2 smart glasses. Unlike the Ray-Bans, these smart glasses are only for research purposes, for now, but are packed with enough sensors, cameras, and processing power that it seems inevitable some of what Meta learns from them will be incorporated into future wearables.
Project Aria's research-grade tools, including the new smart glasses, are used by people working on computer vision, robotics, and related blends of contextual AI and neuroscience. The idea is for researchers to use the glasses to devise better methods for teaching machines to navigate, contextualize, and interact with the world.
The first Aria smart glasses came out in 2020. The Aria Gen 2s are far more advanced in hardware and software. They’re lighter, more accurate, pack more power, and look much more like glasses people wear in their regular lives, though you wouldn’t mistake them for a standard pair of spectacles.
The four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so the glasses can tell both how far your coffee mug is from your keyboard and where a drone's landing gear might be heading. And that's just the beginning of the sensory equipment: there's an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pad that can estimate your heart rate.
Future facewear
There's also plenty of eye-tracking technology that can tell where you're looking, when you blink, how your pupils change, and what you're focusing on. The glasses can even track your hands, measuring joint movement in a way that could help with training robots or learning gestures. Combined, all of this lets the glasses figure out what you're looking at, how you're holding an object, and whether what you're seeing is getting your heart rate up because of an emotional reaction. If you're holding an egg and see your sworn enemy, the AI might be able to figure out that you want to throw the egg at them, and help you aim it accurately.
As stated, these are research tools. They’re not for sale to consumers, and Meta hasn’t said if they ever will be. Researchers have to apply to get access, and the company is expected to start taking those applications later this year.
But the implications are far larger. Meta's plans for smart glasses go well beyond checking for messages. The company wants to capture how humans interact with the real world and use that to teach machines to do the same. Theoretically, those robots could look, listen, and interpret the world around them the way humans do.
It’s not going to happen tomorrow, but the Aria Gen 2 smart glasses prove it’s a lot closer than you might think. And it’s probably only a matter of time before some version of the Aria Gen 2 ends up for sale to the average person. You’ll have that powerful AI brain sitting on your face, remembering where you left your keys and sending a robot to pick them up for you.