One thing is clear in 2025: smart glasses are real. I’ve already been wearing them. Meta’s Ray-Bans not only look normal, but they’re successful: according to Counterpoint Research, over a million have already been sold.
That doesn’t mean smart glasses are the next iPhone (or even AirPods), but I’ve found them on my face a lot – and they’ve found their uses, too. The same has happened to me with watching movies on display glasses like the Xreal One. I’m ready to carry glasses fitted with my prescription in my bag wherever I go.
But will everyone else? This year’s CES showed off a handful of requisite smart glasses once again, many of them promising – no surprise – AI. Many of them look perfectly normal, or normal enough. And acceptance of smart glasses, or at least the way they look, is changing. Most people don’t know I’m wearing Meta Ray-Bans.
But that’s a concern, too. The man who killed fourteen people in New Orleans on New Year’s Day wore them leading up to his attack, according to the FBI. The glasses don’t do much more than phones already can – they record video, take photos, handle calls and music, and offer camera-connected AI services – but they’re the beginning of a wave of wearable devices that will have AI services always on and seeing what we see through their cameras.
At CES, the glasses on display looked more real and everyday than they ever have before. Halliday glasses look like something you’d pick up at LensCrafters, but they also have a tiny monochrome display perched above the frame that can show notifications or AI-delivered information as text. The circular display sits at the top of your field of view, and the glasses can do things like translate language in real time.
The RayNeo X3 Pro, a pair of full AR glasses, has cameras, hand tracking, and dual displays built into the clear lenses, in a smaller frame than the pair I wore last year. They work with a wrist-worn neural band made by Mudra that, much like the band for Meta’s prototype Orion glasses I wore a few months ago, can detect small finger movements and use them to control apps.
It’s all very futuristic, but a big part of the picture is still missing: better connection with our phones. Most smart glasses, like Meta’s Ray-Bans, still need to pair with a phone to work, just as smartwatches do. That pairing is the weakest link. With Ray-Bans, I can use Meta AI, play music, or sync photos I take, but the connection can drop…and the glasses can’t access or control anything else on my phone. They don’t feel as deeply linked as AirPods and the Apple Watch, or Google’s Pixel Buds and Pixel Watch. That’ll change, slowly, starting later this year.
Google’s Android XR, a planned framework for glasses and VR headsets to deeply link into phones and Google’s Gemini AI, could make these glasses work a lot more fluidly with Android phones. Demos of Google’s own smart glasses I tried in December had always-on AI modes and promised to connect with phones as well. Samsung’s mixed-reality headset can run Android apps.
Apple could and should do the same thing for iPhones, but nothing’s happened yet. The Apple Vision Pro, oddly enough, doesn’t pair directly with iPhones; instead, it shares common apps and cloud services. A pair of Apple glasses could have the same sort of deep phone hook-ins as the Apple Watch and AirPods, but that product is at best a far-off rumor right now.
Google’s taking small steps with Android XR this year, it seems. Samsung’s larger Vision-Pro-like headset will be the first Android XR device, and glasses will follow later. Smart glasses maker Xreal is one of Google’s first Android XR partners, but their most recent Xreal One glasses aren’t meant to be worn all day: they’re more like plug-in displays. Still, they may be among the first to be Android XR connected, along with Samsung’s own smart glasses.
Deep phone integration is what will make any of these glasses start to feel necessary instead of just a novelty. I love the Meta Ray-Bans, but I do not love Meta’s restricted relationship with my phone…or Meta’s social media policies, for that matter. There should be easier ways for glasses to pick and choose AI services on my phone, or to act as a peripheral the way earbuds and watches do. I have a feeling that’ll be more on the agenda in 2026 than 2025, though.
For now, though, these glasses really aren’t weird-looking anymore. Seriously. They look good! Now these glasses just need to work better with everything else, too.