With any new device category comes a whole host of novel and sometimes exhaustingly complex questions. Smartphones, for example, no matter how mundane they seem right now, are still nagging us with existential quandaries. When should we use them? How should we use them? What in God’s name happens to us when we use them, which, last I checked, is literally all the time?
These are important questions, and most of us, even if we’re not spending all day ruminating on them, tackle the complexity in our own way, setting (or resetting) social norms for ourselves and other people as we trudge along. The only thing is, in my experience, we tend to ask these questions mostly in retrospect, which is to say after the cat (or phone, or smartwatch, or earth-shattering portal into the online world) is out of the proverbial bag. It’s easy to look back and say, “That was the time we should have thought about this,” and when I put on Meta’s new smart glasses with a screen, I knew that the time, for smart glasses in particular, was now—like, right f**king now.
In case you missed it, Meta finally unveiled the Meta Ray-Ban Display, which are its first smart glasses with an in-lens display. I flew out to Meta headquarters for its annual Connect conference to try them, and the second I put them on, it was clear: these are going to be big. It probably seems silly from the outside to make a declaration like that. We have screens everywhere all the time—in our hands, on our wrists, and sometimes, regrettably, in our toasters. Why would smart glasses be any different? On one hand, I get that skepticism, but sometimes function isn’t the issue; it’s form. And when it comes to smart glasses, there is no other form like it.
Meta’s Ray-Ban Display aren’t just another wearable. The screen inside them opens up an entirely new universe of capabilities. With these smart glasses and Meta’s wild new “Neural Band,” a wristband that reads the electrical signals in your arm and translates them to inputs, you’re able to do a lot of the stuff you normally do on your phone. You can receive and write messages, watch Reels on Instagram, take voice calls and video calls, record video and take pictures, and get turn-by-turn navigation. You can even transcribe conversations that are happening in real time. You’re doing this on your face in a way that you’ve never done it before—discreetly and, from my experience, fairly fluidly.
Whatever boundaries remained between you and a device, Meta’s Ray-Ban Display close them to a gap that only an iPhone Air could slide through. It’s incredibly exciting in one way, because I can see Meta’s smart glasses being both useful and fun. The ability to swipe through a UI in front of my face by sliding my thumb around like some kind of computer cursor made of meat is wild and, at times, actually thrilling. While not everything works seamlessly yet, the door to smart glasses supremacy feels like it’s been swung wide open. You are going to want a pair of these smart glasses whether you know it or not. These are going to be popular, and as a result, potentially problematic.

We may have a solid grasp on where and when we’re supposed to use phones, but what happens when the “phone” in question becomes perfectly discreet, and the ability to use it becomes almost unnoticeable to those around us? When I use a smartphone, you can see me pick it up—you know there’s a device in my hand. When I use Meta’s Ray-Ban Display, however, there’s almost no indication. Yes, there’s a privacy light that tells people around you that a picture or video is being taken, but there’s also less than 2% light leakage through the lens, meaning you can’t tell when the screen inside the glasses is on. I certainly couldn’t tell when I watched others use them. It’s as ambient as any ambient computing I’ve witnessed so far.
I talked to Anshel Sag, a principal analyst at Moor Insights & Strategy who covers the wearable market, and he told me the privacy framework around technology like this is still in flux.
“We are still very much in the infancy of the smart glasses, AI wearable, and AR privacy and etiquette era,” he said. “I think that the reality is that having a wearable with a camera on your face is going to change things, and there are going to be places where these things are banned explicitly.”
Some of those environments, Sag said, are private areas like bathrooms or locker rooms, but it could extend beyond just places where you might catch a glimpse of someone naked. Driving, for example, is a major question. Meta’s Ray-Ban Display have navigation built in, and while the company tells me that the feature is designed for walking right now, it isn’t actually preventing anyone from using its smart glasses in the car. Instead, it will detect how fast you’re moving and show a warning before you do so. Other companies, like Amazon, don’t appear to have considered that navigating on smart glasses while driving could be a safety hazard at all. Early reports indicate that Amazon is plowing forward, making smart glasses specifically designed for its delivery drivers to use in a car.

While regulators like the NHTSA have issued warnings about people using VR headsets while driving (yes, people were actually doing that), the agency hasn’t, as far as I can tell, addressed the impact of smart glasses, which, especially if they become widespread, are far more likely to enter the equation behind the wheel. I reached out to the NHTSA for comment, but have not yet received a response.
Privacy concerns shouldn’t just stem from the form factor, either. You also have to think about the company that’s making the thing you’re wearing on your face all the time, and whether it has shown itself to be a good steward of your data and privacy. In Meta’s case? Well, without going into an entirely separate diatribe, I think it could do a lot better. And the other companies in hot pursuit of screen-clad glasses, like Google? They haven’t been much better.
And makers of smart glasses shouldn’t be surprised if, when these things wind up on people’s faces, they get some shit for it. Google Glass, which came out in 2013, may seem like a relic of a different age, and in a lot of ways it is (people’s expectations of privacy are almost nonexistent now), but we also haven’t had to confront pervasive camera-clad wearables in a long time, so who’s to say things have really changed? Sag says that while he expects some backlash, it may not be like the Glasshole days of yore.

“I think there will be some backlash, but I don’t think it’s gonna be as bad as Google Glass,” he says. “Google Glass had such an invasive appearance. You know, it didn’t really look normal, so it really caught people’s attention more. And I think that’s really what has made these glasses more successful, is that they’re just inherently less intrusive in terms of appearance.”
I may not be an industry analyst, but I agree with Sag. I’m not sure there really will be a category-ending backlash like we saw back in the Google Glass days, and a part of me doesn’t want there to be. As I mentioned, I got a chance to use Meta’s Ray-Ban Display, and they all but knocked my socks off. These are the smart glasses that anyone interested in the form factor has been waiting for. What I really want is to live in a world where we can all use them respectfully and responsibly, and one where the companies making them show us the same respect and responsibility in return. But in my experience, the only way to get to a more respectful, harmonious world is to try everything else first, and in this case, the first step might be your next pair of Ray-Bans.