A tale of two smart glasses: Echo Frames, Ray-Ban Stories are accessible wearables of the future


Amazon’s Alexa may be the doyenne of digital assistants, but she’s not my queen.

Alexa’s sheer talent, evidenced by the myriad integrations she has with everything from TVs to thermostats to microwaves and more, is a big reason why media and consumers alike anoint her the best digital assistant. In this writer’s tech life, however, Alexa plays only a supporting role. I use Alexa exclusively in the kitchen, where an Amazon Basics microwave and an Echo Wall Clock are paired with an old Echo Dot that previously gathered dust in my office. Being able to ask Alexa to turn on the microwave and set timers on the clock, with its big LED indicators, is a real accessibility perk for me when cooking; Alexa turns these cheap, mundane household products into genuine accessibility “hacks.” After all, it’s easy to forget that not everyone can easily use a microwave’s keypad.

That same trick has found its way into Amazon’s wearables, including its $270 Echo Frames glasses. Amazon reckons the glasses help you “save time so you can focus on what matters most” by leveraging Alexa to control your phone, your music, your smart home devices and more. The premise here, obviously, is that Amazon is pushing people to let Alexa be the center of their digital universe.

And therein lies the problem: Alexa has no sovereignty over my life.

Amazon shipped me a pair of Echo Frames (the sunglasses version) earlier this year, and I’ve spent the last few months wearing them. In terms of design and build quality, they’re on par with the microwave and the wall clock: unassuming and, in all honesty, unexciting, but perfectly useful for their intended purpose. In other words, they’re no Chanel or Gucci, but they still get the job done.

On a practical level, Echo Frames are far more interesting for what they represent from an accessibility standpoint than for what they can actually do. Conceptually speaking, their hands-free nature is an obvious boon for people with disabilities, particularly those for whom interacting with traditional screens is difficult or outright impossible. The ability to discreetly ask Alexa to play music or put a pound of butter on your shopping list goes beyond mere convenience; using your own voice to manage these tasks can be, for people with certain needs and tolerances, far more accessible than even conventional, motor-based accessibility software.

The same can be said for Meta’s Ray-Ban Stories, which do essentially the same thing as Echo Frames but with an obvious emphasis on Meta’s family of services. Meta sent me a pair of Stories for testing more recently, a couple of months ago, and I’ve found them to be pretty much identical to the Echo Frames in just about every way imaginable. They look more or less the same, do more or less the same jobs and are priced similarly. They’re so similar, in fact, that it’s sometimes been hard to tell which pair I picked up on my way out of the house. With Meta’s glasses, the big tell is the Ray-Ban logo on the temple.

Stories also have a camera you can take photos with, but I have yet to test the feature. Like Amazon, Meta positions the glasses as a hands-free way to interact with technology that helps you stay in the moment with other human beings. Again, it’s fascinating to consider the accessibility implications of these products: the convenience and fashion factors play second fiddle to questions of how people access technology. The hands-free nature of both devices, and particularly the fact that they’re worn on the face, opens up a world of discussion about how the next frontier of wearable technology will enable access for people with disabilities over the next decade. The possibilities in this space are tantalizing.

The bigger question is whether Echo Frames and Ray-Ban Stories are themselves accessible. As always with accessibility, the answer lies in one’s needs and tolerances, and in where one’s loyalties land. If you’re blind or have low vision, for example, it’s fair to wonder whether either pair really works without robust support from something like a screen reader. Being so focused on voice-first interaction is inclusive in many ways, but it can be exclusionary if your speech patterns stray from typicality. Similarly, neither pair is particularly pragmatic if you aren’t married to its ecosystem. The value proposition of Echo Frames rises dramatically if you’re all in on Alexa; as someone fully invested in Apple products, especially HomeKit, the fact that Alexa serves only a narrow role for me limits the Echo Frames’ appeal.

As an avid wearer of technology via the Apple Watch and AirPods, I’ve come to an interesting realization from using Echo Frames and Stories interchangeably. Perhaps one reason neither device holds much functional appeal for me is that I keep forgetting to charge them. To me, they’re just much more stylish versions of the cheap drugstore sunglasses I’ve worn for years. I don’t think of Echo Frames the way I think of my Apple Watch: as a computer that needs power to be useful. Echo Frames (and Stories) are just dumb sunglasses, something to keep the sun out of my eyes when I’m outdoors. I clearly haven’t internalized the idea of wearing a computer on my face, which is effectively what these devices are. Then again, I didn’t wear a watch before the Apple Watch, either.

All of this is to say that while Echo Frames and Stories have limited appeal to me (and surely to many other people) today, they both feel like potent harbingers of tomorrow. It’s still very early days for face-worn technology, so it’s hard to blame Amazon and Meta for merely dipping their toes in the water. Products like Echo Frames and Stories do a fine job of keeping the sun out of my eyes right now, but it’s not hard to imagine the day coming soon when they’ll do much more, and in an accessible way.

