
Ray-Ban Meta Smart Glasses Are a Game Changer for the Visually Impaired

  • Writer: Maria Cholakova
  • 1 day ago
  • 5 min read

The advancement of smart glasses marks a major stride in assistive tech, with world-changing benefits for blind and visually impaired individuals. Among these innovations, the Ray-Ban Meta smart glasses stand out. Today, we will not talk about the design or the software, but about the AI capabilities: the same ones that ease daily experiences and grant greater independence to those who need it.


Navigating Life with Meta Ray-Ban Glasses


Developed through a collaboration between Meta and EssilorLuxottica, the Ray-Ban Meta smart glasses integrate advanced features into a familiar eyewear form. The smart glasses have a 12MP ultra-wide camera, open-ear speakers, and five microphones. Following the most recent updates, the Ray-Ban Meta glasses now enable seamless, voice-driven interaction with the environment.


Let me tell you from personal experience: the overall functionality feels surprisingly natural. It is like having a discreet assistant right on your face, talking in your ear, without the people around you hearing it (unless they are way too close).



The embedded Meta AI assistant responds to voice commands so wearers can:


  • Identify objects and people in real-time

  • Read and translate text aloud

  • Get detailed descriptions of their surroundings

  • Set reminders and send messages via WhatsApp and Messenger


These features collectively transform the glasses into a versatile tool for navigating daily life.


The "Be My Eyes" Integration


The first noteworthy advancement was the "Be My Eyes" application. The feature connects users to a global network of sighted volunteers. How cool is that? By simply saying, "Hey Meta, Be My Eyes," users initiate a live video call, allowing volunteers to provide real-time assistance through the glasses' camera and speakers. 


The volunteer assistance is useful for tasks requiring human judgment or when AI responses may not suffice. The hands-free interaction is beneficial for navigating unfamiliar environments, reading labels or instructions, and identifying objects or obstacles.



New AI Capabilities for Recognition (Meta AI with Vision)


After the latest few updates, though, users also get built-in AI capabilities. There is no need for a live call anymore (even though "Be My Eyes" is still active). The new capabilities allow users to receive real-time information about their surroundings.


By simply asking, "Hey Meta, what's in front of me?" the glasses take a picture and can:


  • Identify objects

  • Read text

  • Provide contextual information without the need for additional applications


The functionality is part of the Meta AI with Vision feature. The glasses interpret visual inputs and respond to user queries about the environment. This is a game changer for individuals who are blind or visually impaired.


Both features (Meta AI with Vision and Be My Eyes) demonstrate the glasses' versatility in providing automated and human-assisted support, catering to a wide range of needs and preferences.



Practical Applications Enhancing Daily Life


The Ray-Ban Meta smart glasses go far beyond being a tech novelty. For visually impaired users, they are a functional companion that fosters independence, improves safety, and boosts confidence in everyday situations.


Navigating Public Spaces with Confidence


Using Meta AI with Vision, the glasses can describe surroundings, recognize objects, and offer real-time feedback.


For instance, asking “What’s in front of me?” reveals nearby obstacles, signage, or notable landmarks. The open-ear speakers deliver audio cues without blocking ambient sounds, which is crucial for detecting traffic or conversations. The glasses can also identify entrances, elevators, street signs, or even bus numbers.

Imagine walking into a building and asking the glasses, “Hey Meta, where am I?” — and hearing it respond with the location and nearby points of interest.

Shopping and Object Recognition


From product labels to packaging, the AI can read aloud text on boxes, price tags, and nutritional labels. It can also distinguish between similar objects, such as a can of beans and a can of soup. Good prompts are "Hey Meta, what am I holding?" or "Hey Meta, what am I looking at?".


Integration with Be My Eyes means users can call a volunteer when the AI isn’t sure, for example to verify milk expiry dates or locate a specific item on a shelf.


Reading Documents and Signs


Whether it’s a menu at a café, a letter in the mail, or a bathroom sign, the 12MP camera allows the AI to scan and read text aloud, both printed and handwritten. All with the simple command: "Hey Meta, read the text". The glasses can also translate foreign text instantly, making travel or navigating multilingual areas more accessible.




Identifying People and Expressions


For social interactions, the smart glasses can recognize familiar faces (if tagged via the Meta app). Future updates aim to support emotion recognition, which may allow users to better understand the tone or expression of the person speaking to them.

Such an update will probably be delayed in the EU due to data protection policies, but we EU users are still excited to get it sooner or later.

Hands-Free Communication


With full voice control, the wearer can send and receive WhatsApp and Messenger messages. With a simple command, for example, "Hey Meta, send a message to Julie on WhatsApp", you can make calls or record short videos to share updates. The hands-free approach is perfect for those who can't or prefer not to interact with a smartphone.


Real-Time Translation


Right now, real-time speech and sign translation supports English, Spanish, French, and Italian, so visually impaired travelers can navigate unfamiliar environments more easily. Asking “Translate this sign” or “What does that say?” lets the AI act as a personal interpreter, invaluable at airports, restaurants, or hotels.


Entertainment and Daily Routines


The Ray-Ban Meta glasses are more than just a camera with voice commands. The integration with apps like Spotify, Audible, and Amazon Music means users can stream music or audiobooks hands-free. Reminders and calendar alerts can be set via voice command, too, so users stay organized and on schedule. All it takes is a simple vocal prompt.



In Summary...


To be honest, it's exhilarating when technology is more than just a convenience. As the features improve, I'm excited to see how the Ray-Ban Meta and other similar lifestyle smart glasses cater to people who need vision assistance.


Smart wearable devices are a shift in accessibility. A shift in the right direction. People with limited vision can do tasks that once required human assistance or multiple tools. How exciting is that? Smart glasses empower users to participate in the world on their terms, and I'm pretty sure the best is yet to come.

This article is NOT sponsored. All the observations are based on my personal experience with the Ray-Ban Meta smart glasses. - Maria Cholakova
