Look and Tell Me This: Does This Feature Work for Users with Combined Hearing and Vision Loss?
Explore how Meta Ray-Ban smart glasses work for users with combined hearing and vision loss. Learn about the "look and tell" feature's accessibility limitations, audio routing challenges for hearing aid users, and potential improvements for braille users.

By Scott Davert, Lead Research and Training Specialist, Helen Keller National Center
May 12, 2025
Introduction
Meta Ray-Ban smart glasses are not new. For those with or without vision loss, they appeal because they allow the user to film short videos, make hands-free phone and other calls, dictate messages, and use many other features. Smart glasses aren't just for the sighted. With the latest generation of Meta Ray-Ban smart glasses, blind users are beginning to unlock new levels of independence and access, in part through a feature called "look and tell," which provides spoken descriptions of the photos taken by the glasses. From snapping pictures on the go to getting AI-powered descriptions of their surroundings, people are using these glasses to identify grocery items, read signs, capture notes in meetings, and even describe friends' outfits before a night out. This has already been well documented on social media and on others' blogs. This blog post explores what's possible, and what's not quite there yet, for blind users who wear hearing aids or who use braille as their primary access method. The suggestions provided here also have universal appeal for all users.
Where We Came From
In the past, it was always possible to go into your history and read the responses Meta AI had generated from your voice commands. What this meant was that your voice still had to do all the heavy lifting. If you wanted the glasses to look and tell you what they see, the only way to do this was to activate the feature with your voice. Before this update, it was not possible to ask Meta AI questions by text from your phone about the photos taken on your glasses. If you couldn't hear the speech and did not have a reliable way to speak to the glasses, you were cut off from these features; the only way to access the descriptions themselves was by looking at the items in your history. Several attempts have been made to work around this, such as messaging Meta AI through WhatsApp, but that requires a lot of extra setup, which I have found to be only somewhat stable.
For users of hearing aids or cochlear implant processors, it was once possible to route the audio from your Meta glasses to whatever sound source was paired with your phone. This meant you could have the glasses' audio come directly through your hearing aids or cochlear implant processors and did not have to depend on the internal speaker. On iOS, at least, this ability was taken away, and one must now depend on the internal speaker for access to this information.
Last week, Meta released a new app called Meta AI, which integrates the glasses more deeply into the Meta AI environment. Going forward, users will receive updates for the glasses through this new app instead of the Meta View app they had been using.
Where We Are Now
Currently, it has been my experience that the Meta Ray-Ban smart glasses always route your phone's audio to their internal speakers when active. This happens regardless of where your audio was set to go before you started using the glasses, and it means you cannot have the glasses' audio routed to earbuds or other devices. For example, if a user needs visual information in a place where it would be considered rude to make a lot of noise, it would be helpful not to have to use your voice to get that information. Since the audio cannot be routed to earbuds or other sound sources, others nearby may have to hear the output even when they have good reason not to be disturbed. Hearing aid and cochlear implant users should be aware that this will happen. It has been my experience that once this occurs, VoiceOver audio, music, and any other sounds will go through the glasses. In my own case, I can understand the speech through the internal speaker on days when my ears decide to cooperate and I'm in a quiet place. In a noisy space, however, the glasses are as usable to me as they are to those in countries where these features are not yet supported.
After the update to the new app, a few improvements have been made for those who use braille. The biggest change is that a braille user, or any user without speech, can ask Meta AI questions about the pictures that have been taken. However, there is still work to be done here, as it is not possible to activate the "look and tell" feature without using your voice. While you can take a photo by pressing the Capture button located on the right arm of the glasses, the photo will not show up in your gallery until after you have put the glasses on the charger. To interact with the picture via text, you must first activate the "look and tell" feature with a voice command.
If Meta created a setting that let the Capture button automatically launch "look and tell," the feature could then be accessed within the user's history with no voice required. This would also appeal to blind users who wish to interact silently with the photos captured on their glasses. While it's not hands-free (DeafBlind people must use our hands for most things), it would only require a quick press of a button to activate the "look and tell" feature, which the user could then interact with using braille on their connected mobile device.
Conclusion
To conclude this short review: for those with hearing loss, the Meta Ray-Ban smart glasses may not be ready for prime time. For those with more severe hearing loss, there is currently no way to route the audio from the glasses directly to other devices, so it may not be the time to consider such an investment.
I'm hopeful that Meta will take these suggestions into consideration, and I hope that others who feel the same will take the time to politely share their feedback. You can do so if you have the Meta AI app installed: shaking your phone will bring up an option to send feedback to the developers. Here's hoping Meta takes this idea into account, as it would be useful for any user of the Meta Ray-Ban glasses who wishes to use them more discreetly.