Helen Keller National Center

Using Technology to Increase the Independence of Older Adults with Combined Hearing and Vision Loss

Megan D.: Hello, and welcome to this training, Using Technology to Increase the Independence of Older Adults with Combined Hearing and Vision Loss by the Helen Keller National Center. My name is Megan Dausch and I’m the Accessibility Specialist for Helen Keller Services. I want to ensure that this training is as accessible as possible, so to start off, I will share some visual information about myself and this training format. On the screen, you see an image of me. I’m a white female in my thirties with shoulder length brown hair. In this image, I am seated at a table with a laptop in front of me and my hands are on a braille display. Next to my image is a PowerPoint. Throughout this training, we will include visual descriptions of images on the PowerPoint. As I or my colleague walk you through the accessibility options on your devices, screenshots will be displayed to assist visual learners. In this training, we plan to provide you with an introduction to how combined vision and hearing loss can impact the lives of older adults, and then introduce you to some adaptive technology solutions that may increase a senior’s independence and access to communication. I’m going to now turn it over to my colleague, Stacey. [End of Transcript]

Respectful Interactions

Stacey: Hi, my name is Stacey Sullivan. I’m the acting director of Information Research and Professional Development here at the Helen Keller National Center. I’ve been with the center on and off for more than 20 years in many different roles. I’m hearing and sighted, and I’m a white female in my fifties with long brown hair. I’m seated at a desk, speaking into a camera. And as Megan mentioned, there’s a PowerPoint on the screen, and this PowerPoint, if needed, is available upon request.

First, I’d like to talk a little bit about the diversity within the DeafBlind population. It’s essential that we understand the various experiences and challenges that a person may be facing to be able to meet their needs and provide the appropriate accommodations. To start with, we’re going to talk a little bit about the DeafBlind population. We use this term, DeafBlind, but not all people within the DeafBlind community are fully Deaf and fully blind. We typically break it down into four different categories: hard of hearing with low vision, hard of hearing and blind, deaf with low vision, and fully deaf and fully blind. Now, understanding the status of vision and/or hearing loss will help you identify the correct adaptive technology that will best meet the individual’s needs, but it will also help you identify the appropriate teaching strategies, communication methods, and accommodations that are going to be needed to ensure that the individual has access to the training. Some other things to consider are the age of onset at which the individual experienced the loss of vision and/or hearing. For example, if a person is born blind, typically their needs and preferences are very different from those of a person who loses their vision later in life. They have no visual memories, so when describing information, you want to keep that in mind. Let’s compare two individuals: one who was born blind and is now losing their hearing and wants to move from speech output to accessing the computer via braille, versus another individual who lost their vision later in life and is now learning software to access the computer via speech or braille output, but has that visual memory. The person who has some visual memory may benefit from a technique such as using tactile maps to describe the layout of the computer. 
That wouldn’t be beneficial for someone who was born blind, because they never experienced the computer visually, so you want to make sure that your instruction is provided in a way that makes sense to them based on their experiences. Also, the age of onset of the person’s hearing loss will play a major role in their communication method. A person who’s born deaf will typically be a native signer. Not always, but there’s more of a likelihood, and a person who loses their hearing later in life may learn sign language, but will probably not be as fluent as a native signer. This brings up an important point. For people who are born deaf and for whom ASL is their native language, English is typically their second language, and they may struggle with some grammar or spelling because English is an auditory language. So, please be aware of this, and don’t incorrectly assume that someone’s written English skills have anything to do with their level of intelligence. It’s just like someone who’s learning sign language later in life and is a bit awkward with their signs; you would never think that that person is less intelligent because they’re not a fluent signer. This is important to understand because someone who is a native English user may be comfortable communicating with you via text. Since they’ve lost their hearing and vision, they may not be able to communicate with you orally, so they may be comfortable, in an instructional setting, communicating back and forth via text on the computer. That may not be appropriate for a person whose native language is ASL. In that situation, you’d want to set up a certified sign language interpreter who’s familiar with the person’s communication method and the accommodations they may need to meet their needs, whether it be tactile sign language or any number of other accommodations. 
Ensuring access to communication is a tremendous topic and we can’t cover it fully today, but we do recommend that you check out HKNC’s website. We have online courses and a number of resources with in-depth information on the communication accommodations needed by individuals who are DeafBlind, and even on how to work with an interpreter. Another thing you want to consider is any other disabilities or issues that the individual may have. For example, does the older adult have arthritis that may make it difficult for them to access a standard keyboard? There are other keyboards that you could look into. Does the individual have numbness in their fingers from diabetic neuropathy that may impact their ability to access braille? So, you want to look at the whole person. Do they have memory or cognitive issues? If so, it’s important to select the appropriate device or software that matches their needs and that they’re able to use. You also want to identify what type of supports they have in their home area. Do they have a grandchild or a son or daughter that they can rely on to deal with issues that pop up on technology, as they always do? And you may want to think about creating accessible, simple resources that they could refer to, that walk them through a process, so that they could have that on file and use as needed. Be aware that people’s vision and/or hearing often fluctuate greatly from day to day, so what works today may not work tomorrow. This can be caused by internal issues, such as fatigue or illness, but it can also be caused by external issues. Maybe they’re dealing with glare and they can’t see the screen, or there’s a lot of background noise. So, be aware and consider the person at that time. Check in with them and make sure that the accommodations in place meet their needs at that given time.

Now, I just want to touch on some general practices and guidelines with regards to respectfully interacting with individuals with combined hearing and vision loss. First, when approaching an individual with a combined vision and hearing loss, it’s best practice to tap them on the shoulder and wait for them to visually or tactually locate you. Remain in one place. Don’t move around and have them searching for you. A person with a hearing loss can have problems locating sounds, so if you’re calling their name, they may have some difficulty identifying where the sound is coming from, and restricted visual fields will make it hard for them to locate you, so stay in one place. And the first thing you’re going to do is identify yourself. Never say “guess who.” It’s considered extremely rude and is very frustrating. So, let them know who you are. If the person has some residual hearing, you can say your name, but if the person’s deaf and doesn’t have enough vision to visually access you, you need to come up with a system for identifying yourself. And this applies even in situations where you’re using a sign language interpreter. You may have a sign language interpreter for your session, but you should have some way of identifying yourself and communicating some basic niceties to develop some kind of rapport directly with the individual. You can maybe have a name sign. Mine is the fingerspelled “S C” on my chest, over my heart. They could place their hand on yours and you could sign your name to let them know who you are, or you can set up some kind of signal. There’s also my haptic signal, which we won’t get into right now, but this is a signal that you place on their body. Mine is a claw-shaped hand touching and kind of scratching the person’s arm as it pulls away. That’s my name signal. This way, I can just go up to them and let them know right away who I am. 
So, you can set up whatever works for you, but I do highly encourage you to set up some kind of system where you can communicate with each other. It might be note writing. It might be using print on palm: you can print on the palm of their hand using block letters to spell out just, “Hi, how are you?” So, it’s important to come up with your own system.

Another thing you want to consider is providing visual environmental information. This includes identifying yourself and letting the person know when people are entering or leaving the room. If you’re used to working with a person with vision loss only, this may not be something that you have to think about, because that person compensates for their vision loss with their hearing. They can hear when people are entering or leaving the room. This is not the case for a person with a combined vision and hearing loss, so it’s extremely important to give them that information. Imagine talking to someone and not realizing that they’ve left the room, or having a private conversation and not realizing another person has joined the room and can overhear you. Make sure the individual is comfortable and has full access to the environment around them. This also includes providing a layout of the room, letting them know what the room looks like and the area that they’re going to be working with. You need to set aside time to allow them to explore, and this may mean tactually exploring, letting them explore the devices that they’re working on or the area that they’re going to be working in.

Which brings us to the next topic: using the hand-under-hand technique. This is very important. Often, you’re going to need to guide a person who’s DeafBlind to a device, a chair, or any kind of equipment. For a blind person, you might say, “oh, it’s right in front of you,” or “it’s to your left.” You can vocalize that, and you can do that in sign language as well, but often with people who have a combined vision and hearing loss, we use the hand-under-hand technique, where you place your hand under theirs and guide them to whatever the object is. It might be to the device. Never, ever take them by the hand and force them to the device. In a situation where I have my hand over a person’s wrist, I’m in control. You never want that to be the case. They should be in control. So my hands should be underneath theirs, just guiding them. They have the power to disconnect if they’d like to. That’s very important, and we want you to keep it in the back of your mind at all times.

Other things to consider: the need for patience, and knowing ahead of time that everything’s going to take longer. For a person who has a combined vision and hearing loss, everything just takes longer. So, you’re going to need to incorporate breaks, because the individual will experience fatigue, whether they’re relying on residual vision or residual hearing, which can cause fatigue, or they’re relying on tactile sign language, which can be extremely fatiguing. You’re also going to allot time for them to tactually explore, because remember, often a person with a combined vision and hearing loss can’t access an activity while accessing communication at the same time. For a blind person, you may be explaining what they’re doing while they’re on the computer practicing. That’s not going to be the case here, because a DeafBlind person is either going to be watching a sign language interpreter or using their hands to access a tactile interpreter, and therefore you need to set aside time for them to explore while you’re quiet. Let them work; then they’ll take their hands off, you can explain the next step, and they may need to go back on. So, be aware of this and set aside extra time.

Now, when communicating via speech with a consumer with combined hearing and vision loss, there are a few things we want you to remember. First of all, you want to think about your setting. You want to avoid any background noise, because that can be very detrimental to their ability to communicate, so find a location that’s quiet, but also one that’s well lit. If the individual is relying on residual vision for speech reading, they’re going to need a visual setting that’s accommodating as well. Also, you want to avoid shouting or over-exaggerating your speech. Sometimes people do that when someone has a hearing loss. Instead, speak in a normal tone. You want to be considerate about your pace, but not too slow. Check in with the individual and make sure that they’re hearing you. If they’re struggling with a particular word or phrase, don’t keep repeating the same word or phrase; come up with another way of saying it. If they continue to have problems, don’t let the person become frustrated. Maybe spell out a word using print on palm, or write it in large bold print on a piece of paper with a bold marker. Those are just some initial tips. Again, you can go to the HKNC website to find out more information on this topic, but now I’m going to turn it over to Megan, who’s going to talk a little bit about mobile devices and the accessibility features that are available for older adults with combined hearing and vision loss. Thank you. [End of Transcript]

Screen Readers and Magnification

Megan D.: Now I would like to talk about mobile devices. There are two major players in the mobile device field: iOS and Android. Both of these platforms have pros and cons. For the purposes of this presentation, we will focus mainly on iOS devices, as we have found they are the most accessible for people who are DeafBlind. The majority of people who we work with tend to prefer iOS devices due to their accessibility with braille and the ease of use of VoiceOver. When you are selecting a device, there are several things you should consider. Think about personal preference and comfort. It is very important that when choosing a device, you allow the consumer to choose what meets their needs best. It is important to test the devices and let the consumer put their hands on them to determine their comfort level. Another important consideration is accessibility needs. For example, if the consumer is using braille, we would tend to recommend an iOS device, because right now braille support is stronger on iOS. That does not mean that Android will not catch up in the future. Technology is always evolving. It is also important to allow the consumer, again, to test the devices. Some people feel that the VoiceOver gestures are easier to perform with their fingers, so it is important to make sure that whatever piece of technology you are choosing is something that the user will be comfortable using. Additionally, people who have some usable vision may prefer an Android device to an iOS device, but again it is important to allow the user to determine what will most benefit them. So you want to practice with Zoom on the iOS device, or expose them to magnification on the Android device, so that you’ll be able to determine what best meets their needs. Also consider how the individual will use the device. 
Is the person just looking for something to send a quick text message from every once in a while, or do they need full braille support? Do they need to use Google Assistant or Siri? So, think about what the intended purpose of the device is to help guide your decision. Both iOS and Android have screen readers. A screen reader is a piece of software that reads aloud what’s on the screen, and it will also allow the person to control the phone via a braille display. On iOS, the screen reader is called VoiceOver. On Android, the most commonly used screen reader is called TalkBack. In terms of visuals, iOS has a feature called Zoom, which allows one to magnify the text. There’s also a feature called “invert colors,” and on Android, the feature is called “magnification.” iPhone and Android devices also contain built-in voice assistants, which allow you to speak to them. On the iPhone, you have an assistant called Siri. You can talk to Siri and ask it things like, “what is the weather?” “What is the time?” You can interact with Siri through certain apps, so Siri can be very powerful. People who are newer technology users may benefit from the use of Siri, because for some people it can be a more simplified way of interacting. There are, of course, some limitations to Siri. You can’t do every single thing on your phone. For example, if you are booking an Uber, you will still need to interact with the phone to an extent. You can’t completely book an Uber using Siri. On Android, there is a feature called Google Assistant that is similar to Siri. Now, I would like to provide a little more detail about VoiceOver, the built-in screen reader on iOS devices. It is a screen reader that allows you to access the contents of the screen via voice or via braille. The way you use VoiceOver is by a set of gestures. 
You take one finger and you can flick it across the screen, and VoiceOver will read to you what is on the screen. For example, if you’re on your home screen, you can take one finger and flick from left to right, and VoiceOver will read you the names of the apps as you come across them. When you want to open an app, you can double tap. I personally use VoiceOver all the time. That is how I access my phone. I can read emails with VoiceOver, reply to emails, and navigate all of the apps on my phone. I primarily use VoiceOver with a braille display, so that I can read what is on my phone in braille without needing to listen to it. A braille display connects to the iPhone via Bluetooth. You can typically connect your braille display through Settings, then Accessibility, then VoiceOver, then Braille. This is, of course, where you would go to turn on any accessibility settings. Go to Settings and then Accessibility, and you will find all of your different settings there related to Zoom, VoiceOver, and braille. Really, anything to do with accessibility, including hearing aid settings and all manner of different things. It is the hub of all of your accessibility settings on your iPhone. Apple has many resources on getting started with VoiceOver. For example, you can go to Apple’s website and download the manual on how to use VoiceOver, and there are a number of videos available that actually show VoiceOver in use and demonstrate the different flicks and gestures, and also show VoiceOver being used with a braille display as well as a Bluetooth keyboard.
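For readers who think in code, the flick-and-double-tap model just described can be sketched as a tiny simulation. This is purely illustrative and is not Apple's implementation; the home-screen app names below are examples only:

```python
# Illustrative sketch of screen-reader-style navigation: a flick moves
# the focus to the next or previous on-screen item, and a double tap
# activates whatever item currently has focus.

class ScreenReaderSim:
    def __init__(self, items):
        self.items = items      # on-screen elements, in reading order
        self.focus = 0          # index of the currently focused item

    def flick_right(self):
        """Move focus to the next item and return what would be announced."""
        if self.focus < len(self.items) - 1:
            self.focus += 1
        return self.items[self.focus]

    def flick_left(self):
        """Move focus to the previous item and return the announcement."""
        if self.focus > 0:
            self.focus -= 1
        return self.items[self.focus]

    def double_tap(self):
        """Activate (open) whichever item has focus."""
        return f"Opening {self.items[self.focus]}"

home = ScreenReaderSim(["Messages", "FaceTime", "Notes", "Settings"])
print(home.flick_right())   # focus moves from "Messages" to "FaceTime"
print(home.double_tap())    # activates the focused item
```

The key idea the sketch captures is that navigation is sequential and focus-based: the user never has to see where an icon is on the screen, because a flick always lands on the next element in reading order.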

Now, I will turn it over to my colleague, Megan Conway, who will discuss Zoom and other features for visual users.

Stacey: Video image description. Megan is a white, DeafBlind woman in her fifties with short brown hair. She’s wearing glasses and behind the ear hearing aids. Megan is seated at a desk speaking into the camera with an iPhone in her hand.

Megan C.: I wanted to talk about some of the display options on the iPhone. One great tool is Zoom, which you can find under the accessibility menu. Zoom allows you to enlarge the entire iPhone or iPad screen. You can zoom in and out by touching the Zoom icon. You can also change the Zoom region, so you can magnify just a small portion of the screen or the entire screen. You can also make some color filter adjustments to make the screen easier to see.

Stacey: And here we have a screenshot of an iPhone in which the filter adjustments were set to inverted colors, so the background of the screen is black and the text is white.

Megan C.: So Zoom is a great tool for basically zooming in on the part of the screen that you want to see more clearly, but there are also some other really nice adjustments to the display, the font, and so forth that you can make, also under “accessibility.” For general display modifications, the one that I use the most is called “invert colors,” and that essentially switches the color scheme so that you have a dark background with light text. A lot of folks find that to be very helpful. You can make all of the text bold, and you can make other kinds of color adjustments. If you’re color blind, you can set it up so that there is no color differentiation. You can also change the icons so that, while not everything on the screen is magnified, the icons themselves are bolder, clearer, and larger. It’s really a good idea to actually just go in and play around with some of the display modifications. It’s a very individual thing, what’s going to help one person or another, so I’ve found that just playing around and turning features off and on helps me find the best viewing situation. [End of Transcript]

Accessible Apps

Stacey: Megan Dausch speaking.

Megan D.: We’re going to now talk about various apps that can help people who are DeafBlind. We will talk about apps for communication, and we will also talk about apps for daily living, as well as speech recognition apps. Technology has really opened up communication for people who are DeafBlind. The first app I would like to talk about is the Messages app. Text messages are very popular. People are really comfortable now communicating via text messages, but people who are DeafBlind can use these to their advantage. Text messages are a fully accessible way for people to communicate. You can text your family and friends and let them know where you are or have full conversations with them. Text messages are fully accessible via braille display. Another handy feature of text messages is that you can share your location with family and friends using the messages app, so you can quickly text them your current location, if you want some assistance, or just would like to share where you are in space.

Stacey: Here we have an image of a young man seated at a table next to a DeafBlind woman. The young man is texting the woman on his iPhone and the woman is smiling while she reads his message on her portable braille display, which is connected via Bluetooth to her iPhone.

Megan D.: Another useful app is the FaceTime app. Again, FaceTime is popular for many, many people across the globe, but FaceTime really benefits people who are DeafBlind as well. You can use FaceTime to sign with one another, because it’s a video call, so you can actually see what other people are doing and sign back and forth. You can also use FaceTime to see people’s lips moving, which gives a visual component to the audio of the call. Additionally, for many people who do have some residual hearing, FaceTime audio can be clearer than the regular cellular connection, which can increase access to communication. Another very useful feature of FaceTime is that you can get visual assistance from family or friends. For example, I sometimes use FaceTime if I want to get visual information about what’s around me, or maybe about something that I am cooking. I will simply FaceTime a family member or friend and ask for information about my surroundings. Another useful communication app is the built-in iOS Notes app. Being able to write notes can facilitate face-to-face communication. A person who is DeafBlind can take their Bluetooth keyboard or their braille display and use it with the Notes app. For face-to-face communication, you can hand a trusted person your iPhone and type in braille, or you can hand them a Bluetooth keyboard, have them type back and forth with you, and read their messages on your braille display. So, just a simple app like Notes can be an option for improving face-to-face communication.

Stacey: And here we have a screenshot of the notes app with a note saying, “Hi, Liz, would you be able to drive me home today?”

Megan D.: Another app that can be very useful for use in the community is an app called BuzzCards. BuzzCards can allow you to communicate with a cashier or someone in the community who’s taking your order at a restaurant. For example, you can put your Starbucks order in the app and show the card, the virtual card, to the barista or whoever is taking your order, and they will know what it says. You can create cards and place them in different categories. For example, you can have a category for dining and have your orders planned out, and you can have a category for transportation if you would like to write out, maybe, the bus you’re looking for or other information that you may need for transportation. So, the BuzzCards app allows you to have pre-created cards on your phone that you can show to people in the community who might need to assist you.

Stacey: Here we have a screenshot of the BuzzCards app. The background of the screen is yellow with large bold black text. The note on the screen reads, “large coffee with cream and sugar, please.” On the second screenshot, we have a list of templates that you can select from, including categories such as, the coffee shop, fast food restaurant, or requesting assistance locating places such as a restroom or a bus stop.
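The card-and-category structure described above can be sketched as a small data structure. This is an illustrative sketch only; the category names and card text below are made-up examples, not BuzzCards' actual data or API:

```python
# Sketch of the BuzzCards idea: pre-written messages organized by
# category, looked up and displayed full-screen when needed.

cards = {
    "dining": {
        "coffee": "Large coffee with cream and sugar, please.",
        "water": "A glass of water, please.",
    },
    "transportation": {
        "bus": "Can you point me to the bus stop, please?",
    },
}

def show_card(category, name):
    """Return the card text to display to a cashier, barista, or driver."""
    card = cards.get(category, {}).get(name)
    return card if card else "Card not found."

print(show_card("dining", "coffee"))
```

The point of the structure is that everything is prepared ahead of time: in the moment, the user only navigates to a category and taps a card, with no typing required.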

Megan D.: Now, we will go to my colleague, Megan Conway, who will discuss speech recognition apps.

Megan C.: I’d like to discuss another fun tool that I use, which is speech-to-text recognition apps on the iPhone. These programs will recognize speech and then transcribe it into text for the user on your device. Some of these apps are designed for multiple people, so that each person needs the app on their own device in order to communicate. Others are designed for a single user: you can, say, put your device on a table, somebody talks, and then you can see what they are saying on your device. The one that I like to use is called My Ear, and it’s a pretty simple app. So, for example, I might go into a Starbucks where it’s very noisy, I can’t see the menu, and I can’t lip read. If I were an ASL user, obviously that would be a challenge. So, what I can do is put my iPhone down on the counter, and when the person at the counter says, “what can I get you today?” that would be picked up by my iPhone microphone, and it would read, “what can I get you today?” If I could speak, I could respond with speech, but the app also has a typing capability, so I could type, “I want a double latte,” kind of tip the phone up and show that to them, and then we could have a conversation back and forth that way. One thing that I do like about My Ear is that you can adjust the font to make it easier to read: you can make the text larger, and you can also adjust the background color and the font color, so that’s one reason that I like to use that particular application.
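The single-device, back-and-forth flow just described can be sketched in a few lines. This is a conceptual sketch only: the transcription step is simulated with a stand-in function, since a real app like My Ear relies on the phone's built-in speech engine:

```python
# Sketch of a single-device speech-to-text conversation: one side's
# speech is transcribed to on-screen text, and the user replies by
# typing and tipping the phone toward the other person.

def transcribe(audio_text):
    # Stand-in for real speech recognition; assume it returns the words.
    return audio_text

conversation = []  # running transcript shown on screen

def hear(spoken):
    """Other person speaks; the app appends the transcribed line."""
    line = ("Them", transcribe(spoken))
    conversation.append(line)
    return line

def type_reply(text):
    """User types a reply, which is appended and shown to the other person."""
    line = ("Me", text)
    conversation.append(line)
    return line

hear("What can I get you today?")
type_reply("I want a double latte.")
for speaker, text in conversation:
    print(f"{speaker}: {text}")
```

Keeping both sides in one running transcript, as the sketch does, is what lets the conversation flow on a single device without either person needing to install anything.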

Stacey: And here we have a screenshot of the settings page on the My Ear app. The options include auto scroll, split by lines, text size, text color, and speech. Here, the male voice is selected. And now we go back to Megan Dausch speaking.

Megan D.: Now, we will speak about apps for daily living. The iPhone has dramatically increased the availability of information, and with the camera you can use a lot of different apps to get information that can help you in your day-to-day life. One of the most popular apps for daily living is an app called Seeing AI, and Seeing AI is kind of like a multi-tool because it has so many features. This app will allow you to complete various tasks that you might encounter on a day-to-day basis. For example, you can use the app to take a picture of your mail and sort it. This is the app that I personally use nearly every day to sort my mail.

Stacey: Here’s an image of a person using the Seeing AI app to identify a piece of mail. The iPhone camera is held over the address on an envelope. At the bottom of the screen are options for “short text”, which is highlighted. Other options seen on the screen include, “document”, “product” and “person”.

Megan D.: You can also use this app to take a picture of currency. For example, if you have a $1 bill or a $5 bill and you don’t know what it is, you can lay the bill flat on a surface and hold the camera above it, and the app will tell you the denomination of the bill. You can also, of course, get the same information from a braille display: if a braille display is connected to the iPhone, you will be able to feel the denomination of the bill on it. There is a feature called “document mode” in Seeing AI, which allows you to take a picture of a document and read it with your braille display or your VoiceOver speech. There is also a feature called “short text” that kind of lets you get the gist of a document. You hold the phone above the document and it will begin reading whatever is in the camera view. Another fun feature of the Seeing AI app is the “person mode,” which allows you to take a picture of a person and get a very basic description of what the person looks like. Do note, though, that this is really not to be relied upon. It will tell you an approximate age of a person, but obviously computer estimates can be very, very off. It will also tell you at times what color hair the person might have. For example, it might say, “35 year old woman with brown hair looking happy” as a description of a person. You can also save the photo from the person channel for future recognition, so if you are in a room with this person again and you move your phone around, and the phone recognizes the person that you have already taken a picture of, it will let you know that that person is there. Another channel that the Seeing AI app offers is called the “product channel.” The product channel allows you to scan a barcode. As you know, many boxes, cans, and products have barcodes on them. 
So, you take your phone and you move it until it scans the barcode, and then, if the barcode is in the Seeing AI database, it will tell you what the product is. In my experience, this can take some practice, because barcodes are in different locations on different products, so sometimes finding the barcode can be a challenge. However, with practice, this is a very useful app for people who might have products in their kitchens that they’d like to scan to obtain directions from the packages. Another channel that the Seeing AI app includes is a color channel. As of this recording, and in my opinion, the color channel is not always accurate, because it is very dependent upon the lighting in your surroundings. However, it is worth a try and may improve. There is another app, called Boop Light Detector, that will vibrate when the light is on. Seeing AI also has a light detection feature, but it plays an audible beep when the light is on, whereas Boop Light Detector also gives you a vibration. So, if you cannot hear the beep, Boop Light Detector would possibly be a better choice, because it has a very strong vibration feature. This is very useful if you cannot see whether the light is on. I personally find apps with light detection capabilities very useful, because if I forget to turn the lights on or off, it is a good way for me to know what their current state is. Another app for daily living is called Way Around. Way Around is an app that allows you to affix tags to different items around your home. You have to purchase the tags from the company Way Around, but once you purchase the tags, you can use the app to type in a label of what the product is. 
Then you affix the tag to the product or piece of clothing, and when you encounter that item again, you hold your phone near the tag and the phone will read you the label you have already recorded in the app. Other apps for independent living include Be My Eyes and AIRA. These are similar services: both apps work by allowing you to share your video with a sighted person on the other end of the call. Essentially, they are like FaceTime calls, but not through the FaceTime app. Be My Eyes is a network of volunteers who are readily available to help you. You open Be My Eyes, make a call, and there is a volunteer waiting at the other end who will see whatever you show them through the phone’s camera. This is a free service, and as with anything on the internet, it is good to be careful. These are volunteers, so it is important to think about what information you’re sharing and to use your best judgment when sharing information with them.

Stacey: Here, we have a screenshot of an advertisement for the Be My Eyes app, shown in use. At the top of the screen, it reads, “Lend your eyes to the blind and visually impaired.” Below that is an iPhone with an image of a blue button-down shirt and text reading, “What color is this?”

Megan D.: The service also offers direct connections to the accessibility teams at companies such as Microsoft and Google, so you can sometimes get specialized help through Be My Eyes. Be My Eyes is always bringing new companies on board, so it’s worth a look. AIRA is very similar: again, a person can watch what you are showing through your camera via a video connection. AIRA is a paid service. As of this recording, you can have five minutes for free each day, but that may change. If you would like to research it further, you can do so by going to AIRA’s website.

Stacey: Here, we have a screenshot of the AIRA app in use. There are several pictures on the left side of the screen with written descriptions on the right side of the screen. The first reads, “living room with some light coming through the window.”

Megan D.: To use these services, Be My Eyes and AIRA, you really do need some usable hearing, because you mostly communicate with the agents via voice. Be My Eyes does not currently have a text interface; it is completely audio only. AIRA does offer some text functionality, so texting with an agent may be an option there, and these options may well evolve. Apps are always changing, so what is relevant today may not be relevant tomorrow. The app landscape is always evolving: some apps get pulled from the app store because their companies go out of business, and new apps get added, which is really exciting. So, always explore what’s out there, because what I shared with you today may not be relevant in a month, we don’t know. One really great resource for exploring apps is a website called AppleVis.com. You can go there to find reviews of apps and become part of the community yourself. [End of Transcript]

Smart Home Technology

Stacey: This is Stacey speaking. Now we’d like to provide you with a brief introduction to smart home technology. Smart home technology has opened up a world of opportunity for seniors who want to stay in their homes or live more independently in whatever current living arrangement they have. This technology allows people to control items and functions around the house with a simple push of a button or a voice command, and for those who do not use voice, most of the technology can be controlled via a typed message on a person’s mobile device, which of course can be accessed via a braille display. Some devices, like smart light bulbs and light switches, as well as smart plugs that can connect to small appliances like a toaster oven, are simple and relatively inexpensive. Others, like full home systems, may require a more serious investment of time and money. But smart home technology does not mean that the person has to be very tech savvy. They may need assistance setting up the device or the system, but once that’s completed, the senior can typically control the device using voice or text. Many of these systems are configured to work with one of the voice-activated smart home systems, like the Amazon Echo or Apple HomeKit. There are smart devices to ensure safety, to assist with daily tasks, and to allow a caregiver to monitor and check in with the senior when needed. The individual or the caregiver can double-check to make sure that appliances are turned off, that the doors are locked, that the windows are closed, and so much more. I’m going to now turn it over to Megan Dausch, who’ll share some of her experiences using smart home technology in her home.

Megan D.: Now, I would like to tell you about some of the smart home technology I use in my own home. I use three main kinds: lights that I can control with my Alexa and my phone, a thermostat, and a Ring doorbell. The thermostat I use allows me to control it from an assistant, such as my Google Home or my Amazon Echo (Alexa) device. I can use it to set the temperature in my home, and I can also check the current temperature using the app on my iPhone or my Alexa or Google device. I can simply ask a device, “What is the thermostat set to?” and it will tell me, and I can also tell it to set the thermostat to 70 degrees if I want. The thermostat can also be controlled remotely, so when I’m out and about, I can adjust the temperature in my home from my phone. And if you’re working with someone who is new to technology and you want to assist them with their thermostat, you can do that with them, or even for them, remotely. I also have smart switches plugged into certain lamps in my home that work with both the Google and Alexa assistants. I can tell the assistant to turn on a specific lamp, such as the one in my living room, and it will turn it on for me. It’s really convenient if you forget to turn on the lights, and it also adds security, because I can set the lights to go on and off at certain times and control that with my phone and my assistants as well. The next smart home device I have is a Ring doorbell, which lets me know when someone is at my door, because I get a notification on my phone when someone rings the doorbell. I can also get a notification on my Alexa or Google device.
Additionally, this device can allow others to look at your doorbell remotely, if you have given them access. So, if someone approaches your door and you want to know who it is, or you need to go back and look at a previous video, you can have someone go through the recordings with you to see who visited your home, who rang the doorbell, or who might have dropped off a package. In that way, it can also provide an extra sense of security. I also sometimes use it to speak to people through the phone: when someone rings your doorbell, you can stay inside your home and talk to them through the Ring doorbell app on your iPhone, and they can hear you and you can hear them. So, those are some of the smart home technologies that I use. There’s so much potential with smart home technology, and it will be really interesting to see where it goes.

Stacey: Thank you. We hope you’ve enjoyed this presentation. If you have any questions or are looking for any more information, please contact us at PLD@hknc.org. You can also go to our website, www.HelenKeller.org/HKNC. Thank you. [End of Transcript]

Additional Resources