Video Helen Keller National Center Communication

Haptic Communication to Facilitate Braille Instruction for DeafBlind Adults

This presentation describes a research study designed to demonstrate the effectiveness of using Haptics during braille instruction.

Two women engaging in tactile sign at a table

Video image description: a PowerPoint is on the screen, a sign language interpreter is to the right of the PowerPoint. An image of the presenter speaking at the time is located in the lower corner of the screen.

Hello, and welcome to Haptic Communication to Facilitate Braille Instruction for DeafBlind Adults. My name is Peggy Costello, and I’m the supervisor of the Communication Learning Center at the Helen Keller National Center. I’m joined by Megan Conway, a research and accessibility specialist at the Helen Keller National Center. We’d also like to mention Deborah Harlin, the Director of the Information, Research and Professional Development department, or IRPD, at the Helen Keller National Center, although she’s not joining us today.

So first, an overview of what we’ll be talking about today. We’d like to give a description of Haptic Communication, or “Haptics,” tell you a little bit about the origins and benefits of Haptics, and then share some information on the research on Haptics and braille instruction that we’ve been working on as a project at Helen Keller. It all started from a wish that we could communicate with our DeafBlind students who are tactile signers during braille class, without them having to take their hands off the braille. Then we’re going to talk about implications for research and practice, and we’re going to go to the next slide.

Next we’d like to give a description of Haptics. Haptics is a standardized system for providing visual information and social feedback via touch signals on the body. Someone who is DeafBlind is not getting a lot of environmental information; for example, they might miss it if somebody enters or leaves the room, or they might not be sure if someone is raising their hand during a meeting. Haptics are a way to provide that environmental information as well as social feedback. By that we mean that someone who’s DeafBlind might not see facial expressions. They might not realize that someone is laughing at a story they’re telling, that someone is crying, or that someone is following the conversation, maybe nodding their head and saying, “uh-huh, uh-huh.” Giving someone who’s DeafBlind that social feedback during communication is important. Haptic signals are specifically designed to be received on the body. Haptics do not replace sign language or spoken language; sign language was developed to be received visually. Many DeafBlind adults who are American Sign Language (ASL) users do receive sign language tactually. And just a word about tactile sign language: with tactile sign language, the signer is signing with his or her own hands, and the DeafBlind person places his or her hands over the signer’s hands and follows the conversation. The signer does not take the hands of the DeafBlind person and make the signs, and does not produce signs on the person’s body. With tactile sign language, there is slight modification for signs to be received in the hand. Sometimes that’s related to pace; some folks might need a little extra time to process the sign language tactually. Going to move on to the next slide.

In this next slide, we’re going to talk about the places of articulation: the five places on the body where Haptics are presented. In Haptics, you’ll hear about a provider and a receiver; the provider is the person giving the haptic signal, and the receiver is the person receiving it. On this slide, we have pictures of the places of articulation where haptic signals are provided. In the first photo, the provider is using an index finger to give a haptic signal on a woman’s back. In the next photo, the provider is again using an index finger to provide the haptic signal on a woman’s upper arm. In the third photo, someone is using a flat hand to present a haptic signal on someone’s leg, right around the knee area. In the fourth photo, the pointer and middle fingers are used to present a haptic signal on the back of the hand, and in the fifth photo, the provider is using his foot to tap the receiver’s foot to provide a haptic signal. Now we’re going to go on to the next slide.

On this next slide, we have some examples of haptic signals. In the first photo is the haptic signal for “yes”: the provider is using a closed fist on a woman’s back, with an up and down motion, almost like nodding your head “yes.” In the next photo, we have the haptic signal for “no”: a flat palm moved back and forth on the receiver’s back in an erasing motion, like you’re erasing a mistake. In the third photo, we have the haptic signal for “laugh,” which is a claw-hand with the fingers opening and closing, almost like a little scratching motion, presented on a woman’s back. We’re going to go to the next slide.

We wanted to explain a little bit about the origins and benefits of Haptics. Haptics were created by DeafBlind people in Norway in the 1990s, and I want to mention a woman named Trine Naess, a DeafBlind woman in Norway. She was an educator, very interested in language and communication, and she was also someone who very much wanted information and access to what was going on in her environment. She made it a mission to document the haptic signals that were being used. Initially they may have come up organically, but she worked with the DeafBlind community and interpreters to document the system. Haptic signals can be used in a variety of contexts. On this slide, you’ll see a photo of a gentleman at a meeting. He’s seated at a table, signing to the group and following the meeting by watching an interpreter across the room, but because his visual fields are restricted, he might miss it if someone raises a hand or enters the room. So in addition to following the sign language interpreter, he’s also receiving haptic signals. In this photo, he’s getting the haptic signal for “laugh” on his back to indicate that somebody in the meeting is laughing. Across the table, you’ll see a woman with her sign language interpreter; she’s receiving sign tactually, but is also receiving Haptic communication. In the second photo on this slide, there are two women seated and talking. Maricar, the woman on the left, is telling a story through sign language, and Sonia, the woman on the right, is indicating to Maricar that she’s laughing by presenting the haptic for “laughing” on her leg. That way Maricar can keep telling her story, and Sonia can indicate she’s laughing without interrupting Maricar to tell her so.
The main benefits of Haptics are realtime and discreet access to the information that is going on, i.e., the visual and environmental information that’s happening, and the social feedback, i.e., the facial expressions, the “I’m laughing.” Haptics can be used to indicate “wait, hold on,” that someone is entering a room, or that someone’s knocking on the door. They’ve been really helpful in classes. For example, if I’m with someone who’s DeafBlind and we’re engaged in a conversation and I hear someone knocking, the student might not, so I can use a haptic signal to indicate, “Oh, someone’s knocking on the door.” And Haptics provide a way for meaningful inclusion. Chris, for example, the gentleman facilitating the meeting, can participate more fully because he’s getting information about the participants: who’s laughing, who has a hand raised. Helen Keller became involved in Haptics after Trine Naess passed away. Her work documenting Haptics continued, and Helen Keller was fortunate to have representatives from Hapti-Co, an organization in Norway that continues to document and work with Haptics, come to Helen Keller and train the staff. Currently, Helen Keller provides training in Haptics through workshops, there’s also an online course in Haptics, and Helen Keller was involved in producing an English translation of a book describing haptic signals.

So I’m going to go on to the next slide. In this next video, you’ll see Maricar explaining how she uses Haptics and how she finds them helpful.

Narrator: How Haptics Has Impacted My Life. Maricar Marquez, Helen Keller National Center. Video image description: A closeup of a woman signing into the camera.

Interpreter Narration: As a DeafBlind person who is culturally Deaf, I was amazed at the impact Haptics made on my life as I was losing my vision. It has filled the gaps of information that had been missing for so long in social situations, on the job, and in everyday life. I really cannot emphasize enough how much the use of Haptics has enhanced my life, in terms of the information I’ve gotten through touch from my husband, family, friends, interpreters, SSPs, and colleagues in my everyday life. As an active theatergoer, I now not only rely on the tactile American Sign Language interpreter who provides me access to the dialogue of the actors, but with the addition of a Haptic communicator providing information on my back, I am able to envision the show more holistically, so that I can see the whole setting. It thoroughly enhances my understanding and enjoyment of the production. I now use Haptics more consistently in my work life as an instructor while working with students. Socially, in one-on-one settings, it affords me the ability to converse in real time. The analogy that comes to mind is that life before Haptics was like talking to a wall; that wall is my vision loss, my inability to see the person that I’m talking to. Communication is static, dry, just words. Haptics tears down the wall and allows me to see all that I’ve been missing. The benefit: astounding.

So now we’d like to talk about the benefits that might lead to improved teaching and learning. Using Haptics can make the learning process faster and increase the pace of learning. Part of that is because you can get through a lesson, in this case braille, more quickly, because you don’t have to interrupt the lesson to provide information, directions, or feedback. Haptics can also make things less frustrating. For example, a DeafBlind student in a class might want to know the teacher’s reactions, so using Haptics to indicate that the teacher is smiling or laughing helps alleviate some frustration. Haptic signals can also help improve focus, because there’s not as much struggling with back-and-forth communication, and they can be an efficient way of giving quick messages and feedback. Haptics also enables simultaneous access to sensory information. For example, we had a gentleman named Blaze who, during a braille lesson, was able to read braille with his right hand, sign what he was reading with his left hand, and receive feedback from the instructor on his upper arm via haptic signals.

Narrator: Here’s a short video of Blaze during a braille class with his instructor, Rosemary. Rosemary is seated on his left and is providing feedback and instructions via Haptics on his left upper arm. Blaze is reading braille text with his right hand and signing what he reads with his left hand. This allows Rosemary to provide feedback and let him know if he’s on the right track, or has made a mistake and needs to back up and try again. Blaze is in the early stages of his braille instruction and sometimes struggles with identifying braille letters and contractions. The use of Haptics has allowed him to access the feedback in real time; he no longer has to disengage from the braille material to receive this information. Here, Blaze starts spelling out what he’s reading, and Rosemary lets him know that he’s correct by using the signal for “yes,” but as he continues to fingerspell the word, Rosemary lets him know that he’s gotten one of the letters wrong by using the haptic signal for “no.” Blaze then signs “ha ha,” laughing at himself for struggling, and Rosemary lets him know that she’s laughing along with him using the signal for “laugh.” He then signs, “I know I can get this,” and tries again. Finally, he fingerspells the word “untouched” and Rosemary lets him know that he’s correct.

Peggy: We’re going to go to the next slide. Next, you will see a video of a short meeting with three people seated at a table. The two women on the right, Adriana and Faith, are DeafBlind and are communicating via tactile sign language. The third woman, Stacey, is hearing and sighted and will be using Haptics to inform the others that she is leaving the room. One of the cardinal rules in the DeafBlind world is that you must inform the person when you have entered or are leaving the room. We all modify what we are saying and how we are saying it based on who is listening. If you were sharing personal information with a close friend, you would want to know if someone had walked in and was listening in on, or watching, your conversation, and vice versa. If you’re in the middle of a conversation with two people and one person leaves, you’d want to know. Watch this example of how Haptics is used to easily convey the information without interrupting the activity or conversation. This can easily be applied to a braille lesson, as we will show later.

Narrator: Using Haptics to Inform Individuals who are DeafBlind that You are Entering or Exiting a Room. Haptics is a system of providing visual and environmental information to individuals who are DeafBlind via touch on the body. When in the presence of an individual who is DeafBlind, one must always identify oneself upon entering or leaving the room. Here, Faith’s teacher, Adriana, is letting Faith know that she’s there by placing her name signal on her arm. Adriana then trails her back to let Faith know that she’s walking behind her to a seat beside her. Faith recognizes Adriana immediately and says, “hi, Adriana.” Adriana informs Faith that she is leaving the room using Haptics: she uses her name signal, then the signal for “leaving,” and walks out the door. Faith signs to Stacey, “Adriana just left.” Stacey then informs them that she’s running to the restroom using Haptics, without having to interrupt the conversation. She first provides her name signal, followed by a “T” on both of their arms. She drags her finger from the front of Faith’s arm towards the back to indicate that she’s leaving. Faith nods her head and informs Adriana that Stacey’s left to go to the restroom.

Peggy: You’ll also notice in this video that Stacey outlines a letter “T,” indicating that she’s going to the bathroom, “T” being for toilet. So, if you’re in a lesson with a student and you’re running to the restroom, you don’t have to interrupt them; you can just inform them when you’ve returned. Also, notice that Stacey uses a name signal, which is a quick way to identify yourself, for example when passing someone in the hall.

Hello, this is Megan Conway speaking. I wanted to talk next about the research on Haptics and braille instruction that we’ve been doing at the Helen Keller National Center. How to teach braille efficiently and effectively when the student uses tactile ASL in particular has long been a challenge; it’s difficult even for students who use visual ASL or who are hard of hearing. The effort to communicate while also focusing on something that takes full concentration with both of your hands, such as learning braille, is difficult and time-consuming. So, we became interested in how Haptics could help facilitate communication and feedback, and we started using Haptics during braille instruction. We noticed that the students’ hands and attention seemed to be better focused on the braille and less focused on communication and clarification. We were using Haptics to provide feedback: for example, “you’re doing well, go ahead,” “no, try that again,” “that’s not correct.” We were using it for instructional cues, such as “you skipped a line” and “don’t scrub,” and we were also using Haptics for access to social feedback: letting the student know that we were nodding, smiling, or laughing, or that there was someone coming or going. We decided that we wanted to document what we were seeing anecdotally in the classroom. So we initiated a research project for the purposes of documenting what we were seeing, but also with the goal of improving the effectiveness of what we were doing and for the purposes of replication, so that we could share what we were doing with other departments at Helen Keller and also with other organizations. In 2019 and 2020, we followed six case studies of students who were learning braille and using Haptics.
The COVID-19 pandemic did put a pause to our research in the spring of 2020, but by the time things were closed down, we had been able to collect quite a bit of data, enough to draw some conclusions about what we were seeing. Next slide.

We had three research study questions going in. The first was, “what is the impact of Haptics on the effectiveness and efficiency of braille instruction with DeafBlind learners?” In other words, in what ways does integrating Haptics address learning issues specific to DeafBlind braille learners? The second question was, “what are the additional benefits of Haptics during braille instruction?” For example, awareness of instructor emotions, or of activities such as who is entering or leaving the room. And thirdly, “what are some of the components of delivering Haptics effectively?” We wanted to look at student preferences, how signals are presented, the placement of the instructor’s and student’s bodies, and instructor experience and comfort with using Haptics. Next slide.

On this slide, we have a chart showing some of the demographics of the participants. There were six participants altogether; half were male and half were female. One of the participants was Black and five were white. Half of the participants had previous experience with Haptics, and half had none. Five of the participants communicated primarily using tactile ASL, and one used visual ASL. Four of the participants had beginning braille skill levels, and two had intermediate skill levels. Next slide.

Here I’d like to describe the interventions that we did; in other words, which haptic signals we used during braille instruction. The primary signals we used were “yes” / “go ahead” and “no” / “not correct”; the signal for “scrubbing,” which is when the student is moving their fingers up and down over a dot or cell rather than smoothly across the cell; “go up a line” or “down a line”; “braille cell replication,” in which the braille cell is reproduced on the student’s place of articulation, particularly the arm, with the dots represented by pressure, using more pressure where the dots are raised in the braille cell; and social feedback: laughing, smiling, coming or going, et cetera. Next slide.
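As a side note, not part of the presentation itself: the “braille cell replication” signal described above can be sketched in code. A braille character is a set of raised dots among six positions, and the provider touches all six positions, pressing harder where the dots are raised. This is an illustrative sketch only; the function name and the light/firm labels are hypothetical.

```python
# Illustrative sketch (hypothetical, not from the presentation):
# a braille cell has six dot positions, numbered 1-2-3 down the left
# column and 4-5-6 down the right. In "braille cell replication," the
# provider touches all six positions on the learner's arm, pressing
# firmly where the character's dots are raised and lightly elsewhere.

def pressure_pattern(raised_dots):
    """Map each of the six dot positions to 'light' or 'firm' pressure."""
    return {d: ("firm" if d in raised_dots else "light") for d in range(1, 7)}

# The braille letter "t" is dots 2, 3, 4, 5, so dot 1 and dot 6 get
# light pressure and the middle four positions get firm pressure.
T = {2, 3, 4, 5}
print(pressure_pattern(T))
# -> {1: 'light', 2: 'firm', 3: 'firm', 4: 'firm', 5: 'firm', 6: 'light'}
```

This matches the lesson described later in the transcript, where the instructor presses lightly for dots one and six and firmly for dots two through five when replicating the letter “t.”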

Next, you are going to see a video of Peggy demonstrating four haptic signals to Adriana prior to one of their braille lessons. Peggy is seated next to Adriana, demonstrating the signals on her upper arm and then explaining what each signal represents in tactile sign language. The haptic signals that you will see, which will also be described to you, are “yes, go ahead,” “no, not correct,” “scrubbing,” and “up a line.”

Narrator: Here, Peggy taps Adriana’s upper arm with the palm of her hand. She explains, this is the haptic signal for “yes,” or “go ahead.” Here, Peggy again places the palm of her hand on Adriana’s upper arm and moves it in a quick back and forth motion from left to right. She explains, this is the signal for “no, you’ve made a mistake.” Peggy is now placing the tip of her index finger on Adriana’s upper arm and moving it up and down slightly. This signal was created to remind students to stop scrubbing and use the correct method when reading braille. Here, Peggy is placing the tips of all four fingers on Adriana’s arm and moving her hand slowly from left to right, simulating the experience of reading braille. This signal was created to remind or prompt students to use the appropriate reading technique. Peggy is now introducing the signal for “go back a line.” This is used when a student mistakenly skips a line when reading braille. Peggy demonstrates the signal for Adriana. She places the side of her open hand, palm down, on Adriana’s upper arm. She then disconnects, moves her hand up, and connects again with Adriana’s arm.

Megan: I’d like to discuss our data collection and analysis methods. We used a multiple case study design, and we had a number of ways of collecting data. The first was interviews, of several kinds. We had an intake questionnaire, where we collected basic demographics and information about participants’ previous experience with braille and Haptics, use of language, et cetera. We had post-lesson interviews: after every lesson where we were using Haptics, we would interview the students. We would ask them which signals were used, which signals they liked or didn’t like, and how Haptics was helpful or not, and we also asked about any modifications that might have been made during that particular session. And then we had a final interview with students, where we asked them for recommendations for the instructor on using Haptics; we asked for extended details about how Haptics was or was not helpful for them when using braille; and we asked about when we used Haptics not in the context of braille, but in terms of social interactions and environmental information: how that worked for them, how they liked it, and how it impacted the braille lesson. We also conducted observations during the sessions; these were conducted by the instructor, who took notes, by the supervisor, and by the director of research. We videotaped several lessons with each participant as well. In terms of data analysis, we used qualitative analysis methodology, both within and across cases. We reviewed and coded the interviews and observations for themes and patterns, and we reviewed the videos for the frequency of Haptics use, instructor delivery, and student responses. We completed a case summary for each individual case and then created a summary across cases to look at similarities and differences between the data. Next slide.

So this is Peggy. Now we’re going to talk about one woman in particular: a case study of Haptics being used during braille instruction. This woman, named Adriana, has Usher syndrome type 1. She was born deaf, and her visual condition, retinitis pigmentosa, has restricted her visual fields. At this point, she communicates using tactile ASL; she had grown up a visual signer, but transitioned to tactile sign language. She’s someone who benefited from using Haptics during braille instruction; she felt they were very helpful. She used the information really effectively: when she was given the haptic signal, for example, to line up her fingers, she was able to do that quickly and efficiently. She was able to communicate and read simultaneously. She liked the immediate feedback and, as I said, made very good use of it. She felt that the Haptics communicated encouragement from the instructor, so she really appreciated when the instructor used the haptic signal for laughing or smiling, or presented a signal very enthusiastically. She liked that because, as she said, it “felt like you were happy that I got it right.” She also felt using Haptics was less distracting; she was able to focus more easily on reading the braille. And we’re going to go to the next slide.

So next, we would like to show a video clip of Adriana and myself using Haptics during braille instruction. Just to let you know about the haptic signals being used: one is to go back to the beginning of the line; another is the braille cell being outlined on her upper arm; then the haptic signal for smiling; and the haptic signal for “switch it around.” “Switch it around” was a signal that we developed from the times that Adriana was identifying a braille character that was the mirror image of the actual character. For example, the braille letter “M” and the braille contraction for “sh” are mirror images of each other. Adriana and I worked the signal out together for a specific need. Originally we tried the pointer and middle finger with the fingertips on the upper arm. Then I flipped my hand around. Adriana suggested we incorporate a twist motion with the two fingers, as in turning something around, which gave her more to feel.
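To make the mirror-image relationship concrete, here is a small illustrative sketch, not part of the presentation: swapping the two columns of a braille cell (dots 1 and 4, 2 and 5, 3 and 6) mirrors it, which is exactly the confusion between the letter “M” (dots 1, 3, 4) and the “sh” contraction (dots 1, 4, 6). The function name is hypothetical.

```python
# Illustrative sketch (not from the presentation): a braille cell's
# columns are dots 1-2-3 (left) and 4-5-6 (right). Mirroring a cell
# swaps the columns: 1<->4, 2<->5, 3<->6.
MIRROR = {1: 4, 2: 5, 3: 6, 4: 1, 5: 2, 6: 3}

def mirror_cell(dots):
    """Return the mirror image of a braille cell given as a set of raised dots."""
    return {MIRROR[d] for d in dots}

M = {1, 3, 4}    # braille letter "m"
SH = {1, 4, 6}   # braille contraction "sh"

print(mirror_cell(M) == SH)  # True: "m" mirrored is the "sh" contraction
```

This is why a reader can easily confuse the two characters by touch, and why a dedicated “switch it around” signal was useful.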

Descriptive Narrator: Here, Peggy places both index fingers right next to each other on Adriana’s upper arm. She then slides her right index finger towards the front of Adriana’s body and then back again to meet the other finger. This signal is used to indicate that the student has strayed from one line to another. Here, Adriana is using both hands to read her braille document. She raises her right shoulder and glances at Peggy to let her know that she is having difficulty identifying a particular braille cell. Peggy looks at the braille material and then uses Haptics to create the braille cell on Adriana’s arm. The purpose of the braille cell signal is to represent the cell on the receiver’s arm so that they can feel where the dots are raised and better identify the letter or contraction they’re having difficulty with. With her index finger and middle finger, Peggy draws the braille cell on Adriana’s arm, emphasizing the dots in the cell that represent the letter Adriana is working on. In this example, she presses very lightly for dot one, then uses much more pressure for dots two, three, four, and five, and then again light pressure for dot six. Adriana thinks about it, starts signing dots two, three, and then realizes that it’s the letter “T.” Peggy confirms that she’s correct using Haptics. Adriana then spells out the whole word, “rabbit.” Peggy again uses Haptics to confirm that she is correct, and uses the signal for “smile” by drawing a smile with her index finger on Adriana’s arm. Here, Adriana is spelling out a word and fingerspells “M.” Peggy uses the haptic signal for “no” to let her know that she’s incorrect. Adriana goes back to her document. Peggy uses the signal for “switch it around” to let her know that she’s mixed up the mirror-image characters. This signal is an example of a signal created by Peggy and Adriana specifically to meet Adriana’s needs.
Peggy makes a “V” with her index and middle fingers and places the tips of her fingers on Adriana’s arm, with the middle finger above the index finger. While keeping her index finger in place, she slides the middle finger around and down, so that it is now below the index finger. Adriana realizes that she’s made a mistake and fingerspells the correct letter.

Peggy: As you can see from this video, the use of Haptics allowed me to communicate instructions, prompts, feedback, and encouragement to Adriana without interrupting the flow of the lesson. We saw the same effect with other students who participated in the research.

Hello, this is Megan. I’d like to describe the research study results. There were a number of things we observed, but there were five significant results. The first was the potential for enhanced learning, communication, and instructor-student rapport. We had definitely seen this anecdotally, and it did come out when we examined our data through the research. We saw that Haptics did indeed support components of learning such as efficiency, focus, and minimizing frustration. Secondly, we saw that modified and new signals improved understanding. Although Haptics is a set system (in other words, we had identified specific signals that would mean specific things between the instructor and the student), that isn’t to say that modifying signals and creating new ones was not helpful. In fact, it was necessary at different times. Some signals were more difficult for students to understand, and some of those difficult signals were among the more important ones. For example, braille cell replication was one of them. Students often found that signal difficult to understand, but when they worked with the instructor on things like more or less pressure, different places of articulation, or even a different signal altogether, they were able to work out a way of presenting the signal that made it easier to understand and effective for them. Students and instructors also created their own haptic signals in many cases, based on the individual needs of the students. As an example, one student tended to slouch over and drop his head when he was concentrating on the braille, and he wanted to be reminded to pick up his head and sit up straight. So the student and the instructor created a signal that would let him know he needed to do that.
A third finding was that some signals were more helpful than others. We often asked students which signals they found to be the most helpful or the least helpful, and there was a fair amount of consistency across students. The most helpful were “keep going” / “yes,” “not correct” / “no,” “up a line” (or “leveling”), and “braille cell.” Fourth, students did have personal preferences: for which haptic signals were the most useful, for how and where they wanted them presented, and for the amount of pressure they wanted. Some students just wanted the basics, like “yes” and “no”; they didn’t want to complicate matters with other signals. Other students said, “give me everything; every haptic signal you can give me, I find useful,” and were very hungry for that kind of feedback. Some students wanted Haptics presented constantly: they were really encouraged when the instructor would say “yes, go ahead,” and liked to know immediately if they were doing something incorrectly. But some students just wanted it occasionally; they found it distracting when Haptics was presented all of the time. Essentially, although there were definite commonalities in the way students responded to Haptics, and impressions of Haptics were generally positive, there were personal preferences that needed to be respected. Lastly, instructor factors were also important to how smoothly Haptics went and how it seemed to impact the students and the lesson in general. In particular, the instructor’s familiarity and comfort with Haptics seemed to be very important. Instructors who were more comfortable and more familiar seemed to have a better rapport with the students and more flow to the lesson, and the method of delivery also seemed to make a difference in how students responded.
In terms of consistency, where the Haptics was presented, respecting student differences, and all of the other things we’ve discussed as benefits, it seemed to be very beneficial when the instructor was sensitive to those things. There was definitely variation across students because of communication needs and style, but in general, how the student and the instructor were positioned when using Haptics seemed to be important. It appeared that when the student and instructor were sitting side by side, as opposed to across from each other, and when the instructor maintained some kind of constant contact with the student, the Haptics and the lessons flowed more effectively. That contact could just be the instructor lightly resting a hand against the student’s arm where the Haptics was going to be presented, or touching knee to knee, some form of constant contact with the student, so that Haptics didn’t come out of the blue but was a natural part of that contact. Next slide.

Megan Narrating: Displayed here are several images that illustrate differences that we found in instructor behavior and student preferences. The first two images illustrate student preferences. In the first image a student is receiving the haptic signal for “braille cell” on her lower forearm. In the second image, another student is receiving the haptic signal for “correct, go ahead” on his upper arm. The third and fourth images illustrate how the instructor can provide different amounts of contact to their student. In the third image on this page, the instructor is sitting back and is not in contact with the student. In the fourth image, which features the same instructor and student, the instructor is providing constant contact with the student by resting her hand next to his hand.

Next, we would like to show you a video of a student explaining what she thinks about Haptics. In this video, Peggy and her student Adriana are seated at a table. Peggy is asking Adriana in ASL to share her thoughts about the use of Haptics with braille training. As Adriana responds, notice that Peggy is using the “yes” or “nod” haptic signal to let Adriana know that she is following along or agreeing with her comments.

Descriptive Narrator: Peggy speaking. “Great job Adriana. So what did you think?” Adriana speaking. “Wow, I have to be honest with you. I thought the whole process was just so smooth. The use of Haptics was extremely helpful. I didn’t have to keep stopping whatever I was doing to receive your directions or feedback via tactile sign language. I was able to remain engaged in my braille activity and receive this information simultaneously. This allowed me to get through my work much quicker. It’s very distracting to have to stop and pull my hands away from my work each time my instructor wants to communicate something to me. Also, I really appreciated having access to your facial expressions and reactions. Letting me know that you were smiling or laughing was so beneficial. I don’t have access to that type of information anymore.”

Megan: In summary, some of the things she mentioned are that Haptics is smooth, it helps her maintain her focus, it minimizes frustration, and it’s efficient and effective. Next slide.

This is Peggy, and we definitely found some implications for practice through this Haptics project. Haptic communication shows promise as an important tool for enhancing learning, for making communication easier during the lesson, and for helping student-instructor rapport. Instructors using Haptics need to practice and become comfortable with using haptic communication. It is also important that the instructor respect student preferences and the nuances of delivering Haptics. For example, having conversations with a student such as, “Well, gee, should I tell you you’re right after each letter? After every couple of words? After the sentence?” So talking with students about how they wanted the Haptics used was very important. And we need to do more research on haptic communication in other areas of rehabilitation teaching where we provide training. For example, in the area of independent living, if a student is working in the kitchen with a rehabilitation teacher, Haptics can be used to give information during the class. The same is true in adaptive technology: if someone is learning about screen modifications, the adaptive tech instructor can perhaps use Haptics to map out on the student’s back what is on the screen and where things are located. And in the area of mobility, Haptics can be used to give students directions during a mobility class. So there are a lot of implications for practice. I’m going to go to the next slide.

This is Megan. “Summary.” Haptic signals are used in the DeafBlind community to enhance access to communication and environmental information. Research shows that when used during braille instruction, Haptics supports learning and eases communication barriers. Haptics is a promising tool for enhancing rehabilitation training for DeafBlind people.

“Additional resources.” Haptic Communication: the American edition of the original title Kommunikasjon, by Hildebjorg K. Bjorge and Kathrine G. Rehder, 2015, available from https://www.helenkeller.org/hknc/publications. Haptics Pocket Edition, a free mobile app created by Hapti-Co [2020], available for download from the app stores for iPhone and Android. Helen Keller National Center, 2018, available from https://www.helenkeller.org/hknc/publications.

This has been a presentation by the Helen Keller National Center. We hope this information has been helpful. This PowerPoint is the property of the Helen Keller National Center; please do not distribute it or use it for training purposes, because the releases for the images and videos were obtained for Helen Keller purposes only. For more information, contact the Helen Keller National Center at pld@hknc.org.

Megan Conway can be reached at MConway@helenkeller.org. Peggy Costello can be reached at pcostello@helenkeller.org. The IRPD department can be reached at pld@hknc.org. Thank you. [End of Transcript]