The TDI Keynote presentation in which Gary Behm shares the possibilities of future mobile phone technology and elaborates on the IRIS project.

 

 

Hello, my name is Gary Behm.  I use the pronouns he, his, and him and I identify as a deaf, white male.  I have graying, light brown hair and I am wearing glasses and a black polo shirt with the RIT/NTID logo embroidered in orange and white.

I worked as an engineer with IBM for 30 years.  Currently, I am a Professor and Associate Vice President for NTID Academic Affairs.  Also, I am the Director for the Center on Access Technology, or CAT, at RIT/NTID.

Allow me to expand on the importance of the Center on Access Technology.  The Center on Access Technology was established at NTID over 15 years ago with the purpose of addressing the challenges we as Deaf people face regarding accessibility to communication, information in the classroom, and various technologies.  We wanted more research and understanding to improve accessibility for our Deaf and hard of hearing students.  For example, we often have Deaf and hard of hearing students on the RIT campus who participate in hearing classes.  Some students prefer to watch the lectures with Real Time Captioning.  We ensure the captions are accurate and give the student full access to the information so that individual can learn more effectively.  Also, we want to be sure Deaf and hard of hearing students can interact with their hearing professors, peers, and friends.  We want them to feel comfortable, and that is why we make sure they have full access to communication.  That research is in a constant state of improvement.  We haven’t achieved the perfect solution yet, but research is ongoing.

[2:18]

As you are aware, technology is rapidly changing.  We need to make sure that the needs of Deaf and hard of hearing people are taken into consideration for future technology designs and that those designs meet those needs.  We don’t want new technologies released only to find that they are not Deaf friendly, so we want to be part of the design process.  For example, during the COVID pandemic, many people were instructed to stay home and work or go to school remotely.  While video conferencing was a new, disorienting experience with a steep learning curve for most hearing people, it is a technology that Deaf and hard of hearing people had been using for many years.  Now hearing people were forced to use video conferencing for work and school.  Rarely did the hearing community reach out to ask the Deaf community about their experience using video conferencing technology to improve designs for the future.  Hearing people rushed to develop new video conferencing platforms, but from the Deaf perspective, those platforms were not Deaf friendly and lacked accessibility – lacking things such as various camera perspectives and accurate captioning.  Developers could have asked the Deaf community how to improve video conferencing technology based on their experience.  Alas, they did not.  In the rush to release a product when COVID-19 hit, little regard was given to the Deaf experience and needs.  That’s a current example with new technology.

 

As we enter “the new normal” of post-COVID times, it will be interesting to see what happens with technology.  Will the new normal include people who traditionally worked on campus but now continue to work remotely?  How will the new normal affect the Deaf and hard of hearing Community?  Is remote work a good thing? We are in interesting times.

[4:43]

Image descriptions:

1. Illustration of North America in light blue, with white text over it that reads “11 M.”

2. Illustration of a mobile phone in blue with its screen in a darker blue, with white text over it that reads “+8M.”

3. Capitalized white text reading “ZERO,” with illustrations in opposing corners of the “interpreter” sign and a laptop with “CC” on it, both in blue.

 

Another example is the mobile phone.  The technology in the modern phone is amazing.  I remember my first phone.  Now the phones are bigger, sharper, and more advanced!  There are approximately 11 million Deaf and hard of hearing people in the US.  Of those, more than 8 million own a mobile phone.  Yet despite all those Deaf and hard of hearing mobile phone users, the native phone offers none of them accessibility out of the box.  Native phones come ready to make voice calls, but a Deaf user must install separate apps to make calls, and that is not very accessible.  When a hearing person buys a phone, it is ready to go, out of the box.  When a Deaf person buys a phone, first they must download an app and add their accessible phone number, and now they have two different phone numbers, which can be confusing: one number is for texting, another is for calling a video relay service or a captioning service.  It’s not uncommon for a Deaf person to have 2, 3, or 4 different phone numbers.  In my opinion, that’s not very accessible.

[6:08]

I would like to discuss why it is important for companies to conceptualize Inclusive Design.  For example, consider a curb ramp, also called a curb cut.  Curbs run along the edge of streets to keep traffic on the street.  The curb is cut, or ramped, at intersections to allow people in wheelchairs to go from street level to sidewalks without abrupt drop-offs.  Believe it or not, the curb ramp was developed in 1945 for injured veterans returning from war; it wasn’t designed specifically for wheelchairs.  It was years later that groups of wheelchair users fought for more curb cuts to improve wheelchair access, and now they’re everywhere.  Any new road will automatically have a curb ramp.  The design allows improved access for all.  Even people who don’t use wheelchairs benefit from the design.  It is truly a nice design, and similar in concept to a mobile phone designed for Deaf and hard of hearing people that would benefit all mobile phone users.

 

Another example is doorbells.  I remember my first house had a doorbell connected to a chime.  That didn’t help me.  When someone came to the door and pushed the button, the bell rang, but I couldn’t hear it.  Fortunately, I knew a few Deaf engineers who helped me hard-wire a system in my house, so when a person pressed the doorbell, lights would flash.  The same system worked with the phone; when I got a call, the lights would flash.  At that time I couldn’t just go to the store and buy a ready-made doorbell/light system to install, because companies don’t think about the Deaf and hard of hearing population; it’s such a small number.  The majority of people buying doorbells can hear a chime, and I understand that, but Deaf people need to know there’s someone at the door too.  Now, years later, it’s really nice: you can buy a system like Hue, for example, so when someone presses the bell the Hue lights flash.  I no longer need to design my own homemade system.  I can go to the store and buy a system off the shelf like Hue or a smart doorbell, bring it home, set it up, and it works.  It’s a major stress reliever and it makes things more accessible for the future.

 

Now, why can’t the same design concept apply to a mobile phone, making it more accessible for Deaf and hard of hearing people?  The phone itself is very powerful, with tremendous technology.  Phones come with everything; why not add just a little more functionality to make them more accessible for Deaf and hard of hearing people?

 

For that reason, I have a project called “IRIS.”  The focus is a simple concept.  The phone has a dialer where a person types in a number to make a call.  Sometimes the number is already in the phone, so you just select the contact and the phone automatically dials.  Deaf and hard of hearing people don’t use that dialer.  We use a separate app with a separate dialer, a unique dialer for each app.  Why not incorporate the dialers we use into the phone’s own dialer?  It’s the same concept as the caption decoder on old TVs.  I remember back in the 1970s we finally got captions on the TV after I bought a decoder box from Sears Department Store for $280.  We brought this box home, plugged it into the TV, and finally I could see captions.  I was thrilled!  Yes, they were just captions, but it was better than nothing.  There was a big box on the TV, but I didn’t care; I could finally follow what was happening on the TV.  Now, many years later, we don’t see boxes on TVs, and that’s because the decoder is now integrated into the TV itself.  So, thank you to our Deaf community who advocated for that technology to be built in.  Now, we don’t worry about captions on the TV.  When you buy a new TV, that technology is already there.  You simply push “CC” on the remote and the captions pop up.  Whereas before I had the added work of buying the separate box and adding it to the TV, similarly with the mobile phone I have to get an app and add it to my new phone to make it work for me.  The concept is similar to the old decoders.  The app has separate phone numbers, a separate directory, and separate video mail.  Why not just build it in?  It’s so simple.  So, we are researching that concept.  Believe it or not, the phone technology has enough power to do whatever we want it to.  Now we need to get phone manufacturers to agree that phones need a redesign to meet our needs, and that’s why we’re discussing this.

[11:28]

Now to expand on Project IRIS.  We need a lot of experimentation and testing to make sure our project is feasible.  The goal is a mobile phone with one number that can be used to call VRS, CTS, or 911.  We want to be sure the technology is feasible.  So, a group of Deaf people are working together in a lab environment with outside companies to see if it’s possible for one phone number to place a call to VRS or a captioning service.  We’ve been experimenting with that and, sure enough, it works beautifully.  It really depends on how you set it up.  For example, we want a single phone number for everything instead of the 2, 3, or 4 numbers: one for VRS, another for captioning, and so on.  When I give my number to a friend, it can be confusing.  Do I give them my VRS number or my native phone number?  And with texting, I don’t want to confuse that with my number for the video phone.  We want to see if it’s possible to have only one number.  We’ve been working for almost a year now on various parts of the project.  One part is the phone carrier, or network provider.  Those are big companies.  The second part is the phone itself and the phone manufacturer.  For example, Samsung, LG, and Google make the phones and sell them to the phone carrier, who rebrands the phone.  We also work with several different captioning providers and VRS companies, as well as different 911 systems.  There are many parts involved.  It’s simple when working in a lab where we control all the variables, but in real life, working with a variety of phone carriers is a major challenge.  The point is, we’re not limited by the technology; all of this is technically possible.  The challenge is getting all the parts to work together: the phone carrier, CTS, and VRS.  Often one part will depend on another part’s solution, but that solution depends on a different part’s solution, and the relationships are very enmeshed and dependent on each other.
That is why, historically, there has not been a successful one-phone-number solution.  We are working hard to include the community to help us make this happen.  Getting captions on the TV wasn’t too difficult a problem to solve.  There were the TV manufacturer and the decoder manufacturer, and it was pretty simple getting those to work together.  The mobile phone is a much more complex ecosystem with multiple parts involved.  That’s why we want to discuss more about how we push that project forward.
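To make the one-number idea concrete, here is a minimal sketch in Python of the routing concept: a single native dialer that applies the user's accessibility preferences to each outgoing call.  The class, service labels, and routing rules are illustrative assumptions only, not how any carrier, VRS, or CTS provider actually implements call routing.

```python
# Hypothetical sketch of the one-number routing idea behind Project IRIS.
# All names here (CallPreferences, route_call, the "vrs:"/"cts:" labels)
# are invented for illustration.

from dataclasses import dataclass


@dataclass
class CallPreferences:
    """How a Deaf or hard of hearing user wants calls handled."""
    use_vrs: bool = False        # route through a Video Relay Service
    use_captions: bool = False   # route through a captioned telephone service


def route_call(dialed_number: str, prefs: CallPreferences) -> str:
    """Decide how a single native phone number places an outgoing call.

    Instead of separate apps with separate numbers, one dialer applies
    the user's accessibility preferences to every call.
    """
    if dialed_number == "911":
        # Emergency calls must always connect, with relay support attached.
        return f"emergency:{dialed_number}"
    if prefs.use_vrs:
        return f"vrs:{dialed_number}"    # an interpreter joins the call
    if prefs.use_captions:
        return f"cts:{dialed_number}"    # live captions on the call
    return f"direct:{dialed_number}"     # ordinary voice call


print(route_call("5855551234", CallPreferences(use_vrs=True)))  # vrs:5855551234
```

The point of the sketch is that the complexity lives in one place, the phone's own dialer, rather than in two, three, or four separate apps, each with its own number.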

[15:13]

I want to discuss the mobile phone, but also other future technologies.  Phone technology will continue to improve.  New phone models are released every year or two, and we need those phones to meet our accessibility needs.  But we also need to be mindful of other technologies, such as the Internet of Things.  The Internet of Things, or IOT, is a strange name.  Basically, IOT means the internet is on everything.  Technology has truly taken off of late.  The tiniest chip has the power of a full computer, complete with its own IP address, memory, and I/O integration, and these tiny chips are everywhere.  IOT technology has exploded and can be found in the world of manufacturing, in homes, and in healthcare, as examples.  We need to be sure we are not overlooked as that technology progresses and that Deaf and hard of hearing people are considered in new designs and technological solutions.  We want to ensure design is inclusive and that our observations are included in IOT designs.  The Internet of Things and those chips with IP addresses enable your computer to connect to the device via WiFi.  A simple example of IOT at home is any “smart” appliance such as a dishwasher, washing machine, dryer, or stove.  Now, I can buy and install a new stove, set the timer to cook for 45 minutes, and when it is done the stove will send a message to my phone.  So, I can set the stove to cook and go do other things, and when I get the alert on my phone, I know my food is ready.  In the past, Deaf people had to watch the time.  If we didn’t pay attention, the time would go over, the food would overcook, and dinner was burnt.  That concept of the appliance communicating with the phone is the IOT.  A washing machine manufacturer installs a chip with an IP address which connects to my phone via an app.  Wow!  Now that’s the future!  Smart homes are the future.  It is true that IOT easily connects my phone to the washing machine, and that benefits Deaf people.
In the past there was nothing like this.  I would start the washer in the morning and get busy during the day, leaving the wet clothes in the machine all day because I forgot to move them to the dryer.  That happens sometimes; people get busy with other things.  But the IOT phone app alerts me when the wash is done so I can move my clothes to the dryer.  That is a benefit to the Deaf community.  That shows how the IOT helps Deaf people, but there is still more ideation and development to happen, and I want to be on top of that to be sure Deaf and hard of hearing people are included.  That is important.
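The appliance-to-phone alert described above is, at its core, a simple publish/subscribe pattern: the appliance publishes an event, and any subscribed device, such as a phone app that flashes the lights, receives it.  This is a minimal sketch in Python; the class and method names are made up for illustration, since real smart appliances go through a vendor's cloud service or a protocol such as MQTT.

```python
# Illustrative publish/subscribe sketch of an IOT appliance alert.
# SmartAppliance and its methods are hypothetical names, not a real API.

from typing import Callable, List


class SmartAppliance:
    def __init__(self, name: str):
        self.name = name
        self._subscribers: List[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        """Register a phone (or any device) to receive alerts."""
        self._subscribers.append(callback)

    def cycle_finished(self) -> None:
        """Called when the wash/cook cycle ends; alerts every subscriber."""
        for notify in self._subscribers:
            notify(f"{self.name}: cycle finished")


alerts = []                        # stands in for a phone app's inbox
washer = SmartAppliance("washing machine")
washer.subscribe(alerts.append)    # e.g. an app that flashes the lights
washer.cycle_finished()
print(alerts[0])                   # washing machine: cycle finished
```

The accessibility benefit is that the alert is visual and arrives wherever the subscriber is, rather than depending on an audible chime at the appliance.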

 

Now, related to IOT, a big topic is AI, or Machine Learning.  It’s another explosive field.  The point of Machine Learning is to develop a smart system that can make appropriate decisions based on collected data.  Many companies love to collect your data.  They want to know more about you so they can sell more products to you or make a smarter system.  One example is ASR, or Automatic Speech Recognition, a technology that has recently taken off.  Really, ASR is an old technology, developed back in the late 1950s, that improved incrementally over the decades; only recently has the technology taken off all of a sudden, and many companies now offer ASR.  You may wonder how ASR benefits Deaf people, and that’s the challenging part.  If you remember, when developing new ASR or new AI they need data, so companies collect lots of data, mostly from hearing people.  Based on that data, they develop new smart systems, but do they include data from Deaf and hard of hearing people?  When data is collected from Deaf people, it is often deleted because it’s not “normal,” but we want to encourage companies to keep that data to help improve the lives of Deaf and hard of hearing people.  That practice is called “data bias,” because it excludes the small amount of data from Deaf and hard of hearing people as compared with the millions upon millions of other data points.  We want to be sure that we are “counted” in their data collection.  AI applied to the Deaf population has many benefits.  For example, with ASR a hearing person can speak and the speech is transcribed into readable captions.  But the big question is, “Is the quality of the transcription reliable?”  ASR is a great idea, but does it meet the needs of the Deaf community?  Some Deaf people complain there are so many errors in ASR that they miss information and the conversation goes over their heads.  That’s not good either, so we need ASR to meet their expectations.  Of course we will continue to need human captionists, but ASR is blowing up.
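One standard way to put a number on that caption-quality question is word error rate (WER): the word-level edit distance between what was actually said and what the ASR produced, divided by the number of words spoken.  The speaker does not mention WER by name; this sketch is only an illustration of how caption accuracy is commonly measured.

```python
# Word error rate (WER), the standard ASR accuracy metric:
# WER = (substitutions + deletions + insertions) / reference word count.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Compare a true transcript against ASR output, word by word."""
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)


# One wrong word out of five spoken words is a 20% error rate.
print(word_error_rate("the meeting starts at noon",
                      "the meeting starts at new"))  # 0.2
```

Even a seemingly low error rate can be the difference between following a conversation and having it go over one's head, which is why Deaf users' complaints about ASR caption errors matter.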
Another future, very cool technology is Sign Recognition or a system that recognizes your signing with AI.  They need data so now they are collecting data around signing to help build a smart system.  That is a big growth area as well.  So, again, the future is exciting, but, at the same time we want to make sure new innovations include us in their design process, in other words Inclusive Design.

[21:32]

How do we, as a community, become more involved in the change of technology?  That’s what I want to discuss.  Think about the recent emergence of COVID.  We survived thanks to technologies like video conferencing, and we saw the progression of different technologies.  So, how can we get involved?  We have ways.  We have great conferences like the TDI conference to help us share ideas with TDI, but also with other organizations as well.  Obviously, we have NAD and HLAA, and there are more out there, and we need the communities to come together.  We are responsible for gathering together and discussing what is happening with technology.  We have a habit of just accepting what we’re given.  Take the mobile phone, for example.  We accept it and add an app to it.  After many years we finally have a solution for us, and we accept it.  But we need more.  We need improvements, and if we get involved we can encourage those changes.  I believe that.  If we analyze the mobile phone, we can make changes.  If we do nothing, the phones will stay the same and we will continue to add apps, have three different numbers, and accept what we’re given.  That’s okay.  It will work, but it could be more.  I think it’s important that, through organizations like TDI, our concerns are recognized and we ask, “Hey, what about this?”  That enables TDI to represent us to large corporations and advocate for solutions that companies typically overlook.  I understand a company’s priority is to make money, so they tend not to focus on disabilities, but they are improving.  We can be more actively involved in communicating our needs so they are recognized by large companies.  Perhaps solutions to our needs will be included in the next product cycle.  I realize that nothing happens overnight.  Changes to smartphones will take a couple of years to implement; I recognize that it requires time, and that’s why it’s important to come together.
NAD is a great organization and it is very supportive of our community, and we need to let them know when something is not accessible, or we’re not satisfied with the captions, or education is not sufficient, whatever the concern.  It is our responsibility to get together and have good discussions, not to criticize products, because there is always room for improvement.  We need to continue to work together.  Hopefully, during this conference, we can come together and discuss new technologies.  I myself am an engineer, but I learn new things every day.  I have to keep up with new technology.  Some of us may have given up on trying to keep up, and I understand that, but it’s important to remember it is our responsibility to keep up with technology and ask if it’s meeting our needs.  If yes, great.  If not, what are we going to do about it?  Realize that there is power in the collective group compared to an individual.  If I go to a big company as an individual, it’s easy for the company to ignore me.  But if TDI or NAD collectively express our needs to a corporation, they will cooperate with us.  TDI is an important conference for us to move forward with new technology for the future.  Thank you.