You may enjoy TV — as do many people with hearing or visual disabilities. But those who are both deaf and blind need special help to follow along. Now an innovative technology is turning television signals into a form that deaf-blind people can understand.
Deaf people can’t hear. But they can use closed captioning to read subtitles of the words spoken on TV. Blind people can’t see. But they can listen to visual descriptions: voice-over comments that describe what’s happening on the TV screen. Neither method, however, works for people who are both deaf and blind. That makes it harder for them to “watch” television programs.
Roughly 45,000 to 50,000 deaf-blind people live in the United States, according to the National Center on Deaf-Blindness in Monmouth, Ore. By that center’s count, almost 10,000 of them are under age 22. Thousands more deaf-blind people live elsewhere around the world.
Ángel García Crespo is a computer engineer at Carlos III University of Madrid in Spain. His group has invented a new way for deaf-blind people to “watch” TV. He unveiled the technology last year at a conference in Aveiro, Portugal. Earlier this year, the team described the work in a paper.
The idea for the system grew out of previous work by García Crespo’s group. The team had already worked on making audiovisual materials accessible to people with either vision or hearing disabilities. But the group wanted to help people with both challenges. So they asked some deaf-blind people what would help.
“We heard from them that they would like to know, without intermediaries, what is said in the TV newscasts,” García Crespo says. In other words, the deaf-blind people didn’t want to always need someone else to tell them what was going on. That sent the team brainstorming.
Getting technologies to work together
Deaf-blind people rely on their sense of touch to communicate. One way to get info is to have someone on hand — literally. A deaf-blind person can get and give information through touch-based hand signals with another person. But it isn’t always “handy” to have someone else around.
People who can’t see can also get and send information with a braille line, better known as a refreshable braille display. The braille system uses patterns of raised dots to stand for letters and numbers. A refreshable braille display is an electronic device with a row of braille cells that can change. Dots, or pins, rise up or drop down based on electronic information sent to the device. With such a portable device, someone who cannot see a screen can still read email or other information from a computer.
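To picture how those rising and falling pins encode text, here is a small illustrative sketch. The letter-to-dot mapping follows standard six-dot braille; the code itself is only an example, not part of the team’s system.

```python
# Illustrative only: mapping letters to six-dot braille cells.
# In a braille cell, dots are numbered 1-3 down the left column
# and 4-6 down the right. A refreshable display raises the pins
# listed for each letter.
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5},
    "e": {1, 5}, "f": {1, 2, 4}, "g": {1, 2, 4, 5},
    "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
}

def to_pin_pattern(text):
    """Return, for each known letter, the set of pins to raise."""
    return [BRAILLE_DOTS[ch] for ch in text.lower() if ch in BRAILLE_DOTS]
```

For example, `to_pin_pattern("bad")` yields the pin sets for b, a and d: `[{1, 2}, {1}, {1, 4, 5}]`.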
The new system converts TV signals to data that a refreshable braille display can use.
“Key to the system is the possibility of using subtitles to collect TV information,” García Crespo explains. “Subtitles travel with the image and the audio in electromagnetic waves that we do not see. But an electronic system can capture those waves. That is what we do.”
First, a computer program, or app, pulls out the subtitles and visual descriptions from the broadcast signal. The system then combines the information and converts both into data for braille. “No one had done this before,” García Crespo notes.
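The team’s actual code is not published. But as a rough sketch of what the combining step might involve, hypothetical time-stamped subtitle and visual-description records could be interleaved into one text stream, ready for braille conversion:

```python
# Hypothetical sketch -- the real app's data formats are not public.
# Each record is (timestamp_in_seconds, text). Subtitles carry the
# dialogue; descriptions carry what is happening on screen.
def merge_streams(subtitles, descriptions):
    """Interleave both streams in time order into one braille-ready feed."""
    tagged = [(t, "SUB", txt) for t, txt in subtitles]
    tagged += [(t, "DESC", txt) for t, txt in descriptions]
    tagged.sort(key=lambda rec: rec[0])
    # Prefix each line so the reader can tell dialogue from description.
    return [f"[{kind}] {txt}" for _, kind, txt in tagged]
```

Calling `merge_streams([(1.0, "Hello")], [(0.5, "A door opens")])` returns `["[DESC] A door opens", "[SUB] Hello"]`, with the description arriving first because its timestamp is earlier.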
Now another app gets to work. It sends the data out to people’s refreshable braille displays on demand. “This is done in real time, in less than a second,” García Crespo says. This lets a deaf-blind person “watch” TV as it is broadcast. The system will work with all types of refreshable braille displays, as long as there is a Bluetooth connection available.
Currently, the system is used only in Europe. Teams need to tweak the decoding process to work with the TV signals used by broadcasters in other regions. Still, it should soon be available in the United States.
The Dicapta Foundation in Winter Springs, Fla., has been working with García Crespo’s team and others to make that happen. They call their project GoCC4All. Apps for Google and Apple phones are just about ready, says Lourdes Fiallos. She’s a project manager at Dicapta. Testing with deaf-blind users should start in a few weeks.
García Crespo’s team also wants to create a “universal communicator” for deaf-blind people. It would let them communicate with anyone without the need to have a human assistant present.
Anindya “Bapin” Bhattacharyya is a technology-development and training specialist at the Helen Keller National Center for Deaf-Blind Youths and Adults. It’s in Sands Point, N.Y. Bapin is deaf-blind himself. And he says the new technology sounds like “a great development.”
Bapin does raise a few questions. “There needs to be a menu to allow me to select a channel or show that is captioned and also has audio/visual descriptions,” he points out.
Bapin also would like a way to skip an ad. People with sight and hearing can take a break when a commercial comes on. When they hear or see that the show has resumed, they can pay attention again. Deaf-blind people would like a similar signal to let them know when a show resumes, he says.
Technologies to assist people with disabilities “are fantastic and give deaf-blind people access to digital info and communication,” Bapin says. However, he notes, gaps remain. Examples include self-service machines at some stores and banks. Too often the developers forget to include accessibility features.
Inventing new technologies to boost accessibility takes work, as García Crespo’s team has learned. For instance, the TV system had to work in real time. Yet no one knew in advance which show someone might want to “watch.” To deal with that, the team has a different computer processor handle each TV channel’s signal. Then one server centrally manages all of them. It collects the processed subtitles and visual descriptions and then sends them to users on demand.
Getting the whole set-up to work was tricky, but García Crespo liked the challenge.
“I like to solve problems,” he says. “If the solutions are related to technology to improve people’s lives, I like those problems better.”
This is one in a series presenting news on technology and innovation, made possible with generous support from the Lemelson Foundation.