Spotlight TEDx Talk: Developers, design navigation apps with the blind in mind!

Geographer Megan Lawrence discusses how navigation systems could be more inclusive at TEDxIndianapolis (Photo: Denis Ryan Kelly Jr.)

How do you design maps for people with visual disabilities? With creative technology, says geographer Megan Lawrence. Every one of us (no matter our visual ability) needs to be able to pick up information about our spatial environment — and technology helps, Lawrence says in a talk at TEDxIndianapolis — whether that’s through a GPS “blue dot” telling you where you are in a city or wall sensors in an art museum beaming information about nearby paintings to your phone. Despite this, people with visual disabilities are often shut out of navigation technology, Lawrence says, because these apps and devices are not designed with them in mind.

“People who are blind or low vision do not lack spatial abilities,” Lawrence says, “… what they do lack is access to environmental representations.” The standard mobile apps for navigation — ones that help sighted people find new restaurants or navigate to unfamiliar neighborhoods — are not accessible to the blind and visually impaired, she says, explaining: “There’s a digital divide between those who do and those who do not have access to mainstream technology … The [apps] that [come] standard on your phone, for free, [are] made for a narrow group of people in our society. They’re not universally designed to be accessible to people with and without disabilities.”
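Part of what “universally designed” means in practice is that an app exposes its interface to the platform’s screen reader. A minimal sketch in Swift, assuming a UIKit app and Apple’s VoiceOver (the button and its labels are hypothetical):

```swift
import UIKit

// Hypothetical map control: without explicit accessibility metadata,
// VoiceOver may announce it as just "button", or skip it entirely.
let entranceButton = UIButton(type: .system)
entranceButton.isAccessibilityElement = true
entranceButton.accessibilityLabel = "Main entrance, Fifth Street side"
entranceButton.accessibilityHint = "Starts guidance to this entrance"
entranceButton.accessibilityTraits = .button
```

The specific API matters less than the habit: interface elements that exist only visually are precisely what shuts blind users out of otherwise standard apps.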

Lawrence knows that there are a lot of pain points in developing navigation technology for all abilities, but she believes that coming up with solutions to these pain points will make for better navigation tech overall. One hurdle? GPS failure. “We see a breakdown in GPS technology when we go from the macro scale of navigating city blocks — which GPS does pretty well — to the micro scale of finding places, entrances, exits, even specific rooms in buildings.” How do you find the entrance for a building if your app isn’t sure where you are? If you have low vision, how do you scan the street and read storefront signs?
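That macro-to-micro breakdown is visible in the data an app actually receives: every GPS fix comes with an uncertainty radius, and near buildings that radius is often wider than the entrance being sought. A sketch of how an app might detect this, assuming iOS Core Location (the 20-meter threshold is an illustrative guess, not a standard):

```swift
import CoreLocation

final class EntranceGuide: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // horizontalAccuracy is the uncertainty radius in meters;
        // a negative value means the fix is invalid.
        if fix.horizontalAccuracy < 0 || fix.horizontalAccuracy > 20 {
            // Macro scale only: good enough for city blocks, not doorways.
            // Fall back to other cues (beacons, asking the user to confirm landmarks).
        } else {
            // Micro scale is plausible: entrance-level guidance can proceed.
        }
    }
}
```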

There have been technologies designed to help the blind and low vision communities do this, says Lawrence, but they are niche. One, the Smith-Kettlewell Eye Research Institute’s Talking Signs project, relies on installing an infrared transmitter outside and having users carry around a special receiver with headphones. “When you point your handheld receiver in the direction that you are interested in, you hear [the names of the businesses sent by the transmitter],” Lawrence says. “[But] nobody wants to carry around this receiver. Blind and low vision people — along with people with and without disabilities — just want to use their smartphones.”

A smartphone picks up an iBeacon signal (Photo: Jonathan Nalder)

Which is why it’s important to design universal solutions, Lawrence says, ones that address the needs and wants of those with and without disabilities. One technology, iBeacon, is moving in this direction. iBeacons are small transmitters that use Bluetooth technology to send information to smartphones. “[iBeacon technology] is being used in the San Francisco airport to provide blind travelers with information about things that are in a terminal … like where to charge your phone.” But physical tech like iBeacon requires infrastructure, says Lawrence, and that isn’t easy to install and maintain.
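On the app side, listening for beacons takes only a few lines; the hard part is the physical deployment Lawrence describes. A minimal sketch using iOS Core Location (the UUID is a placeholder, and a real app also needs a location-usage string in its Info.plist):

```swift
import CoreLocation

final class BeaconListener: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID: a deployment listens for the UUID its beacons broadcast.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Beacons are reported nearest-first; major/minor values map to a
        // specific transmitter, e.g. "charging station near gate 12".
        guard let nearest = beacons.first else { return }
        print("Nearest beacon \(nearest.major).\(nearest.minor), proximity \(nearest.proximity.rawValue)")
    }
}
```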

Miami International Airport uses iBeacon technology developed by IT company SITA (Photo: SITA)

Lawrence believes that truly universal navigation tech has to focus on the user and enable them to explore their environment naturally, rather than binding them to a predetermined set of information transmitted by a beacon. This technology has to be flexible, reactive and movable, like the human visual system. This may be possible sooner than one might think, she says, thanks to advancing camera technology that can track motion, perceive depth and create 3D maps of our world. “All of a sudden, those really complicated environments that we couldn’t map, we can, and we can map it into fine detail, providing accessibility to the world that we have never seen before,” she says.
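Those capabilities are no longer exotic. As one illustration (a platform assumption on our part, not something named in the talk), Apple’s ARKit on LiDAR-equipped devices can fuse camera motion tracking with depth sensing into a live 3D mesh of the surroundings:

```swift
import ARKit

// Sketch: ask ARKit for a live 3D reconstruction of nearby geometry.
// Requires a LiDAR-equipped device; always check support first.
let session = ARSession()
let config = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh          // mesh of walls, floors, doorways
}
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    config.frameSemantics.insert(.sceneDepth)   // per-pixel depth each frame
}
session.run(config)
// The mesh then arrives as ARMeshAnchor updates via ARSessionDelegate:
// the kind of fine-detail indoor map Lawrence anticipates.
```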

However, this technology will only be as accessible as the software that operates it allows, Lawrence says. She worries that if developers do not design the apps and programs that control the technology of the future — the 3D cameras, the depth sensors — with people with and without disabilities in mind, this opportunity for unprecedented accessibility will go by the wayside.

Watch Lawrence’s whole talk to learn more:
