What do you get when you bring together the blind and visually-impaired community and transportation organizations to discuss the future of public transportation? Read below for some great insights from the #AccessibleOlli workshop.
By Erich Manser
Continuing our work of creating the world’s most accessible, self-driving vehicle, IBM, Local Motors and the CTA Foundation hosted an #AccessibleOlli workshop last March, which tapped the unique combination of the blind and visually-impaired community and transportation organizations in the greater Boston area.
The workshop's participants were at varying stages of vision loss, from mild impairment to complete blindness. Though most attendees were frequent riders of public transportation, a few indicated that they seldom used it but were excited by the prospect of what Olli could offer to enhance their lives. The workshop also included members of Massachusetts transit agencies: the Massachusetts Bay Transportation Authority (MBTA), MassDOT and the Mass Rehab Commission's Adaptive Auto Program.
As the workshop got underway, it became apparent that the group's familiarity with public transportation had equipped them with some thoughtful and practical ideas. We were surprised to receive input that went beyond blindness or visual impairment, such as methods of securing a wheelchair while in transit.
As one of the IBM project leads, and someone who is visually impaired myself, I was intrigued by the depth and innovation of the discussion. Some ideas raised were ones we hadn't yet heard, though we also noticed recurring themes from previous workshops:
- Voice-activated services are critical – from calling or ordering Olli to all interactions and information received while riding the vehicle.
- Personalized narration, including information on the surroundings, delivered over a smart device or headphones, with the ability to repeat information if needed.
- The vehicle should recognize the rider and enable a personalized, individual experience.
- Multi-modal delivery of information – voice, text, touch – based on individual preference.
- Security – emergency directions and navigation instructions should be delivered in multiple ways (voice, text, touch) so riders feel safe.
- Flexibility – the vehicle needs to minimize wasted time and respond to changes.
- Predictability and reliability – the experience must promote confidence.
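To make the multi-modal and personalization themes above concrete, here is a minimal sketch in Python. Every name in it is hypothetical – nothing here reflects Olli's actual software stack – but it shows the basic idea of rendering one announcement in each mode a rider prefers, so that critical information is delivered redundantly:

```python
from dataclasses import dataclass

# Hypothetical rider profile; field names are illustrative only.
@dataclass
class RiderProfile:
    name: str
    preferred_modes: list  # ordered preference, e.g. ["voice", "text", "touch"]

def deliver(message: str, profile: RiderProfile) -> list:
    """Render one announcement in every mode the rider prefers,
    so important information (e.g. emergency directions) reaches
    them through more than one channel."""
    renderers = {
        "voice": lambda m: f"[speak] {m}",
        "text":  lambda m: f"[display] {m}",
        "touch": lambda m: "[haptic] alert pulse",
    }
    return [renderers[mode](message)
            for mode in profile.preferred_modes
            if mode in renderers]

rider = RiderProfile(name="Erich", preferred_modes=["voice", "touch"])
outputs = deliver("Approaching your stop", rider)
```

In a real vehicle the renderers would drive a speech engine, an on-board display and haptic hardware, and the profile would come from the rider-recognition step the workshop participants described.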
The effort to make #AccessibleOlli the world's most accessible, self-driving vehicle is an exciting process indeed, and the momentum gained with each new workshop only reinforces the possibilities ahead.
Can you think of other ways of instrumenting a vehicle to make it more accessible to people of all abilities? Follow #AccessibleOlli on Twitter to join the discussion, or post your thoughts in the comments section below.
Erich Manser is a Software Engineer on the IBM Research team. Erich is fast becoming an industry thought leader for his work on research initiatives around accessibility. His involvement in IBM projects on eyes-free navigation, self-driving tech and Watson Personal Assistant is earning him a reputation as a researcher keen on using technology to empower humans of any ability.
Discover what you can do at IBM: http://ibm.co/jobs