The inability to operate a conventional motor vehicle has proven to be a significant barrier to personal mobility for the millions of blind and low-vision persons in the United States.
Research suggests that, beyond diminishing quality of life, limited mobility restricts access to employment opportunities, educational resources, and medical treatment for many visually impaired persons. Access to an accessible self-driving vehicle may therefore prove life-changing for a visually impaired person.
Despite the tremendous promise of this technology, advocates for persons with visual disabilities argue that these emerging vehicles are being designed in a manner that will render them largely inaccessible to persons with significant visual disabilities.
The rise of commercially available human quantification devices, such as smartwatches and wearable bands, presents an opportunity to ameliorate this issue. Within the context of self-driving vehicles, real-time access to analyses of galvanic skin response, neural electrical activity, blood pressure, or heart rate, for instance, might produce a more symbiotic relationship between the human operator and the machine while increasing the intelligence of the autonomous system.
For users with visual disabilities specifically, this type of user quantification-based or quantification-enhanced interaction may more readily support non-visual interaction with a self-driving vehicle, potentially surpassing the use of haptic or audio interfaces alone. Given the potential impact of self-driving vehicles for persons with visual disabilities, coupled with the rise of advanced quantification devices, there is a critical need to explore how human quantification may be used to facilitate non-visual human-vehicle interaction.
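As a purely illustrative sketch of what quantification-enhanced interaction could look like, the snippet below imagines a wearable's heart-rate stream modulating a vehicle's audio feedback. Every name, threshold, and message here is a hypothetical assumption for illustration, not a method or tool from this project.

```python
# Illustrative sketch only: a hypothetical pipeline in which a wearable's
# heart-rate stream drives a self-driving vehicle's spoken feedback.
# Sensor names, thresholds, and messages are all assumptions.

from dataclasses import dataclass
from statistics import mean

@dataclass
class HeartRateSample:
    timestamp_s: float
    bpm: float

def detect_elevated_stress(window: list,
                           baseline_bpm: float,
                           threshold_ratio: float = 1.25) -> bool:
    """Flag a window as 'stressed' when its mean heart rate exceeds
    the rider's baseline by the given ratio (a placeholder heuristic)."""
    if not window:
        return False
    return mean(s.bpm for s in window) > baseline_bpm * threshold_ratio

def audio_feedback(stressed: bool) -> str:
    """Choose a spoken message for a non-visual (audio) interface."""
    if stressed:
        return "Elevated heart rate detected. The vehicle will slow down."
    return "All readings normal."

# Usage: a short simulated window of heart-rate samples.
window = [HeartRateSample(t, bpm) for t, bpm in
          [(0.0, 96.0), (1.0, 101.0), (2.0, 104.0)]]
stressed = detect_elevated_stress(window, baseline_bpm=72.0)
print(audio_feedback(stressed))
```

In a real system the window would come from a device SDK and the heuristic would be replaced by the project's empirically derived relationship between user states and experience factors; the sketch only shows the shape of the sensing-to-feedback loop.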
The goals of this project are to:
1) Gain a foundational understanding of the factors that influence the visually impaired user's experience with an autonomous vehicle,
2) Understand how human quantification may be realized in an automotive context for visually impaired users with advanced sensing technologies,
3) Explore the relationship between user experience factors and measures of human quantification for visually impaired users,
4) Develop tools that capitalize on the hypothesized relationship between visually impaired users' experience factors and their quantified states,
5) Evaluate the derived tools under laboratory conditions to measure their impact on the subjective experience of visually impaired self-driving vehicle users.
Project Date: Jan. 1, 2019 - Present
Keywords: accessibility, self-driving vehicles, bci, wearables