Visually impaired individuals are a growing segment of our population. However, society is not always designed with this group in mind, making the development of better electronic accessibility tools essential to meeting their daily needs. Traditionally, such assistive tools came in the form of expensive, specialized, and at times heavy devices that visually impaired individuals had to look after and carry around independently. The past few years have witnessed exponential growth in the computing and onboard sensing capabilities of mobile phones, making them an ideal platform for building powerful and diverse applications. We believe that the mobile phone can consolidate the main functions of the various specialized assistive devices into a single device by enabling the rapid and inexpensive development of simple and ubiquitous assistive applications. This thesis describes the design, implementation, evaluation, and user-study-based analysis of four mobile applications, each of which helps visually impaired people overcome an everyday accessibility barrier.
Our first system is a simple-to-operate mobile navigational guide that helps visually impaired users retrace paths they have walked once in any indoor environment. The system lets the user construct a virtual topological map across points of interest within a building by recording the user's trajectory and walking speed from Wi-Fi and accelerometer readings. The user can subsequently use the map to navigate previously traveled routes without any sighted assistance. Our second system, Mobile Brailler, presents several prototype methods of text entry on a modern touch-screen mobile phone that are based on the Braille alphabet and are thus convenient for visually impaired users. Our third system enables visually impaired users to leverage the camera of a mobile device to accurately recognize currency bills, even when the images are partially occluded or highly distorted. The final system enables visually impaired users to determine whether a pair of clothing items, in this case a tie and a shirt, can be worn together, based on current social norms of color matching.
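To illustrate the idea behind Braille-based touch entry, the sketch below decodes a simultaneous multi-finger chord (a set of pressed dot positions) into a character using the standard six-dot grade-1 Braille patterns for a–j. This is a minimal illustration of the general technique, not the thesis's Mobile Brailler implementation; the function and table names are hypothetical.

```python
# Standard six-dot Braille cell: dots 1-3 run down the left column,
# dots 4-6 down the right. Grade-1 patterns for the letters a-j.
BRAILLE_TO_CHAR = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    frozenset({1, 2, 4}): "f",
    frozenset({1, 2, 4, 5}): "g",
    frozenset({1, 2, 5}): "h",
    frozenset({2, 4}): "i",
    frozenset({2, 4, 5}): "j",
}

def decode_chord(dots):
    """Translate one multi-touch chord (an iterable of dot numbers)
    into the character it encodes, or None if the pattern is unknown."""
    return BRAILLE_TO_CHAR.get(frozenset(dots))
```

In a touch-screen entry method of this kind, each screen region corresponds to one dot, and the set of regions touched at once forms the chord passed to the decoder.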
Committee: Fergus, Rob; Overton, Michael; Perlin, Ken; Shasha, Dennis; Subramanian, Lakshminarayanan
School: New York University
School Location: United States -- New York
Source: DAI-B 74/01(E), Dissertation Abstracts International
Keywords: Assistive technology, Image recognition, Location tracking, Mobile phones, Text entry systems, Visually impaired
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved