According to the World Health Organization, approximately
285 million people worldwide are blind or visually
impaired (BVI): 39 million are blind and 246 million have
moderate to severe visual impairment, with one person losing
their vision each minute. For people without usable vision,
access to effective assistive technology – to enable independent
pursuit of activities of daily living, education, and employment – is critical.
Our team is developing wearable devices for blind and low-vision people. These devices combine sensing, computation and interaction to provide the wearer with timely, task-appropriate information about the surroundings – the kind of information that sighted people get from their visual systems, typically without conscious effort.
Safe mobility is the ability to move where one wishes, safely, efficiently and independently. To ensure safety, BVI people typically employ long canes or service animals (sacrificing efficiency) or accept help from sighted others (sacrificing independence and privacy). What is needed is a means for independent awareness of obstacles, drop-offs, ascents, descents, etc. in the user’s path, even when the objects or hazards are small, complex or otherwise difficult to detect and characterize.
Environmental text provides information and guides people in many task domains. Examples outdoors include house numbers and traffic and informational signage; indoor text arises in building directories, aisle guidance signs, office numbers, and nameplates. We wish to develop machine perception systems, for use by blind or visually impaired people, that can efficiently and effectively detect and decode text from sensor observations of the surroundings.
Existing wearables provide access to notifications and voice-driven Internet queries, and provide insights into our daily activity and sleep levels. Yet they remain oblivious to our social interactions. We seek to develop wearables that can detect, characterize and assist with social interactions. Such abilities could be used to suppress inopportune distractions, could form the basis for ‘social fitness’ apps that automatically log and summarize characteristics of users’ social interactions (e.g. typical duration, time, variety), and could provide BVI users with improved awareness of others’ arrivals, departures, and gaze direction.
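To illustrate the kind of summary a ‘social fitness’ app might produce, here is a minimal hypothetical sketch. The record format, names, and statistics chosen are illustrative assumptions, not part of any existing system; it simply aggregates logged interactions into the characteristics mentioned above (typical duration, time, partner variety):

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Interaction:
    partner: str         # anonymized identifier for the other person (assumed format)
    start_hour: int      # hour of day the interaction began, 0-23
    duration_min: float  # length of the interaction in minutes

def summarize(log):
    """Aggregate a list of Interaction records into simple summary statistics."""
    if not log:
        return {"count": 0}
    return {
        "count": len(log),
        "mean_duration_min": round(mean(i.duration_min for i in log), 1),
        "distinct_partners": len({i.partner for i in log}),
        # hour of day with the most interaction starts
        "busiest_hour": max((i.start_hour for i in log),
                            key=lambda h: sum(1 for i in log if i.start_hour == h)),
    }

log = [
    Interaction("A", 9, 12.0),
    Interaction("B", 9, 5.0),
    Interaction("A", 14, 20.5),
]
print(summarize(log))
# → {'count': 3, 'mean_duration_min': 12.5, 'distinct_partners': 2, 'busiest_hour': 9}
```

A real system would of course populate such a log from on-device sensing rather than hand-entered records, and would need careful attention to privacy.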
We are developing a new tactile mapping framework, using tactile displays for the blind, that codes information intuitively so that any tactile interface can best convey educational and other graphical materials interactively. Key challenges of this work are to create rich, readily interpreted tactile outputs, and to optimally code information via these diverse tactile signals.
The Wearable Haptic Array (WHA) project aims to develop low-cost, open-source wearables that use haptic feedback to deliver rich information to human users. As a generic method for information delivery, WHA devices act as an independent and discreet channel for diverse applications such as communication, games, and assistive technology for visually or hearing impaired people.
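To make the idea of a generic haptic information channel concrete, the following hypothetical sketch (the 8-actuator layout, the letter-to-pattern encoding, and all names are illustrative assumptions, not the WHA design) maps each letter to an on/off pattern across an actuator array, turning a message into a sequence of actuator frames:

```python
# Hypothetical 8-actuator array: each frame is a tuple of 0/1 activations,
# one entry per actuator. As a simple illustrative code, a letter's position
# in the alphabet is rendered as its 8-bit binary pattern.
NUM_ACTUATORS = 8

def letter_to_frame(ch):
    """Map 'a'-'z' to an 8-bit on/off pattern across the actuator array."""
    index = ord(ch.lower()) - ord('a') + 1        # 'a' -> 1, ..., 'z' -> 26
    bits = format(index, f"0{NUM_ACTUATORS}b")    # zero-padded binary string
    return tuple(int(b) for b in bits)

def encode_message(text):
    """Turn a message into a sequence of actuator frames (letters only)."""
    return [letter_to_frame(c) for c in text if c.isalpha()]

frames = encode_message("hi")
# 'h' is the 8th letter -> 00001000; 'i' is the 9th -> 00001001
```

A deployed encoding would instead be chosen for human learnability and for the perceptual resolution of the skin, but the pipeline shape (symbols to actuator frames to vibration) is the same.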