Accessible interactive graphics for visually impaired users
Mar 15th, 14:00
Sorbonne University, towers 65/66 – room 304 (3rd floor)
STEM is an acronym for science, technology, engineering, and mathematics. These four disciplines rely heavily on graphical representations and are considered central in technologically advanced societies. Graphics are inherently visual and therefore inaccessible to visually impaired people (approximately 5% of the world population). This has important consequences for education, social inclusion, and quality of life.
Raised-line maps are the most common tool for providing access to tactile graphics, but they have numerous limitations (cost, limited number of elements displayed, required knowledge of Braille, etc.). A few research projects have aimed to overcome these limitations by designing interactive systems for accessing digital images [see 1 for a review]. Building on this previous work, we developed a set of devices based on tactile exploration that allow non-visual access to images. The interactive audio-tactile device called Mappie allows access to multiple levels of information. It is based on a raised-line map overlay placed over a tactile surface. In addition to map exploration, it provides advanced interaction functions (e.g. learning routes). We showed that it is effective for acquiring spatial concepts and is more usable than a regular raised-line map. It is currently used by low-vision professionals, and commercialization is planned. We have also designed a device that allows visually impaired users to build and explore tangible representations of digital graphics. It is based on tangible objects that represent important elements on a map and can be linked to each other to create interactive lines and areas. Adapted non-visual guidance assists users in placing and linking objects to build new graphical representations. We have shown that this device is usable by visually impaired users to build and explore graphics of various complexities. More recently, we designed a device based on a smartwatch and filtering functions that allows visually impaired users to explore virtual maps while in mobility.
Dr. Christophe Jouffrais is with the IRIT Lab (UMR5505, CNRS & University of Toulouse) in Toulouse, France. He recently joined the IPAL research lab in Singapore. He is a senior CNRS researcher with a background in Cognitive Science. He holds a European PhD (2000) in cognitive neuroscience from the University of Lyon, France, and the University of Fribourg, Switzerland. His current research focuses on non-visual spatial perception, action, and cognition in visually impaired humans, with an emphasis on non-visual human-computer interaction and assistive technologies. His ongoing research projects aim at designing technologies that help visually impaired users understand and interact with maps.