DS0404 - Innovation biomédicale

Open Data maps: interactive, collaborative, and accessible for visually impaired people – AccessiMap

Submission summary

AccessiMap is a multidisciplinary and innovative project combining fundamental and applied research. Its goal is to improve access to maps for visually impaired (VI) people by designing suitable non-visual interactions based on Open Data (e.g. OpenStreetMap). The project addresses different situations (training, home, mobility) through a diversity of surfaces and interactions. Specifically, we propose to design and evaluate a prototype of an interactive collaborative table (1 m) that enables VI people to explore maps, but also to collaborate with other VI and/or sighted persons. This device could be used independently, but also during geography or locomotion training sessions with teachers from the special education centers for the visually impaired (CESDV). Using a participatory design method that includes both users and specialized teachers, we will: 1) design a semi-automatic online adaptation of map content (including the aggregation of other types of content, e.g. cultural); 2) design and evaluate a prototype of a collaborative table incorporating new forms of haptic, tangible, and multimedia interaction; and 3) design and evaluate non-visual interactions to explore the data on all types of surfaces (tablets, smartphones, etc.).
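To illustrate what a semi-automatic adaptation of open map content could look like, here is a minimal sketch that filters OpenStreetMap-style tagged features into simplified layers suitable for non-visual (audio or tactile) rendering. The feature names, tag choices, and layer names are illustrative assumptions, not the project's actual pipeline.

```python
# A few OSM-like features: each has a name and a dict of tags.
FEATURES = [
    {"name": "Rue de Metz", "tags": {"highway": "primary"}},
    {"name": "Jardin des Plantes", "tags": {"leisure": "park"}},
    {"name": "Pont Neuf", "tags": {"bridge": "yes", "highway": "secondary"}},
    {"name": "Boulangerie", "tags": {"shop": "bakery"}},
]

# Map tag keys to the (hypothetical) non-visual layer they belong to.
LAYER_RULES = {
    "highway": "streets",            # e.g. rendered as raised lines
    "leisure": "landmarks",          # e.g. audio-labelled areas
    "shop": "points_of_interest",    # e.g. audio-labelled points
}

def adapt(features):
    """Assign each feature to the first matching non-visual layer."""
    layers = {}
    for feat in features:
        for key, layer in LAYER_RULES.items():
            if key in feat["tags"]:
                layers.setdefault(layer, []).append(feat["name"])
                break  # one layer per feature keeps the rendering simple
    return layers

layers = adapt(FEATURES)
```

Here `adapt` drops every tag that has no rule, which is the "semi-automatic" part: a specialist would then review and label the resulting layers.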

We will also develop smartphone apps, as many visually impaired people already use smartphones or GPS devices adapted to their needs and impairments. We will examine the role of these personal devices in our different usage scenarios, in particular for exploring geographic data. For example, data recorded during a trip can be augmented with online data, enabling VI people to “replay” their own paths. With these devices, they can improve their overall geographical knowledge, but also better understand their own trips. We will also use personal devices as a privileged interaction support, e.g. by displaying a menu of commands for the collaborative table. In addition, our approach will include other types of geo-referenced content (socio-economic, cultural, artistic), as well as scientific content (such as charts). This will have a positive impact on VI people’s autonomy and quality of life.
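The trip “replay” idea above can be sketched as matching a recorded GPS trace against open geo-data, so that the user can hear which landmarks the path passed. The points of interest, coordinates, and matching radius below are illustrative assumptions only.

```python
import math

# Tiny inline stand-in for open geo-data (name, latitude, longitude).
POIS = [
    ("Capitole", 43.6045, 1.4440),
    ("Gare Matabiau", 43.6114, 1.4537),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def annotate_trip(trace, pois, radius_m=150.0):
    """For each trace point, list the POI names within radius_m metres."""
    return [
        [name for name, plat, plon in pois
         if haversine_m(lat, lon, plat, plon) <= radius_m]
        for lat, lon in trace
    ]

# A recorded trace of two GPS points: the first passes near the Capitole.
trace = [(43.6046, 1.4442), (43.6080, 1.4490)]
annotations = annotate_trip(trace, POIS)
```

Each annotation list could then be spoken aloud at the corresponding step of the replayed path.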

From a fundamental point of view, we will advance HCI in the field of non-visual, tangible, and multimedia interaction, as well as in the co-design of assistive technologies. We will also make advances in Cognitive Psychology, in the fields of non-visual (haptic- or sound-based) perception and cognitive mapping (the mental representation of space). These scientific goals are accompanied by a small company’s ambition to expand its R&D in the field of open geographic data and to produce a set of applications and services for VI people, their families, and specialized teachers. Finally, a set of procedures and tools will be deployed in the CESDV network to change practices by integrating new technologies.
The consortium brings together a research team specializing in Human-Computer Interaction and Assistive Technologies for VI people (IRIT-ELIPSE), a team of social scientists specializing in mobility and in design practices related to new digital objects (Telecom ParisTech, Co-Design Lab), an open-source software company in Toulouse specializing in the processing of complex geographic data (Makina Corpus), and a specialized education center (CESDV-IJA).

Project coordination

Christophe Jouffrais (Institut de Recherche en Informatique de Toulouse)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for it.


Project partners

Télécom ParisTech, LTCI, Institut Mines-Télécom
IRIT, Institut de Recherche en Informatique de Toulouse

ANR grant: 749,987 euros
Beginning and duration of the scientific project: October 2014 - 48 months
