Auditory space and multisensory attention in cochlear implant patients – VIRTUALHEARING3D
Cochlear implants (CI) are electronic devices that provide partial hearing to individuals who are profoundly deaf or severely hard of hearing. They substitute for the damaged receptors of the cochlea, transducing sounds from the environment into electrical impulses for the brain. Thirty-one years after the first approval of CI use in humans, more than 320,000 people worldwide have received a CI (www.nidcd.nih.gov) and, in France, approximately 7,000 people have a CI (www.cochleefrance.fr). Despite their overall clinical success, these biomedical devices remain incomplete approximations of the full auditory experience of a hearing person. First and foremost, current generations of CI provide only limited access to natural sound recognition, music perception, and spatial hearing. Localisation of sounds in the environment and perception of the 3D structure of the auditory scene are central to our ability to discern signals from noise and to orient auditory attention in space. Although deficits in these skills are well documented in CI patients, rehabilitation after CI surgery currently focuses on linguistic competences and does not aim to improve spatial hearing abilities in these patients. Recent clinical advances in CI surgery that promote binaural hearing, together with the approval of CI use in people with single-sided deafness (SSD), make the development of spatial-hearing rehabilitation particularly pressing and timely. Moreover, experimental approaches that take advantage of multisensory interactions to promote learning suggest that rehabilitation of spatial hearing and spatial attention in CI patients may indeed be feasible.
The overall goal of the present project is to develop and validate a multisensory approach for the study and rehabilitation of spatial hearing in CI patients. Specifically, we aim to (1) establish a new virtual-reality-based paradigm for measuring spatial hearing in 3D space; and (2) develop virtual-reality audio-visual stimulation for promoting multisensory learning and attention.
Project coordination
Francesco Pavani (Lyon Neuroscience Research Center - Integrative Multisensory Perception Action & Cognition Team - Unité Inserm 1028)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines all responsibility for it.
Partners
CHUT - CHU Toulouse
HCL - CHU-HEH - Lyon Civil Hospitals, Centre Hospitalier Universitaire Edouard Herriot
CNRS - CERCO (UMR 5549) - Centre National de la Recherche Scientifique
IMPACT - Inserm U1028 - Lyon Neuroscience Research Center, Integrative Multisensory Perception Action & Cognition Team
ANR grant: 414,971 euros
Beginning and duration of the scientific project: September 2016 - 48 months