CE37 - Neurosciences intégratives et cognitives

Processing of naturalistic motion in early vision – ShootingStar

Submission summary

The natural visual environments in which we have evolved have shaped and constrained the neural mechanisms of vision. Rapid progress has been made in recent years in understanding how the retina and visual cortex are specifically adapted to processing natural scenes. However, studies in this research tradition have mainly addressed the processing of natural images in the spatial domain. Although the processing of the temporal properties of visual stimuli is just as important as that of spatial properties, stimuli with naturalistically valid temporal dynamics have not been sufficiently investigated. While the objects and creatures we view undergo a variety of intrinsic movements, probably the most common motions on the retina are image shifts due to our own eye movements: in free viewing in humans, ocular saccades occur about three times every second, shifting the retinal image at speeds of 100-500 degrees of visual angle per second. How these very fast shifts are suppressed, leading to clear, accurate and stable representations of the visual scene, is a fundamental unsolved problem in visual neuroscience known as saccadic suppression. One reason why this problem is difficult is technological: to make progress we need to visually simulate these fast retinal shifts, but computer displays have been too slow to produce adequate simulations. A rough illustration of this bottleneck is sketched below.
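The short Python sketch that follows is a hypothetical back-of-the-envelope calculation, not part of the project: it computes how far a simulated saccadic image sweep jumps between successive video frames at different display refresh rates. The saccade speeds come from the figures above; the refresh rates, including the 1440 Hz "ultrafast" value, are assumed for illustration.

    # Hypothetical illustration: per-frame displacement of a simulated saccadic
    # image sweep at different display refresh rates. Saccade speeds (100-500
    # deg/s) are taken from the summary; the refresh rates are assumed values,
    # with 1440 Hz standing in for an ultrafast display.

    saccade_speeds_deg_per_s = [100, 300, 500]   # retinal shift speeds during saccades
    refresh_rates_hz = [60, 120, 1440]           # conventional vs. assumed ultrafast displays

    for rate_hz in refresh_rates_hz:
        frame_duration_s = 1.0 / rate_hz
        for speed in saccade_speeds_deg_per_s:
            step_deg = speed * frame_duration_s  # image jump between consecutive frames
            print(f"{rate_hz:5d} Hz, {speed:3d} deg/s -> {step_deg:5.2f} deg per frame")

Under these assumptions, at 60 Hz a 500 deg/s sweep jumps over 8 degrees of visual angle from one frame to the next, far too coarse to pass for continuous motion, whereas at kilohertz refresh rates the per-frame step falls well below one degree.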

In this project we propose a unique convergence between neurophysiology, modeling and psychophysics, aided by recent technological developments. Some of the partners have been at the forefront of recent work leading to the realization that moving stimuli produce traveling waves of activity in primary visual cortex, propagating at speeds similar to those produced by saccades. Other partners have developed detailed models of the retina and primary visual cortex, based on multielectrode recordings from the retina and optical imaging of the cortex, that account for these wave phenomena. Finally, another partner recently made psychophysical observations (aided by new, ultrafast computer displays that allow us to realistically simulate saccadic dynamics on a static retina) showing how image dynamics alone can account for saccadic suppression phenomena. The main hypothesis that we will test in this project is that cortical waves, driven by horizontal connections, are the physiological substrate of these suppression phenomena. If this hypothesis is confirmed, we will have solved the age-old problem of how vision remains stable despite eye movements, by invoking an elegant and well-documented physiological mechanism.
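To make the speed comparison behind this hypothesis concrete, the sketch below uses assumed, order-of-magnitude numbers (not taken from the project) to map a saccadic image sweep onto the cortical surface through a cortical magnification factor, and compares the result with a nominal propagation speed along horizontal connections.

    # Hypothetical order-of-magnitude comparison. All parameter values below are
    # assumptions chosen for illustration, not measurements from this project.

    saccade_speed_deg_per_s = 300.0    # mid-range retinal sweep speed (summary: 100-500 deg/s)
    magnification_mm_per_deg = 1.0     # assumed V1 cortical magnification outside the fovea
    lateral_wave_speed_m_per_s = 0.3   # assumed propagation speed along horizontal connections

    # Retinal speed (deg/s) times magnification (mm/deg) gives the speed at which
    # the stimulus representation sweeps across the cortical surface (mm/s).
    cortical_sweep_m_per_s = saccade_speed_deg_per_s * magnification_mm_per_deg * 1e-3

    print(f"saccade-driven cortical sweep ~ {cortical_sweep_m_per_s:.2f} m/s")
    print(f"horizontal-connection waves   ~ {lateral_wave_speed_m_per_s:.2f} m/s")

Under these assumed values the two speeds fall in the same range, which is the sense in which the summary describes the cortical waves as propagating at speeds similar to those produced by saccades.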

We expect that the convergence of these three research currents and methodologies will lead to rapid progress in understanding how the visual system is adapted to naturalistic dynamics. The psychophysical observations will provide new leads and targets for the neurophysiology and modeling, which in turn may provide detailed neural explanations for the psychophysics. We predict that the neural architectures uncovered in the retina and the primary visual cortex will prove most effective when processing the naturalistic, fast stimuli that arise as a consequence of eye movements.

Project coordination

Mark Wexler (CENTRE NEUROSCIENCES INTEGRATIVES ET COGNITION)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its contents.

Partners

INT Institut de Neurosciences de la Timone
Neuro-PSI Institut des Neurosciences Paris Saclay
INCC CENTRE NEUROSCIENCES INTEGRATIVES ET COGNITION
Inria Centre de Recherche Inria Sophia Antipolis - Méditerranée
IdV Institut de la vision

ANR grant: 641,017 euros
Beginning and duration of the scientific project: December 2020 - 48 months
