CE37 - Integrative and cognitive neuroscience

Sensory and cognitive interactions in the cortex of the non-human primate – SensCogI

Submission summary

Interacting with our environment requires us to constantly analyze sensory information (from the outside world) according to our internal cognitive states. For example, while searching for a friend in a crowd, we preferentially analyze, through a mechanism called selective attention, people of similar shape, size, or hair color. This is possible because of our ability to build an internal model of our friend and to store it in working memory. Our cognitive states (visual attention) therefore interact with the cortical systems that extract visual information, allowing us to compare visual information (what we are looking at) with cognitive information stored in working memory (what we are looking for).

Solving such a task engages several cortical and sub-cortical areas. For example, the prefrontal cortex is strongly involved in the control of cognitive functions such as the encoding of working memory or the orienting of visual attention, while the occipital cortex is organized as a hierarchy of visual areas in charge of extracting basic visual features from visual stimulation. In a series of scientific articles (Ibos & Freedman 2014, 2016, 2017; Freedman & Ibos 2018), I have recently proposed a new theoretical framework in which the posterior parietal cortex (lateral intraparietal area, LIP) acts as an interface between the prefrontal and occipital networks and plays a fundamental role in comparing sensory and cognitive representations. In this model, LIP integrates bottom-up sensory information from the visual cortex (previously gated by top-down attention) with top-down signals from the prefrontal cortex carrying the identity of the stimulus held in working memory, and compares the two. This framework raises several questions.

First, we still do not understand how the PFC (i) dynamically encodes cognitive information and (ii) appropriately targets and modulates the populations of visual cortex neurons that are tuned to the behaviorally relevant features. Second, in this framework, LIP acts as a receiver of attentional modulations, whereas it has traditionally been described as an emitter of such modulations; it is therefore necessary to directly test the role of LIP in the control of selective attention. Finally, it is important to define (i) how LIP integrates bottom-up and top-down signals and (ii) the computations by which it compares them.
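As a purely illustrative aid, the hypothesized comparison can be sketched as a toy computation: bottom-up feature responses are gated by a top-down attentional gain, then matched against a working-memory template. Everything in the minimal Python sketch below (variable names, the gain value, the use of cosine similarity as the comparison) is an assumption chosen for illustration, not the project's actual model.

import numpy as np

# Toy sketch of the hypothesized LIP computation. All names and values
# here (the 1.5x attention gain, cosine similarity) are illustrative
# assumptions, not the project's actual model.
rng = np.random.default_rng(0)

n_features = 8                        # e.g. 8 color/direction channels
stimulus = rng.random(n_features)     # bottom-up drive from visual cortex
template = np.zeros(n_features)       # top-down working-memory template (PFC)
template[3] = 1.0                     # the behaviorally relevant feature

# Feature-based attention: a top-down gain boosts the sensory channels
# tuned to the remembered feature before the signal reaches LIP.
gated = (1.0 + 0.5 * template) * stimulus

# LIP comparison: similarity between the gated sensory evidence and the
# memory template, here a normalized dot product (cosine similarity).
match = gated @ template / (np.linalg.norm(gated) * np.linalg.norm(template))

print(f"match signal: {match:.3f}")   # larger when the stimulus matches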

To answer these questions, we will simultaneously record the activity of PFC, LIP, and V4 neurons during a behavioral task that allows us to parametrically control (i) the features of the visual stimuli, (ii) the information stored in working memory, and (iii) the information used for decision making. In a second phase, we will combine this multi-site, multi-unit recording approach with causal manipulation by reversibly inactivating LIP, allowing us to test the specific role of LIP in the integration of visual information and in the control of visual attention. This novel approach will show how three major cognitive mechanisms are supported by the interaction of three of the most studied cortical areas, helping us to better link cognitive and cerebral functions and to better interpret deficits related to cortical lesions.

Project coordination

Guilhem Ibos (Centre National de la Recherche Scientifique Délégation Provence et Corse_INT)

The project coordinator is the author of this summary and is responsible for its content. The ANR accepts no responsibility for it.

Partner

CNRS DR12_INT Centre National de la Recherche Scientifique Délégation Provence et Corse_INT

ANR grant: 325,084 euros
Beginning and duration of the scientific project: September 2019 - 48 months
