CE30 - Physique de la matière condensée 2025

Energetics of neural computation – ENACT

Submission summary

Despite the unsustainable growth of energy consumption by Artificial Intelligence models, and despite the recognition that metabolic constraints have played a major role in brain evolution, the relation between computation and energy remains insufficiently studied and understood. The central hypothesis of the ENACT project is that inspiration can be drawn from a 'real' memory system, the brain memory center of the Drosophila fruit fly, shaped by millions of years of evolution in energy-constrained environments. Our project will therefore combine computational approaches based on statistical physics and machine learning with experimental measurements of the interplay between memory dynamics and metabolism in fruit flies, in order to answer two fundamental questions: How is energy allocated and spent in an energy-efficient learning network? What strategies can make neural network performance resilient to changes in energy availability?
The project is structured along three main axes. First, we will extract relationships between performance and energy consumption for simple neural networks capable of plasticity. Using theoretical tools from statistical physics and field theory, we will study how the learning dynamics of these model networks are affected by energy restrictions, and characterize the tradeoffs between performance, learning time, and overall consumption. These abstract models will then be informed by measurements of the relationship between the consumption rate and the amplitude of plastic changes, as well as of the spatially inhomogeneous distribution of energy fluxes across the neuronal assembly involved in memory formation in the fly brain.
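To make the kind of tradeoff described above concrete, the following minimal toy model (an illustration only, not the project's actual formulation) couples a simple plasticity rule to a hypothetical metabolic cost proportional to the amplitude of the synaptic changes, so that faster learning directly implies higher consumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "network" learning y = w_true . x by an LMS plasticity rule.
# We charge a metabolic cost proportional to |dw|, the amplitude of the
# plastic change at each step (an assumption made for illustration).
w_true = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
eta = 0.1              # learning rate
cost_per_unit = 1.0    # hypothetical energy cost per unit of synaptic change
total_energy = 0.0

for step in range(200):
    x = rng.normal(size=3)       # random input pattern
    err = (w - w_true) @ x       # prediction error on this pattern
    dw = -eta * err * x          # plastic change (LMS / delta rule)
    w += dw
    total_energy += cost_per_unit * np.sum(np.abs(dw))

final_loss = float(np.sum((w - w_true) ** 2))
print(final_loss, total_energy)
```

Varying `eta` in this sketch traces out a performance-versus-consumption curve: larger plastic changes speed up learning but raise the cumulative energy bill, which is the qualitative tradeoff the first axis aims to characterize rigorously.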
Second, we will study the multiple coupled states of the neural and metabolic systems in the fruit fly, and investigate how this complex organization maintains performance and adapts to fluctuating energy availability. These experimental investigations will be complemented by detailed computational modeling of the entire learning center, which will in turn allow us to make predictions about the kinetics and performance of memory formation under various environmental conditions.
Finally, we will exploit our enhanced understanding of the relationship between energy and computation to examine how neural networks could be optimally designed to minimize their consumption while guaranteeing resilience to temporary energy scarcity. This objective will be pursued using meta-learning techniques, also called 'learning to learn': we will train not only neural networks to achieve the desired tasks, but also their learning dynamics, to ensure that the final networks are high-performing, energy-efficient, and robust. The outcome will be compared to the architecture of the metabolic states in flies. In addition, we will test our optimized learning rules in silico, using performance monitoring counters to estimate the energy gains obtained for representative computational tasks.
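The 'learning to learn' idea can be caricatured in a few lines: an inner loop trains a learner with a given learning rule, and an outer (meta) loop selects the rule that best balances task performance against an assumed energy cost. The sketch below (illustrative only; the energy model, the objective weighting, and the restriction of the "rule" to a single learning rate are all simplifying assumptions) uses a plain search over candidate rates as the outer loop:

```python
import numpy as np

def inner_train(eta, energy_weight=0.05, steps=100, seed=1):
    """Inner loop: train a toy linear learner with learning rate `eta`.

    Returns the meta-objective: final task loss plus a weighted
    (assumed) energy cost, taken proportional to the total amplitude
    of plastic changes accumulated during learning.
    """
    rng = np.random.default_rng(seed)
    w_true = np.array([1.0, -1.0])
    w = np.zeros(2)
    energy = 0.0
    for _ in range(steps):
        x = rng.normal(size=2)
        err = (w - w_true) @ x
        dw = -eta * err * x          # plastic change (delta rule)
        w += dw
        energy += np.sum(np.abs(dw))
    loss = np.sum((w - w_true) ** 2)
    return loss + energy_weight * energy

# Outer ("meta") loop: choose the learning rule that minimizes the
# combined performance + energy objective.
candidate_etas = [0.01, 0.05, 0.1, 0.3]
best_eta = min(candidate_etas, key=inner_train)
print(best_eta)
```

In the project's setting the outer loop would optimize far richer objects (full learning dynamics rather than one scalar rate, with robustness to energy scarcity in the objective), but the nested structure is the same.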
ENACT thus aims to foster the emergence of a new and timely research field by embracing the multiple challenges associated with energy and learning.

Project coordination

Rémi Monasson (Laboratoire de physique de l'ENS)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.

Partnership

LPENS Laboratoire de physique de l'ENS
PDC Plasticité du cerveau

ANR grant: 487,025 euros
Beginning and duration of the scientific project: February 2026 - 36 months
