CE25 - Software sciences and engineering - Multi-purpose communication networks, high-performance infrastructures

Profitable Pruning of Neural Networks – ProPruNN

Submission summary

Deep Neural Networks (DNNs) have become the state of the art in fields such as image classification, object detection, and machine translation, among others. However, this comes at the cost of increased complexity: more parameters, more computations, and higher energy consumption. DNN pruning is an effective way to reduce this complexity and obtain high-performance, low-energy DNN implementations for embedded systems. ProPruNN proposes to precisely study the impact of structured pruning. This exploration will be carried out by co-designing hardware architectures capable of taking advantage of this pruning. The first objective is to clearly identify the real impact of structured pruning on the performance of networks implemented on FPGAs. Indeed, in the literature, this impact is underestimated, because only a fraction of the prunable parameters are actually pruned. The second objective is to design predictive models of this impact and incorporate them into network training, so that throughput, latency, and energy efficiency are optimized during the training itself.
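For illustration only, the sketch below shows what structured (channel) pruning looks like in practice, using PyTorch's built-in pruning utilities; it is not the project's method, and the layer sizes and the 30% pruning ratio are arbitrary assumptions. It also illustrates the point made above: the pruning is expressed as a mask, so the zeroed channels are still stored and computed unless the hardware or the compiler exploits the structure.

```python
# Minimal sketch of structured (channel) pruning with PyTorch.
# Illustrative only: layer sizes and the 30% ratio are arbitrary choices.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, padding=1)

# Zero out 30% of the output channels (dim=0) with the smallest L1 norm.
prune.ln_structured(conv, name="weight", amount=0.3, n=1, dim=0)

# The weight tensor keeps its shape: pruned channels are merely masked to zero,
# which is why the real gain on an FPGA depends on hardware able to skip them.
zeroed = (conv.weight.abs().sum(dim=(1, 2, 3)) == 0).sum().item()
print(f"{zeroed}/{conv.out_channels} output channels zeroed")

# Fold the mask into the weight tensor permanently.
prune.remove(conv, "weight")
```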

Project coordination

Mathieu LEONARDON (Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines all responsibility for its contents.

Partner

LAB-STICC Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire

ANR grant: 252,662 euros
Beginning and duration of the scientific project: March 2023 - 42 months
