CE23 - Artificial Intelligence

Rethinking archive PostProduction with LEarning, vAriational, and Patch-based methods – PostProdLEAP

Submission summary

The goal of the PostProdLEAP project is to develop new tools for video archive post-production by combining recent deep learning approaches with patch-based and variational methods. Frequent artefacts observed with deep learning methods include loss of detail, spatial and temporal discontinuities, and colour bleeding along edges. These artefacts make state-of-the-art deep learning approaches unsuitable for professional post-production. Within the PostProdLEAP project, these limitations will be addressed by designing deep learning models that incorporate spatial and temporal regularization and constraints on texture features. These tools will be developed in collaboration with the artists and historians of Composite Films, a world leader in the restoration and colorization of films and archive footage. Through this collaboration, we will enable artist interaction and give artists control over the final results, while making their tasks less time-consuming and tedious than with current professional software. Our models will be trained, tested and validated on datasets of video archives of varying resolution and quality, created specifically for the project from movies restored by Composite Films.
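To make the idea of regularized training losses concrete, here is a minimal sketch in PyTorch of how spatial and temporal regularization terms can be added to a reconstruction objective. The summary does not describe the project's actual models; the function names, the loss weights lambda_tv and lambda_temp, and the use of a plain frame difference in place of flow-warped temporal consistency are all illustrative assumptions, not the project's method.

    import torch
    import torch.nn.functional as F

    def spatial_tv_loss(frames):
        # frames: (T, C, H, W) tensor of restored video frames.
        # Anisotropic total variation: penalises abrupt intensity
        # changes between neighbouring pixels, which discourages
        # colour bleeding along edges and spatial discontinuities.
        dh = (frames[:, :, 1:, :] - frames[:, :, :-1, :]).abs().mean()
        dw = (frames[:, :, :, 1:] - frames[:, :, :, :-1]).abs().mean()
        return dh + dw

    def temporal_consistency_loss(frames):
        # Penalises differences between consecutive frames. A real
        # system would warp frames with optical flow before comparing
        # them; the plain difference here is a simplification.
        return (frames[1:] - frames[:-1]).abs().mean()

    def total_loss(output, target, lambda_tv=0.1, lambda_temp=0.5):
        # Hypothetical combination: reconstruction fidelity plus
        # spatial and temporal regularizers; weights are illustrative.
        recon = F.l1_loss(output, target)
        return (recon
                + lambda_tv * spatial_tv_loss(output)
                + lambda_temp * temporal_consistency_loss(output))

Such regularizers are typically added to the training loss of the restoration or colorization network so that the learned output stays sharp and stable across frames; the exact form used in the project may differ.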

Project coordinator

Ms Aurelie Bugeau (Laboratoire Bordelais de Recherche en Informatique)

The author of this summary is the project coordinator, who is responsible for its content. The ANR accepts no responsibility for the contents of this summary.

Partners

UPDESCARTES-MAP5 Mathématiques appliquées à Paris 5
COMPOSITE
LaBRI Laboratoire Bordelais de Recherche en Informatique
IMB Institut de mathématiques de Bordeaux

ANR grant: 698,758 euros
Start date and duration of the scientific project: December 2019 - 48 months
