CHIST-ERA step 2 - second stage of the 9th ERA-NET CHIST-ERA call for projects, 2019

Analog PROcessing of bioinspired VIsion Sensors for 3D reconstruction – APROVIS3D

Submission summary

The APROVIS3D project targets analog computing for artificial intelligence in the form of Spiking Neural Networks (SNNs) on a mixed analog and digital architecture, including a field-programmable analog array (FPAA) and SpiNNaker, applied to a stereopsis system dedicated to coastal surveillance using an aerial robot. Computer vision systems rely widely on artificial intelligence, and especially on neural-network-based machine learning, which has recently gained huge visibility. The training stage of deep convolutional neural networks is time-consuming and consumes enormous amounts of energy. In contrast, the human brain performs visual tasks with unrivalled computational and energy efficiency. One major factor of this efficiency is believed to be that information is largely represented by short pulses (spikes) emitted at analog, rather than discrete, times. However, computer vision algorithms using such a representation are still scarce in practice, and their high potential remains largely underexploited.
Inspired by biology, the project addresses the scientific question of developing a low-power, end-to-end analog architecture for sensing and processing 3D visual scenes, running on analog devices without a central clock, and of validating it in real-life situations. More specifically, the project will develop new paradigms for biologically inspired vision, from sensing to processing, to help machines such as Unmanned Aerial Vehicles (UAVs), autonomous vehicles, and robots gain high-level understanding of visual scenes. The ambitious long-term vision of the project is to develop the next-generation AI paradigm that will eventually compete with deep learning. We believe that neuromorphic computing, studied mainly in EU countries, will be a key technology in the next decade. It is therefore both a scientific and a strategic challenge for the EU to foster this technological breakthrough.
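To illustrate the spike-timing idea mentioned above, here is a minimal sketch (our own illustration, not project code) of a leaky integrate-and-fire (LIF) neuron, the elementary unit of most SNNs: input intensity is converted into spike timing, so a stronger input crosses the firing threshold earlier and fires more often.

```python
def lif_spike_times(input_current, dt=1e-3, tau=20e-3,
                    v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron driven by a
    per-step input current sequence; return spike times in seconds.
    All parameter values are illustrative, not project settings."""
    v = 0.0
    spikes = []
    for i, current in enumerate(input_current):
        # Leaky integration: dv/dt = (-v + I) / tau  (Euler step)
        v += dt * (-v + current) / tau
        if v >= v_thresh:          # threshold crossing: emit a spike
            spikes.append(i * dt)
            v = v_reset            # reset the membrane potential
    return spikes

# A stronger input makes the neuron fire earlier and more often,
# so stimulus intensity is encoded in spike times, not in a
# clocked, discrete-valued activation.
weak = lif_spike_times([1.2] * 100)
strong = lif_spike_times([3.0] * 100)
```

In event-driven hardware such as SpiNNaker, only these sparse spike events are communicated, which is a key source of the power efficiency the project targets.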
The consortium from four EU countries has a unique combination of the expertise necessary for this project. SNN specialists from the fields of visual sensors (IMSE, Spain), neural network architecture and computer vision (Uni. of Lille, France) and computational neuroscience (INT, France) will team up with robotics and automatic control specialists (NTUA, Greece) and low-power integrated systems designers (ETHZ, Switzerland) to help geoinformatics researchers (UNIWA, Greece) build a demonstrator UAV for coastal surveillance (TRL5). All members of the consortium share a common interest in analog-based computing and computer vision, with complementary points of view and expertise.
Key challenges of this project will be end-to-end analog system design (from sensing to AI-based control of the UAV and 3D coastal volumetric reconstruction), energy efficiency, and practical usability in real conditions. We aim to show that such a bioinspired analog design brings large benefits in power efficiency and adaptability, making coastal surveillance with UAVs practical and more efficient than digital approaches.

Project coordination

Jean MARTINET (Laboratoire informatique, signaux systèmes de Sophia Antipolis)

The author of this summary is the project coordinator, who is responsible for its content. ANR accepts no responsibility for its contents.

Partnership

CRIStAL Centre de Recherche en Informatique, Signal et Automatique de Lille
NTUA National Technical University of Athens
I3S Laboratoire informatique, signaux systèmes de Sophia Antipolis
UNIWA University of West Attica
INT Institut de Neurosciences de la Timone
IMSE-CNM Instituto de Microelectrónica de Sevilla
ETH ETH Zürich

ANR grant: 357,372 euros
Beginning and duration of the scientific project: March 2020 - 36 months

Useful links

Explore our database of funded projects

ANR makes its datasets on funded projects available online.
