Smart Autonomous Microscopy – SAMic
Our goal is to automate photo-perturbation microscopy experiments for non-expert microscopy users and high-content screening (HCS) applications.
The challenges and objectives of the project: develop a Roboscope
To meet this challenge, we will pursue three objectives:
1) Applicative, by developing two experimental biology applications for which the use of SAMic is mandatory.
2) Technological, by designing the smartCam-LEAD (Localized Events Advanced Detector) module.
3) Proof of concept, with the development of a prototype to demonstrate the power of SAMic.
The main building blocks are:
- Develop the smartCam-LEAD module. The main innovation lies in porting a convolutional neural network for semantic segmentation to a dedicated ARM-based microprocessor, with the aim of reaching real-time processing, building on our current achievements in machine-learning image classification (see the sketch after this list). We will connect it to the microscope control module to perform autonomous experiments.
- Develop the prototype. Here we aim to go beyond a development setup that only answers our two biological questions. Taking advantage of the experience of the coordinator and the industrial partner in transferring technological developments to imaging facilities and to the market, we will turn our setup into a prototype suitable for distribution by the industrial partner. This includes the design of dedicated HMIs by the industrial partner and the transfer of the prototype to the Rennes microscopy facility (MRic).
- Demonstrate the concept. The biological questions studied come from our laboratories: the role of AurkA at mitochondria in mitochondrial dynamics, and the mechanical robustness of cell division in human cells.
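To make the real-time constraint on the smartCam-LEAD module concrete, here is a minimal sketch, not the project's actual pipeline, of how a small segmentation network could be exported to a portable format (ONNX) and then executed by an embedded runtime on an ARM board. The network architecture, frame size and class list are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the smartCam-LEAD code):
# define a tiny fully convolutional network and export it to ONNX so an
# ARM-side runtime (e.g. ONNX Runtime) can execute it next to the camera.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """Toy FCN: one-channel microscopy frame in, per-pixel class scores out
    (e.g. background / interphase / mitosis)."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, n_classes, 1)  # 1x1 convolution classifier

    def forward(self, x):
        return self.head(self.features(x))

model = TinyFCN().eval()
dummy = torch.randn(1, 1, 256, 256)  # assumed frame size

# Export to ONNX, a portable format that embedded runtimes can consume.
torch.onnx.export(model, dummy, "tiny_fcn.onnx",
                  input_names=["frame"], output_names=["scores"],
                  opset_version=13)
```

Quantising such a model to 8-bit integers before deployment is a common further step to reach real-time rates on ARM, but the exact route depends on the embedded runtime actually chosen.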
To date, on the applicative objective, (i) we have a solid database of annotated images (either from public databases or acquired for the project) for developing AI algorithms to classify cells in mitosis; (ii) we have established the cell lines and fluorescent labelling for the two proof-of-concept applications (robustness of division and study of the mitochondrial network); (iii) we are starting to demonstrate the potential of the prototype for real-time classification of the different stages of mitosis (Roboscope v1).
On the technological objective, we now have deep learning algorithms running in real time on the embedded hardware, made efficient and generic on small datasets and across many applications through transfer learning and fine-tuning. We still have to explore algorithms that localize the objects of interest in the images simultaneously with their classification, so that we can set up the photo-perturbation experiments.
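As an illustration of the transfer-learning and fine-tuning strategy on small datasets, the sketch below freezes an ImageNet-pretrained backbone and trains only a new classification head. The backbone, class count and hyper-parameters are assumptions for the example, not the project's actual training code.

```python
# Minimal transfer-learning sketch, assuming a small annotated dataset of
# mitosis images. Backbone, learning rate and class count are illustrative.
import torch
import torch.nn as nn
from torchvision import models

def build_finetune_model(n_classes: int) -> nn.Module:
    # Start from an ImageNet-pretrained backbone and freeze it, so that only
    # the new classification head is trained on the small dataset.
    model = models.resnet18(weights="IMAGENET1K_V1")
    for p in model.parameters():
        p.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, n_classes)  # new trainable head
    return model

model = build_finetune_model(n_classes=4)  # e.g. four mitosis stages (assumed)
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimisation step on a mini-batch of labelled images."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Unfreezing the last backbone layers at a lower learning rate is the usual fine-tuning follow-up once the head has converged.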
At the prototype level, all the on-board automation modules are now functional and communicate with each other. The proof of concept of real-time analysis and feedback has been achieved. It is now time to move on to (i) the user-interface development stage and (ii) the prototype industrialization stage.
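Conceptually, the analysis-and-feedback loop can be summarised as below. The microscope interface here is a hypothetical placeholder (acquire / trigger_perturbation are invented names, not the Inscoper control API); the detector is any callable returning event coordinates, such as the segmentation sketch above.

```python
# Conceptual sketch of the real-time analysis and feedback loop.
# The Microscope protocol and its method names are hypothetical placeholders.
from typing import Callable, Optional, Protocol, Tuple
import numpy as np

class Microscope(Protocol):
    def acquire(self) -> np.ndarray: ...
    def trigger_perturbation(self, x: int, y: int) -> None: ...

def autonomous_loop(scope: Microscope,
                    detect_event: Callable[[np.ndarray], Optional[Tuple[int, int]]],
                    n_frames: int = 1000) -> None:
    """Acquire frames, analyse them in real time and fire a localized
    photo-perturbation whenever an event of interest is detected."""
    for _ in range(n_frames):
        frame = scope.acquire()               # live image from the camera
        hit = detect_event(frame)             # (x, y) of a dividing cell, or None
        if hit is not None:
            scope.trigger_perturbation(*hit)  # close the loop on the instrument
```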
We are convinced that, in the coming months, we will be able to demonstrate the concept on a biological application: real-time monitoring and selection of dividing cells combined with precision acquisition. We plan to submit two publications, one on algorithm developments (small datasets, transfer learning, fine-tuning, real time) and one on the application proof of concept. For the end of the project, we can then focus on photo-perturbation experiments with our prototype, which should allow us to aim for a very-high-impact application publication.
The consortium filed a patent in February 2020, resulting from the thesis work of Mael Balluet (CIFRE Inscoper/IGDR), just before the start of the ANR project. This patent is a very important initial step for the SAMic project.
French patent application No. FR2001798, filed on February 24, 2020, entitled: "Process for managing command blocks intended for a microscopy imaging system, computer program, storage means and corresponding device".
The consortium published a first computational method for classifying mitosis images on an ARM architecture, demonstrating the concept of real-time analysis, an essential result for building the smartCam-LEAD module.
Neural network fast-classifies biological images through features selecting to power automated microscopy.
Balluet M, Sizaire F, El Habouz Y, Walter T, Pont J, Giroux B, Bouchareb O, Tramier M, Pecreaux J.
J Microsc. 2021 Oct 8. doi: 10.1111/jmi.13062. Online ahead of print.
Optical microscopy, through its varied modalities, is an unparalleled approach to investigate living systems. Beyond the bare acquisition of a snapshot, it is routinely used to gain a dynamic view of biological processes at a high frame rate (tens of frames per second). On top of that, it can use light as a perturbation, either to locally photo-switch dyes and investigate the dynamics of labelled proteins, or to create a subcellular laser nano-ablation and observe how the biological system copes with it. In industrial setups, high-content screening is often limited to bare observation, since no generic system can perform photo-perturbative methodologies autonomously. These remain craft approaches requiring experts; their automation is yet to be achieved, meaning that they cannot be used in screening or routine work. The Smart Autonomous Microscopy (SAMic) project aims to open this perspective. We assert that semantic segmentation using a fully convolutional network (FCN) as user-customisable image processing, embedded on dedicated ARM-based electronics, will enable real-time processing to detect the right time and place to apply a perturbation (fluorescence switching or nano-ablation) and, together with our Inscoper control module, to adapt microscope driving on the fly.
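To illustrate how a segmentation output can yield the "right place" for a perturbation, the sketch below thresholds the probability map of an assumed "mitotic cell" class, labels connected regions and returns the centroid of the largest one; the class index and threshold are assumptions for illustration, not the project's actual post-processing.

```python
# Minimal sketch: from a per-pixel segmentation output to a target position
# for a localized perturbation. Class index and threshold are assumed values.
import numpy as np
from scipy import ndimage

def event_position(class_probs: np.ndarray,
                   mitosis_class: int = 2,
                   threshold: float = 0.5):
    """class_probs: (n_classes, H, W) softmax map from the FCN.
    Returns the (row, col) centroid of the largest mitotic region, or None."""
    mask = class_probs[mitosis_class] > threshold
    labels, n = ndimage.label(mask)          # connected components
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    biggest = int(np.argmax(sizes)) + 1      # label ids start at 1
    return ndimage.center_of_mass(mask, labels, biggest)
```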
To tackle these challenges, we will pursue three objectives. (i) Applicative: developing two experimental biology approaches for which SAMic is mandatory, to drive the developments. The biological questions investigated are current in our labs: the role of AurkA at mitochondria in mitochondrial dynamics, and the mechanical robustness of cell division in human cells. (ii) Technological: designing the smartCam-LEAD module (Localised Events Advanced Detector). The main innovation lies in porting a convolutional-network-powered semantic segmentation to a dedicated ARM-based microprocessor, with the aim of achieving real-time processing, building on our current achievement in machine-learning image classification. We will connect it to the microscope driving module to perform autonomous experiments. (iii) Prototype: demonstrating the power of SAMic. Here we aim to go beyond a development setup that can only address our two biological questions. Taking advantage of the consortium's experience in transferring technological developments to imaging facilities and to the market, we will bring our setup to a proper prototype, appropriate for dissemination by the industrial partner. This involves, in particular, the design of proper HMIs by the industrial partner and the transfer of the prototype to the MRic microscopy facility.
The partners teaming up to create SAMic, two academic labs from the IGDR and the company Inscoper, have broad expertise ranging from mathematics for image analysis and computer science to electronics, instrument development and microscopy applied to biology. These interdisciplinary skills are a distinctive trait of our consortium and, we believe, a strong asset for the success of the project.
The SAMic project aims to be a major technical and methodological breakthrough in fluorescence microscopy for investigating life mechanisms, enabling large numbers of unsupervised photo-perturbation experiments. Artificial intelligence applied to fluorescence microscopy will dramatically help researchers observe and understand what happens within their live samples. The industrial partner Inscoper aims to offer the market an interoperable and optimised platform to control any microscopy device and to achieve any image acquisition modality. Adding AI and feedback-control capabilities will position SAMic as a truly disruptive product on the market. Creating an intelligent microscope is one of the next big challenges for the life sciences. The SAMic project will contribute to making it real.
Project coordinator
Mr Marc Tramier (INSTITUT DE GENETIQUE ET DEVELOPPEMENT DE RENNES)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.
Partners
IGDR, MFQ team, INSTITUT DE GENETIQUE ET DEVELOPPEMENT DE RENNES
IGDR, CEDRE team, INSTITUT DE GENETIQUE ET DEVELOPPEMENT DE RENNES
INSCOPER
ANR grant: 469,010 euros
Beginning and duration of the scientific project:
February 2020
- 36 months