CE40 - Mathematics, theoretical computer science, automatic control and signal processing

Variational Methods for Graph Signals – GraVa


The goal of the GraVa project is to provide a unified mathematical framework and efficient algorithms to cope with inverse problems and learning tasks for signals on graphs through variational methods.

Objective of the project and the main issues raised

Most of the literature neglects, for simplicity, the underlying graph structure, or uses overly simplistic linear estimators to overcome these issues. We advocate the use of robust non-linear regularizations to deal with inverse problems or classification tasks on such signals. This stance raises several challenges:

1. Which estimators are good candidates for such tasks, and how can their performance be assessed? We will encompass recent contributions from the graph harmonic analysis, statistics and optimization communities, and develop new tools in nonlinear spectral graph theory, a field which is only emerging.

2. How can computationally tractable algorithms be designed for these methods? We will rely on modern distributed and parallel optimization schemes. Our main ambition is to go beyond standard approaches by fully taking advantage of key graph properties (regularity, sparsity, etc.).

3. What is a good metric for comparing graph signals, and how should they be classified? Measuring an L2-error is standard in signal processing but reflects a Gaussian assumption on the noise distribution. We intend to develop more robust and structure-dependent error metrics able to deal with inverse problems, segmentation and classification tasks.

4. How should time-dependent signals be tackled? Dealing with static signals is a first step, but concrete physical networks require considering time-evolving graph structures. We plan to transfer time-series analysis to graph signals in order to deploy our achievements in real-world scenarios.
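As a toy illustration of the sensitivity behind challenge 3 above, the following sketch (a hypothetical NumPy snippet of ours, not part of the project) compares an L2-based error with a more robust L1-based one on a signal corrupted by a single outlier:

```python
import numpy as np

# Reference signal and a corrupted copy differing by a single outlier.
reference = np.zeros(100)
corrupted = reference.copy()
corrupted[0] = 10.0  # one gross error

# L2-based error (RMSE): dominated by the outlier.
rmse = np.sqrt(np.mean((corrupted - reference) ** 2))

# L1-based error (MAE): barely affected by it.
mae = np.mean(np.abs(corrupted - reference))

print(rmse)  # 1.0 -> the single outlier inflates the L2 error
print(mae)   # 0.1 -> the L1 error stays small
```

A single corrupted value out of a hundred inflates the L2 error tenfold relative to the L1 error, which is the kind of behavior motivating structure-dependent and robust metrics.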

Our first axis aims at pushing the boundaries of current knowledge in spectral graph theory in three important theoretical directions. Concurrently, a second axis addresses the computational challenges of designing, analyzing and implementing efficient algorithms for these increasingly complex graph priors. In particular, to cope with the growing size of the graphs investigated, we will have to rely on modern distributed and parallel optimization schemes. These two axes constitute the core of GraVa and should serve as a basis for future work.
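To make the notion of a variational estimator with a graph prior concrete, here is a minimal sketch (our own toy example, not the project's code) of the simplest such method, Laplacian-regularized least squares: the denoised signal solves min_x ||x - y||^2 + lam * x^T L x, which has the closed form x = (I + lam L)^{-1} y.

```python
import numpy as np

def laplacian(edges, n):
    """Combinatorial graph Laplacian L = D - A from an edge list."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    return L

def tikhonov_denoise(y, L, lam):
    """Solve min_x ||x - y||^2 + lam * x^T L x, i.e. (I + lam L) x = y."""
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)

# Toy example: a path graph 0-1-2-3 carrying a noisy step signal.
edges = [(0, 1), (1, 2), (2, 3)]
L = laplacian(edges, 4)
y = np.array([0.2, -0.1, 1.1, 0.9])
x = tikhonov_denoise(y, L, lam=1.0)
# x is smoother than y (its quadratic variation x^T L x is lower), and its
# mean is preserved exactly, since the constant vector is in the null space of L.
```

This linear estimator is exactly the kind of baseline the project proposes to go beyond with non-linear regularizations; for large graphs the dense solve would be replaced by sparse or distributed iterative schemes.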

The third axis goes beyond regularized least squares regression and includes classification and segmentation tasks on graphs. Understanding the role deep learning can play in graph signal analysis is a natural question for this field, given its recent success on images and videos. This axis is directly connected to the last one, concerning deployment to physical networks. In addition to the graph structure, we will have to deal with the time-evolving aspect of such contexts. To help us in our deployment phase, we intend to collaborate with DCBrain, a French startup specializing in predictive models for physical networks.

Y. Pilavci, P.-O. Amblard, S. Barthelme, N. Tremblay. Smoothing graph signals via random spanning forests. ICASSP. 2020.
N. Keriven, A. Bietti, SV. Convergence and Stability of Graph Convolutional Networks on Large Random Graphs. NeurIPS. 2020. arXiv:2006.01868.
Q. Bertrand, Q. Klopfenstein, M. Blondel, SV, A. Gramfort, J. Salmon. Implicit differentiation of Lasso-type models for hyperparameter optimization. ICML. 2020. arXiv:2002.08943.
B. Pascal, SV, N. Pustelnik, P. Abry. Automated data-driven selection of the hyperparameters for Total-Variation based texture segmentation. 2020. arXiv:2004.09434. (in revision in J. Math. Imaging Vis.).
N. Keriven, SV. Sparse and Smooth: improved guarantees for Spectral Clustering in the Dynamic Stochastic Block Model. 2020. arXiv:2002.02892.
Q. Klopfenstein, SV. Linear Support Vector Regression with Linear Constraints. 2019. arXiv:1911.02306. (under review at Mach. Learn.).
M. Massias, SV, A. Gramfort, J. Salmon. Dual Extrapolation for Sparse Generalized Linear Models. 2019. arXiv:1907.05830. (in revision in JMLR).
X. Dupuis, SV. The Geometry of Sparse Analysis Regularization. 2019. arXiv:1907.01769. (in revision in SIAM J. Optim.).

Beyond the scientific publications expected from this project, we believe in reproducible research through contributions to open-source projects. This starts with a commitment to publish all numerical simulations associated with our reports in an open and accessible way. The authors are familiar with this practice, with experiments associated with their work available online (GitHub or personal homepages).

Concerning the development of packages, the Python module NetworkX could be a good candidate to host the novel methods developed, especially those described in Tasks 4 and 5. Note that scikit-learn would also be a well-suited Python module to host some of our contributions. This package is widely used in machine learning, from both academic (with around 5,000 citations according to Google Scholar) and industrial perspectives. Besides direct contributions to scikit-learn, we intend to release our work under a permissive license such as MIT or New BSD.


Handling large datasets has become a major challenge in fields such as applied mathematics, machine learning and statistics. However, many methods proposed in the literature do not take into account the fine structures (geometric or otherwise) behind the underlying data. Such structures can often be modeled by graphs.

Though many worldwide companies such as Google, Facebook or Twitter have built their success on extracting information from signals that live natively on a graph, a refined analysis of the influence of the underlying graph is still missing: most of the literature neglects, for simplicity, the underlying graph structure, or uses overly simplistic linear estimators to overcome these issues. We advocate the use of robust non-linear regularizations to deal with inverse problems or classification tasks on such signals.
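As an illustration of such a non-linear regularization, the sketch below (a toy implementation of ours, on an arbitrary path graph) denoises a graph signal with graph total variation, min_x 0.5 ||x - y||^2 + lam ||D x||_1 where D is the edge incidence operator, solved by projected gradient on the dual problem:

```python
import numpy as np

def incidence(edges, n):
    """Edge incidence operator D: (Dx)_e = x_j - x_i for edge e = (i, j)."""
    D = np.zeros((len(edges), n))
    for e, (i, j) in enumerate(edges):
        D[e, i] = -1.0
        D[e, j] = 1.0
    return D

def tv_denoise(y, D, lam, n_iter=2000, step=0.25):
    """Graph TV denoising: min_x 0.5*||x - y||^2 + lam*||D x||_1.

    Solved by projected gradient on the dual problem
    min_{||v||_inf <= lam} 0.5*||D^T v - y||^2, with primal x = y - D^T v.
    The step size must satisfy step <= 1/||D D^T||.
    """
    v = np.zeros(D.shape[0])
    for _ in range(n_iter):
        grad = D @ (D.T @ v - y)
        v = np.clip(v - step * grad, -lam, lam)
    return y - D.T @ v

# Noisy piecewise-constant signal on a path graph with 8 nodes.
edges = [(i, i + 1) for i in range(7)]
D = incidence(edges, 8)
y = np.array([0.5, -0.5, 0.5, -0.5, 5.5, 4.5, 5.5, 4.5])
x = tv_denoise(y, D, lam=0.5)
# x flattens the oscillations inside each block while keeping the
# sharp jump between nodes 3 and 4 -- unlike linear Laplacian smoothing,
# which would blur that edge.
```

The edge-preserving behavior of the l1 penalty, in contrast to quadratic smoothing, is precisely what makes such non-linear estimators attractive for signals with discontinuities.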

Project GraVa is an endeavor to solve such concrete and difficult issues through the mathematical perspective of variational methods for graph signal processing.

To help GraVa reach its objective of making an industrial impact, we intend to collaborate closely with Kwyk, a French startup specializing in e-learning solutions for high schools.

Project coordinator

Monsieur Samuel Vaiter (INSTITUT DE MATHEMATIQUES DE BOURGOGNE)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.

Partner

INSTITUT DE MATHEMATIQUES DE BOURGOGNE

ANR grant: 182,480 euros
Beginning and duration of the scientific project: March 2019 - 42 months
