Algorithmic Techniques for Restricted Data Access Models – RDAM
This project aims to better understand computation when access to input data is restricted in various ways. These scenarios lead to a fresh study of the usual notions of complexity, such as time and space.
How can one deal with massive quantities of data?
Running time considerations lead to natural restrictions modeled by sublinear time algorithms. Sublinear time algorithms, and among them property testers, impose a partial view of the input. They are inherently robust against corrupted data. Space considerations lead to natural restrictions modeled by streaming algorithms, and more generally algorithms with external memory access, which have sublinear internal memory and sequential access to input data. In some cases, communication complexity can provide crucial tools for proving impossibility results.

How can one deal with pieces of private data revealed at the discretion of their owners? Access to inputs may also be restricted in the sense that the input seen by the algorithm may not be the "true" input. Game-theoretic settings then provide a model for algorithmic design, which can also be explored to incorporate privacy concerns. Quantum game theory can provide some tools for proving impossibility results.

How can one deal with network data that dynamically evolves over time? Instead of viewing the changes as liabilities, one can use them as assets that highlight trends and pervasive structure.
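As an illustration of the streaming model described above (sublinear internal memory, one sequential pass over the input), here is a minimal sketch of the classical Misra-Gries frequent-items algorithm. This example and its parameter `k` are purely illustrative and are not taken from the project's own results.

```python
def misra_gries(stream, k):
    """One-pass frequent-items sketch using at most k-1 counters.

    With a single sequential pass and memory sublinear in the stream
    length, any element occurring more than len(stream)/k times is
    guaranteed to survive among the returned candidates.
    """
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Decrement every counter; drop those that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return set(counters)

# Usage: any element occurring more than n/k times must be a candidate.
stream = ["a"] * 50 + ["b"] * 30 + list("cdefghij") * 2
candidates = misra_gries(stream, k=3)
```

The sketch stores at most `k - 1` counters regardless of the stream length, which is exactly the kind of space restriction the streaming model captures; a second pass over the data would be needed to verify the exact counts of the candidates.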
This project brings together researchers from different fundamental areas of computer science, and leverages their expertise to develop technical tools to design algorithms and to study the complexity of restricted data access models. Some tools will be adapted from existing ones, and others will be created.
1. We designed the first quantum algorithm for recommendation systems that runs in time polylogarithmic in the dimensions of the matrix, providing an example of a quantum machine learning algorithm for a real-world application (ITCS'17).
2. We established several tight separations between some measures of deterministic, randomized and quantum query complexity (STOC'16). In particular, our separations disproved a conjecture of Saks and Wigderson from 1986.
3. We formally defined and studied the glass ceiling effect in social networks and proposed a natural mathematical model, called the biased preferential attachment model, that partially explains the causes of the glass ceiling effect (ITCS’15).
4. How much cutting is needed to simplify the topology of a surface? We provided bounds for several important instances of this question (SOCG'14). In particular, we proved a conjecture of Przytycka and Przytycki from 1993.
5. We gave the first implementation of a quantum protocol for coin flipping, with security unconditionally stronger than that of any classical protocol (Nature Communications, 2014).
The two research groups involved in this project have a very strong track record of fundamental research in theoretical computer science. While remaining grounded in fundamental research, they are shifting their perspective toward the new priorities laid out in this proposal. The goal of this project is to support this evolution towards emerging directions. In some of these directions, they already compete at the international level; in the others, the objective is to reach that level within a short period.
The project has generated 111 publications, including 38 in the most prestigious international journals and 67 in leading international conference proceedings. It has also led to 4 public outreach activities.
Project coordination
Frederic MAGNIEZ (Laboratoire d'Informatique Algorithmique : Fondements et Applications)
The author of this summary is the project coordinator, who is responsible for its content. The ANR accepts no responsibility for its contents.
Partnership
LIENS Laboratoire d’Informatique de l’Ecole Normale Supérieure
LIAFA Laboratoire d'Informatique Algorithmique : Fondements et Applications
ANR grant: 510,359 euros
Beginning and duration of the scientific project: December 2012, 48 months