Online Algorithms Beyond Traditional Approaches – OATA
The traditional design and analysis of algorithms assumes that complete knowledge of the entire input is available to the algorithm. In many settings, however, the input is revealed online over time, and the algorithm must make each decision without knowledge of the future: for example, scheduling jobs that arrive over time, managing a portfolio of stocks, or making predictions based on expert advice. The central issue in online computation is thus obtaining good performance in the face of the uncertainty caused by inputs arriving sequentially, one at a time. Moreover, the emergence of massive-data problems creates a need for algorithms that read their input in a single pass, in the spirit of online algorithms.
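A textbook illustration of this kind of decision-making under uncertainty (not a result of this project) is the ski-rental problem: each day we learn, online, whether the season continues, and we must choose between renting and buying. The break-even strategy below is the classical deterministic algorithm, which pays at most twice the offline optimum.

```python
def ski_rental_online(buy_cost, actual_days):
    """Break-even strategy: rent (cost 1 per day) until the rent paid
    would reach the purchase price, then buy (one-time cost buy_cost).
    actual_days is unknown to the algorithm; days arrive one at a time.
    """
    cost = 0
    for day in range(1, actual_days + 1):
        if day < buy_cost:
            cost += 1            # keep renting
        else:
            cost += buy_cost     # buy and incur no further cost
            break
    return cost


def offline_optimum(buy_cost, actual_days):
    # With full hindsight: either rent every day or buy on day one.
    return min(actual_days, buy_cost)
```

For instance, with `buy_cost = 10` and a 30-day season, the online strategy rents for 9 days and buys on day 10, paying 19 against an optimum of 10, a ratio below 2.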
Online computation is a well-established and active field. Many interesting algorithms with performance guarantees, and deep techniques for obtaining them, have been designed. However, the current toolbox offers no effective means to study problems of a non-linear or even non-convex nature, such as super-modular optimization (e.g., energy minimization) or sub-modular function optimization. Together with the many fundamental problems that remain wide open, this makes the development of new principled approaches crucial for the advancement of the field.
The other main issue, well identified in online computation, is the weakness of the worst-case paradigm. Summarizing an algorithm's performance by a pathological worst case can grossly underestimate its behavior on most inputs: many algorithms that perform well in practice admit only mediocre theoretical guarantees, whereas theoretically established ones can behave poorly even on simple practical instances. Several models have been introduced that go beyond worst-case analysis. Each has successfully explained the performance of algorithms in certain contexts, but each has limits on other classes of problems. A common trait of these models is that they analyze algorithms with the same tools as the worst-case model, and the lack of appropriate tools is a primary obstacle to their development.
The first objective of this project is to establish principled methods for the design and analysis of online algorithms beyond the current techniques; in particular, we aim to develop tools to study convex and non-convex problems. The second objective is to investigate new models that measure algorithm performance more accurately than worst-case analysis. In particular, we are interested in founding a general model of resource augmentation that unifies previous, apparently unrelated models. In both objectives we propose novel approaches based on the duality paradigm of mathematical programming as a cross-cutting tool.
Beyond its theoretical interest, progress on the project could yield significantly improved algorithms in the contexts of computational sustainability (e.g., energy-aware scheduling) and massive data (e.g., sub-modular optimization).
Mr. Kim Thang Nguyen (Laboratoire Informatique, Biologie Intégrative et Systèmes Complexes, Université Evry Val d'Essonne)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines all responsibility for its contents.
IBISC Laboratoire Informatique, Biologie Intégrative et Systèmes Complexes, Université Evry Val d'Essonne
ANR grant: 234,291 euros
Beginning and duration of the scientific project: September 2015, 48 months