Title: Distributed stochastic convex optimization
Speaker: Michael Rabbat – McGill University, Canada
Abstract
This talk considers the problem of distributed convex optimization in a stochastic setting. Each node in a network of processors has a stochastic oracle for a common objective function, and the aim of the network is to collectively minimize the objective as quickly as possible. Such a problem arises, e.g., in large-scale machine learning, where the goal of the network is to fit a model to training data that is spread across multiple nodes. We study a consensus-based approach in which each node individually takes a descent step and then consensus iterations are performed to synchronize the models across the nodes. We prove that the proposed method achieves the optimal centralized regret bound when the objective function has Lipschitz continuous gradients, and we discuss the tradeoff between communication, computation, and the network topology. This is joint work with Konstantinos Tsianos.
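To give a flavor of the consensus-based approach described above, here is a minimal simulation sketch, not the speakers' exact algorithm: each node takes a stochastic gradient step on a synthetic least-squares objective and then performs one consensus averaging round with its neighbors on a ring. The objective, the ring topology, the mixing matrix W, the step size, and all parameter values are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, T = 8, 5, 200

# Synthetic common objective: f(x) = E[(a^T x - b)^2], with noisy samples.
x_star = rng.normal(size=dim)

def stochastic_gradient(x):
    # One stochastic oracle call: gradient of (a^T x - b)^2 at a random sample.
    a = rng.normal(size=dim)
    b = a @ x_star + 0.1 * rng.normal()
    return 2 * (a @ x - b) * a

# Doubly stochastic mixing matrix for a ring: each node averages
# its own model with those of its two neighbors.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

X = np.zeros((n_nodes, dim))  # one model per node
for t in range(1, T + 1):
    eta = 1.0 / t  # diminishing step size (illustrative choice)
    # Local descent step at every node.
    G = np.stack([stochastic_gradient(X[i]) for i in range(n_nodes)])
    X = X - eta * G
    # One consensus iteration: neighbors exchange and average models.
    X = W @ X

print("mean distance to optimum:", np.linalg.norm(X - x_star, axis=1).mean())
```

A doubly stochastic W preserves the network-wide average of the models at every consensus round, and how quickly the models agree is governed by the spectral gap of W, which depends on the network topology; this is the communication/computation tradeoff the abstract refers to.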
Admission is free.
All are welcome.