
GERAD Seminar: Distributed stochastic convex optimization

Title: Distributed stochastic convex optimization

Speaker: Michael Rabbat – McGill University, Canada

Abstract

This talk considers the problem of distributed convex optimization in a stochastic setting. Each node in a network of processors has a stochastic oracle for a common objective function, and the aim of the network is to collectively minimize the objective as quickly as possible. Such a problem arises, e.g., in large-scale machine learning where the goal of the network is to fit a model to training data that is spread across multiple nodes. We study a consensus-based approach where nodes individually take descent steps and then consensus iterations are performed to synchronize models across the nodes. We prove that the proposed method achieves the optimal centralized regret bound when the objective function has Lipschitz continuous gradients, and we discuss the tradeoff between communication, computation, and the network topology. This is joint work with Konstantinos Tsianos.
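
The abstract describes nodes alternating local stochastic-gradient steps with consensus (averaging) iterations. The sketch below simulates that general pattern; it is not the speaker's algorithm. The ring topology, Metropolis mixing weights, 1/t step size, and the quadratic objective with known optimum x_star are all illustrative assumptions.

```python
# Minimal simulation of consensus-based distributed stochastic gradient
# descent: each node takes a local noisy gradient step, then a consensus
# (weighted averaging) round synchronizes models across neighbors.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, n_iters = 8, 5, 500

# Common objective with a known optimum, so convergence is easy to check.
x_star = rng.normal(size=dim)

def stochastic_gradient(x):
    """Noisy gradient of f(x) = 0.5 * ||x - x_star||^2 (the stochastic oracle)."""
    return (x - x_star) + 0.1 * rng.normal(size=dim)

# Doubly stochastic mixing matrix for a ring network (Metropolis-style weights).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    for j in ((i - 1) % n_nodes, (i + 1) % n_nodes):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

X = np.zeros((n_nodes, dim))           # one model per node
for t in range(1, n_iters + 1):
    step = 1.0 / t                     # diminishing step size
    grads = np.array([stochastic_gradient(X[i]) for i in range(n_nodes)])
    X = W @ (X - step * grads)         # local descent step, then consensus

print("max distance to optimum:", np.max(np.linalg.norm(X - x_star, axis=1)))
```

In this sketch the communication/computation tradeoff mentioned in the abstract corresponds to how many consensus rounds (multiplications by W) are performed per gradient step.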

Free admission.
Everyone is welcome.

Date

Tuesday, May 12, 2015
Starts at 10:45 a.m.

Price

Free

Contact

514-340-6053 x 6991

Location

Université de Montréal - Pavillon André-Aisenstadt
2920, chemin de la Tour
Montréal, QC H3T 1N8, Canada
514-343-6111
Room 4488

Categories