11 Jul 2024
11:15

Alte Universität, Rheinsprung 9, 4051 Basel, Seminar Room -201

A Universal Framework for Federated (Convex) Optimization: Sebastian Stich

Machine Learning and Optimization Talk

Federated learning has emerged as an important paradigm in modern large-scale machine learning. Unlike traditional centralized learning, where models are trained using large datasets stored on a central server, federated learning keeps the training data distributed across many clients, such as phones, network sensors, hospitals, or other local information sources. In this setting, communication-efficient optimization algorithms are crucial. We provide a brief introduction to local update methods developed for federated optimization and discuss their worst-case complexity. Surprisingly, these methods often perform much better in practice than predicted by theoretical analyses under classical assumptions. Recent work has shown that their performance can be better described using refined notions that capture the similarity among client objectives. In this talk, we introduce a generic framework based on a distributed proximal point algorithm, which consolidates many of our insights and allows for the adaptation of arbitrary centralized optimization algorithms to the convex federated setting (even with acceleration). Our theoretical analysis shows that the derived methods enjoy faster convergence if the degree of similarity among clients is high. Based on joint work with Xiaowen Jiang and Anton Rodomanov.
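To make the distributed proximal point idea concrete, the following is a minimal sketch in Python (an illustration under stated assumptions, not the speaker's actual algorithm): in each round, every client approximately solves the proximal subproblem min_x f_i(x) + (1/(2*gamma)) * ||x - x_bar||^2 around the shared iterate x_bar, and the server averages the returned models. All names and parameters here (client_prox_step, gamma, inner_steps) are hypothetical.

import numpy as np

def client_prox_step(grad_fn, x_bar, gamma, inner_steps=50, lr=0.05):
    # Approximately solve min_x f_i(x) + (1/(2*gamma)) * ||x - x_bar||^2
    # by running a few gradient steps on the regularized local objective.
    x = x_bar.copy()
    for _ in range(inner_steps):
        x = x - lr * (grad_fn(x) + (x - x_bar) / gamma)
    return x

def federated_prox_point(client_grads, x0, gamma=1.0, rounds=100):
    # One possible distributed proximal point loop: each round, every client
    # solves its local proximal subproblem around the shared iterate, and the
    # server averages the returned models (only model vectors are communicated).
    x_bar = x0.copy()
    for _ in range(rounds):
        updates = [client_prox_step(g, x_bar, gamma) for g in client_grads]
        x_bar = np.mean(updates, axis=0)
    return x_bar

# Toy check: quadratic clients f_i(x) = 0.5 * ||x - b_i||^2, so the minimizer
# of the average objective is mean(b_i).
rng = np.random.default_rng(0)
targets = [rng.normal(size=5) for _ in range(4)]
grads = [lambda x, b=b: x - b for b in targets]
x_star = federated_prox_point(grads, np.zeros(5))
print(np.allclose(x_star, np.mean(targets, axis=0), atol=1e-4))  # True

In this toy setting the client objectives are nearly identical, so the averaged proximal updates contract quickly toward the minimizer of the average objective; with more heterogeneous clients the contraction degrades, consistent with the similarity-dependent convergence described in the abstract.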
