Omar Montasser – Beyond Worst-Case Online Classification

When: 9/19/25 3:00 PM
Where: Tyler 055
Abstract: In this talk, we revisit online binary classification by shifting the focus from competing with the best-in-class binary loss to competing against relaxed benchmarks that capture smoothed notions of optimality. Instead of measuring regret relative to the exact minimal binary error — a standard approach that leads to worst-case bounds tied to the Littlestone dimension — we consider comparing with predictors that are robust to small input perturbations, perform well under Gaussian smoothing, or maintain a prescribed output margin. Our algorithms achieve regret guarantees that depend only on the VC dimension and the complexity of the instance space (e.g., metric entropy), and notably, they incur only an O(log(1/γ)) dependence on the generalized margin γ. This stands in contrast to most existing regret bounds, which typically exhibit a polynomial dependence on 1/γ. We complement this with matching lower bounds.
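
For concreteness, one illustrative way to write such a relaxed comparison (the notation here is a sketch, not taken from the talk: real-valued predictors h in a class H and a margin parameter γ > 0) is to charge the learner the usual binary loss while charging the comparator for every low-margin prediction:

\[
\mathrm{Reg}_T^{\gamma} \;=\; \sum_{t=1}^{T} \mathbf{1}\{\hat{y}_t \neq y_t\} \;-\; \min_{h \in \mathcal{H}} \sum_{t=1}^{T} \mathbf{1}\{\, y_t\, h(x_t) \le \gamma \,\},
\]

in place of the standard regret, which subtracts \(\min_{h \in \mathcal{H}} \sum_{t=1}^{T} \mathbf{1}\{h(x_t) \neq y_t\}\). Because the benchmark is weaker, guarantees can scale with the VC dimension and log(1/γ) rather than the Littlestone dimension.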

Based on joint work with Abhishek Shetty and Nikita Zhivotovskiy.

Omar Montasser is an Assistant Professor in the Department of Statistics and Data Science at Yale. His research broadly explores the theory and foundations of machine learning. Prior to joining Yale, Montasser was a FODSI-Simons postdoctoral fellow at UC Berkeley. He earned his Ph.D. from the Toyota Technological Institute at Chicago in 2023.