Stochastic Iterative Methods for Online Rank Aggregation from Pairwise Comparisons

Abstract

In this talk, we consider large-scale ranking problems where one is given a set of pairwise comparisons and seeks the underlying ranking that explains them. We show that stochastic gradient descent approaches, particularly the Kaczmarz method, can be leveraged to converge to a solution that reveals the underlying ranking while requiring only low-memory operations. We introduce several variations of this approach that offer a tradeoff between speed and convergence when the pairwise comparisons are noisy. We prove almost-sure convergence results and study several regimes, including those with full, partial, and noisy observations. Our empirical results give insight into the number of observations required as well as how much noise in those measurements can be tolerated.
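To make the setup concrete, here is a minimal sketch (not the speaker's method) of how a randomized Kaczmarz iteration can recover latent scores from pairwise comparisons. It assumes a linear model in which each comparison (i, j, y) is a noisy measurement of score[i] - score[j] = y, so each observation corresponds to a row e_i - e_j of a linear system; the function and variable names are illustrative.

```python
import numpy as np

def kaczmarz_rank(comparisons, n_items, n_iters=20000, seed=0):
    """Randomized Kaczmarz sketch for pairwise comparisons.

    Each comparison (i, j, y) encodes a measurement of
    score[i] - score[j] = y, i.e. a row e_i - e_j of a linear system.
    Each iteration touches only two entries of x, so memory cost is low.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(n_items)
    for _ in range(n_iters):
        # Sample one comparison uniformly at random.
        i, j, y = comparisons[rng.integers(len(comparisons))]
        # Project x onto the hyperplane x[i] - x[j] = y;
        # the row e_i - e_j has squared norm 2.
        r = (y - (x[i] - x[j])) / 2.0
        x[i] += r
        x[j] -= r
    # Scores are determined only up to an additive shift; center them.
    return x - x.mean()

# Hypothetical example: true scores [3, 2, 1, 0], noiseless comparisons.
true_scores = np.array([3.0, 2.0, 1.0, 0.0])
comps = [(i, j, true_scores[i] - true_scores[j])
         for i in range(4) for j in range(4) if i != j]
scores = kaczmarz_rank(comps, n_items=4)
ranking = np.argsort(-scores)  # → [0, 1, 2, 3]
```

With noiseless, consistent comparisons the iterates converge to the true score vector up to a shift; with noisy comparisons, step-size or averaging variants (of the kind the talk discusses) trade speed against the size of the convergence horizon.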

Bio
Lara Kassab is an Assistant Adjunct Professor in the Department of Mathematics at the University of California, Los Angeles. She obtained her Ph.D. in Mathematics from Colorado State University in 2021. Her research focuses on mathematical data science, where she develops linear algebraic methods to tackle key challenges in machine learning, including transparency, efficiency, and fairness.

Description

DoMSS Seminar
Monday, March 17
1:30pm MST/AZ
GWC 487

Speaker

Lara Kassab
Assistant Adjunct Professor
UCLA

