Scalable Multiclass High-Dimensional Linear Discriminant Analysis via the Randomized Kaczmarz Method

Abstract

Fisher's linear discriminant analysis (LDA) is a foundational method of dimension reduction for classification that has been useful in a wide range of applications.  The goal is to identify an optimal subspace to project the observations onto that simultaneously maximizes between-group variation while minimizing within-group differences.  The solution is straightforward when the number of observations exceeds the number of features, but difficulties arise in the high-dimensional setting, where there are more features than observations.  Many works have proposed solutions for the high-dimensional setting, and these frequently involve additional assumptions or tuning parameters.  We propose a fast and simple iterative algorithm for high-dimensional multiclass LDA on large data that is free from these additional requirements and that comes with some guarantees.  We demonstrate our algorithm on real data and highlight some results.
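As background (not drawn from the abstract itself), the classical randomized Kaczmarz iteration of Strohmer and Vershynin solves a consistent linear system Ax = b by repeatedly sampling a row with probability proportional to its squared norm and projecting the current iterate onto that row's hyperplane.  The Python sketch below illustrates only that generic iteration; the function name, parameters, and demo data are illustrative, and the talk's specific application of the method to multiclass LDA is not reproduced here.

    import numpy as np

    def randomized_kaczmarz(A, b, n_iters=10000, seed=0):
        """Solve a consistent linear system Ax = b with the randomized
        Kaczmarz method: at each step, project the current iterate onto
        the hyperplane defined by one randomly chosen row, sampled with
        probability proportional to its squared norm."""
        rng = np.random.default_rng(seed)
        m, n = A.shape
        row_norms_sq = np.einsum("ij,ij->i", A, A)   # ||a_i||^2 for each row
        probs = row_norms_sq / row_norms_sq.sum()    # row-sampling distribution
        x = np.zeros(n)
        for _ in range(n_iters):
            i = rng.choice(m, p=probs)
            residual = b[i] - A[i] @ x
            x += (residual / row_norms_sq[i]) * A[i] # project onto {z : a_i^T z = b_i}
        return x

    # Small demo on a synthetic overdetermined, consistent system.
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        A = rng.standard_normal((200, 50))
        x_true = rng.standard_normal(50)
        b = A @ x_true
        x_hat = randomized_kaczmarz(A, b, n_iters=20000)
        print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

The appeal of iterations like this for large problems is that each step touches only a single row of A, so the per-iteration cost is linear in the number of features regardless of the number of rows.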

Bio 
https://jocelynchi.com/

Description

DoMSS Seminar
Monday, February 24
1:30pm MST/AZ
GWC 487

Speaker

Jocelyn Chi
Assistant Professor
Dept of Applied Mathematics
University of Colorado Boulder


Location
GWC 487