Three Topics in High-Dimensional Machine Learning.

Abstract

Big data often means more data points, but it can also mean higher-dimensional data points. When data points are high dimensional, conventional machine learning algorithms can fail in interesting ways, either due to quirks of high-dimensional probability or because of an explosion in the computational complexity of simple subroutines.
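As a concrete illustration of one such quirk (it is not part of the abstract itself), the short numerical sketch below shows distance concentration: for i.i.d. Gaussian points, the gap between the smallest and largest pairwise Euclidean distances collapses as the dimension grows, which is one reason the Euclidean metric loses its ability to differentiate high-dimensional points. All function names and parameters here are illustrative.

```python
# Illustrative sketch (not from the talk): distance concentration in high dimensions.
# For i.i.d. Gaussian points, pairwise Euclidean distances concentrate, so the
# "nearest" and "farthest" neighbors become nearly indistinguishable.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

def distance_contrast(dim, n_points=200):
    """(max - min) / min over all pairwise Euclidean distances of random points."""
    X = rng.standard_normal((n_points, dim))
    d = pdist(X)  # condensed vector of pairwise Euclidean distances
    return (d.max() - d.min()) / d.min()

for dim in (2, 10, 100, 10_000):
    print(f"dim={dim:>6}: relative distance contrast = {distance_contrast(dim):.3f}")
```

As the dimension increases, the printed contrast shrinks toward zero, meaning all points look roughly equidistant under the Euclidean metric.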

In this talk, I will describe three ways in which my work extends the applicability of machine learning to high-dimensional data. The first involves a new kind of data-driven metric that differentiates high-dimensional data points in ways the Euclidean metric cannot. The second topic borrows ideas from signal processing to produce several novel algorithms for zeroth-order optimization when the number of variables is extremely large, in some cases over 1.7 million. The final topic presents a new neural network architecture for predicting the Nash equilibrium of a contextual, many-player game. Whereas previous architectures were constrained to very small games, our proposed Nash-FPN architecture easily scales to games with hundreds of players and/or actions. I demonstrate this by applying a Nash-FPN to a city-scale traffic flow prediction problem.
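To make the zeroth-order setting concrete, the sketch below implements a standard baseline: gradient descent driven by a two-point random finite-difference gradient estimator, which uses only function evaluations and no analytic gradients. This is not the speaker's algorithm, and the test function, dimensions, and step sizes are illustrative assumptions only.

```python
# Hedged sketch of a generic zeroth-order baseline (not the methods from the talk):
# estimate the gradient from function values alone via random finite differences,
# then run plain gradient descent with that estimate.
import numpy as np

rng = np.random.default_rng(0)

def zo_gradient_estimate(f, x, num_dirs=20, mu=1e-3):
    """Average directional finite differences along random Gaussian directions."""
    g = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / num_dirs

def zo_gradient_descent(f, x0, steps=500, lr=0.01):
    x = x0.copy()
    for _ in range(steps):
        x -= lr * zo_gradient_estimate(f, x)
    return x

# Toy high-dimensional quadratic: f(x) = ||x - 1||^2, minimized at the all-ones vector.
dim = 500
target = np.ones(dim)
f = lambda x: np.sum((x - target) ** 2)

x0 = np.zeros(dim)
x_opt = zo_gradient_descent(f, x0)
print("initial objective:", f(x0))
print("final objective:  ", f(x_opt))
```

Each gradient estimate costs only a handful of function evaluations, but the estimator's variance grows with the dimension, which is why naive schemes like this become impractical at the million-variable scale mentioned above.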

Description

This colloquium will be presented in person in WXLR A206.

If you cannot join us in person, you can connect via Zoom: 
https://asu.zoom.us/j/88910082801?pwd=RVhEY3A3RHg1NG5zVWFRQ2dUK0NnQT09

Speaker

Daniel McKenzie
Assistant Adjunct Professor
UCLA

Location
WXLR A206 and virtual via Zoom