On Rate-optimal Uniform Concentration Inequalities for Shannon Entropies

Abstract

We present a new type of concentration inequality with exponential decay that bounds the tail probability of the difference between the log-likelihood of discrete random variables and the negative entropy. In contrast to the classical Bernstein and Hoeffding inequalities applied to log-likelihoods, the new bound is independent of the parameters and therefore does not blow up as the parameters approach 0 or 1. We further present a refined inequality that achieves the optimal rate (log K)^2/n = o(1), where n is the sample size and K is the number of possible values of the discrete variable. The key step in the proof is bounding the moment generating function. We prove the bound by viewing it as a non-convex optimization problem and showing that the duality gaps are zero using techniques from real analysis. The new inequalities strengthen certain theoretical results on likelihood-based methods for community detection in networks and can be applied to other likelihood-based methods for binary data and in information theory.
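To fix ideas, here is a hedged schematic of the kind of bound described above, not the exact statement from the talk: for i.i.d. draws X_1, ..., X_n from a distribution p on K values with Shannon entropy H(p), a parameter-free inequality consistent with the advertised rate would take a form such as

\[
\Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} \log p(X_i) + H(p) \right| \ge t \right)
\le 2 \exp\!\left( - \frac{c\, n\, t^2}{(\log K)^2} \right),
\]

where c > 0 is a universal constant not depending on p. The right-hand side depends on K only through log K, so deviations of order (log K)/sqrt(n) are controlled, matching the squared rate (log K)^2/n = o(1) mentioned in the abstract; the precise constants and conditions are those of the talk, not of this sketch.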
 

Bio
Dr. Zhao obtained his Ph.D. at the University of Michigan. He is an associate professor in the School of Mathematical and Natural Sciences in the New College of Interdisciplinary Arts and Sciences at Arizona State University. His primary research interests include machine learning methodology and theory in network analysis, with applications in biology and the social sciences. He is also working on high-dimensional data analysis with applications in genomics.

Description

Statistics Seminar
Friday, November 5
11:00am MST/AZ

Virtual via Zoom
https://asu.zoom.us/j/88521538236?pwd=K1VscVlWTmFnN0tsRHlrWG8rT0Nhdz09
Meeting ID: 885 2153 8236
Password: ASUSTATS

Speaker

Yunpeng Zhao
Assistant Professor
School of Mathematical and Natural Sciences
Arizona State University

Location
Virtual via Zoom