Information theory approach to Machine Learning, Neural Computing and Artificial Intelligence: new perspectives for Statistical inference and optimal design?

Monday, October 15, 2018 - 1:30pm


Milan Stehlik
Arizona State University


What is optimal learning of complex data? Do we always expect to learn by Bayesian learning?

Can we learn negatively? Counterexamples and several paradoxes on regular and singular models will be provided. Some vogue hypes of Machine Learning will be discussed from a Neural Computation perspective. The importance of Statistics and Optimal Design for Machine Learning and neural networks will be acknowledged in an ecumenical way. Some advanced statistical techniques, e.g. from algebraic statistics and nonparametric statistics, will be advertised. I will also introduce topological and semi-topological data analysis (TDA and STDA) for evolving complex data systems. Applications to image analysis, finance, ecology, machine learning, neural networks, neural computing and neuroscience will be given (e.g. Hermann et al. (2015), Stehlík et al. (2017), Kupka et al. (2017), Guss and Salakhutdinov (2018) and Gebhart (2018), among others).

The author acknowledges support of ASU, FONDECYT Regular No. 1151441 and project LIT-2016-1-


