
Stanford Neurosciences Institute
February 15, 2018 - 4:15pm
Packard 101

Manifold learning for high-dimensional data analysis in the presence of outliers

Gal Mishne

Gibbs Assistant Professor, Applied Math, Yale University


Abstract
In the analysis of high-dimensional data, manifold learning methods are used to reduce the dimensionality of the data while preserving local neighborhoods and revealing meaningful structures. Out-of-sample function extension techniques are then used to analyze new points, yet these techniques are inherently limited in their ability to handle outliers. I present an analysis of these limitations and propose a new iterative anomaly detection approach that overcomes them. As a more general solution, we propose Diffusion Nets, a new deep learning network for manifold learning that provides both out-of-sample extension and outlier detection. Our approach to out-of-sample extension is more efficient than previous methods in both computational complexity and memory requirements. Finally, I will present a new randomized near-neighbors graph construction, as an alternative to the popular k-nearest-neighbors graph, and its implications for graph-based dimensionality reduction methods.
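For orientation, below is a minimal sketch of one standard pipeline the abstract builds on: diffusion maps with a Nystrom-style out-of-sample extension. This is an illustrative NumPy implementation under generic assumptions (a dense Gaussian kernel with a user-chosen bandwidth eps; function and variable names are invented here), not the speaker's method; Diffusion Nets and the randomized near-neighbors construction are not shown.

import numpy as np

def diffusion_maps(X, eps, n_evecs=2):
    # Gaussian affinities: W_ij = exp(-||x_i - x_j||^2 / eps)
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / eps)
    # Row-normalize to a Markov transition matrix P = D^{-1} W
    P = W / W.sum(axis=1, keepdims=True)
    # Right eigenvectors of P give the diffusion coordinates
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    evals, evecs = evals.real[order], evecs.real[:, order]
    # Drop the trivial constant eigenvector (eigenvalue 1)
    lam, psi = evals[1:n_evecs + 1], evecs[:, 1:n_evecs + 1]
    # Embedding of the training points, plus (lam, psi) for later extension
    return psi * lam, lam, psi

def nystrom_extend(Y, X, lam, psi, eps):
    # Kernel row between each new point y and the training set X
    sq_dists = np.sum((Y[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Wy = np.exp(-sq_dists / eps)
    Py = Wy / Wy.sum(axis=1, keepdims=True)
    # Nystrom formula: psi_j(y) ~= (1 / lambda_j) * sum_i P(y, x_i) * psi_j(x_i)
    return (Py @ psi) / lam

The normalization step in nystrom_extend is where the limitation discussed in the talk shows up: for a point far from the training manifold, every kernel value in Wy is vanishingly small, so dividing by the row sum amplifies noise and the extended coordinates become unreliable.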

Bio
Gal Mishne is a Gibbs Assistant Professor in the Applied Mathematics program at Yale University, working with Ronald Coifman. She received her Ph.D. in Electrical Engineering from the Technion in 2017, advised by Israel Cohen. She holds B.Sc. degrees (summa cum laude) in Electrical Engineering and Physics from the Technion, and upon graduation worked as an image processing engineer for several years.