Latest News and Events
Towards Understanding Mixtures of Gaussians: Spectral Methods and Polynomial Time Learning with No Separation
Mixtures of Gaussian distributions are an important tool in machine learning and statistics, widely used in applications such as speech recognition and computer vision. In recent years there has been considerable progress toward a theoretical understanding of the complexity of estimating mixture distributions, especially in high dimensions. However, commonly used methods such as Expectation Maximization suffer from several shortcomings, notably sensitivity to initialization and the inability to estimate the number of mixture components.
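The two shortcomings mentioned above can be seen with off-the-shelf tools. Below is a minimal sketch, not from the talk, using scikit-learn's GaussianMixture on synthetic data: different random initializations can reach different local optima, and the in-sample likelihood keeps improving as components are added, so it cannot by itself choose the number of components.

```python
# Minimal illustration of two EM shortcomings (synthetic data, scikit-learn).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic data from a 3-component Gaussian mixture in 2D.
means = np.array([[0.0, 0.0], [3.0, 3.0], [0.0, 4.0]])
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(200, 2)) for m in means])

# 1) Sensitivity to initialization: different seeds, different local optima.
for seed in range(3):
    gm = GaussianMixture(n_components=3, n_init=1, random_state=seed).fit(X)
    print(f"seed={seed}  log-likelihood={gm.lower_bound_:.3f}")

# 2) The fitted likelihood typically keeps increasing with the number of
#    components, so likelihood alone does not reveal the true number (3 here).
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    print(f"k={k}  log-likelihood={gm.lower_bound_:.3f}")
```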
In my talk I will discuss a class of spectral methods, based on eigenvectors of certain kernel matrices, for learning the parameters of Gaussian mixtures, and present theoretical analysis and experimental results showing that these methods may overcome some of the shortcomings of Expectation Maximization. In a slightly different direction, I will also discuss recent work giving the first polynomial-time algorithm for learning mixtures of Gaussians in high dimensions with arbitrarily small separation between components. Parts of the talk are joint work with Tao Shi, Bin Yu, and Kaushik Sinha.
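The precise algorithm and its analysis are the subject of the talk; purely as a rough illustration of the general idea of working with eigenvectors of a kernel matrix, the following sketch builds a Gaussian kernel matrix, embeds the data with its top eigenvectors, and reads off plug-in parameter estimates. The bandwidth choice, the k-means step, and the final estimates are my own illustrative assumptions, not the method presented in the talk.

```python
# Generic spectral-embedding sketch for Gaussian mixture parameter estimation.
import numpy as np
from sklearn.cluster import KMeans


def spectral_gmm_estimate(X, n_components, sigma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))

    # Top eigenvectors of the kernel matrix give a low-dimensional embedding.
    eigvals, eigvecs = np.linalg.eigh(K)           # eigenvalues in ascending order
    embedding = eigvecs[:, -n_components:]         # columns for the largest eigenvalues

    # Group points in the embedding, then estimate each component's
    # weight, mean, and covariance from its group.
    labels = KMeans(n_clusters=n_components, n_init=10).fit_predict(embedding)
    params = []
    for k in range(n_components):
        Xk = X[labels == k]
        params.append({
            "weight": len(Xk) / len(X),
            "mean": Xk.mean(axis=0),
            "cov": np.cov(Xk, rowvar=False),
        })
    return params


# Example usage on the synthetic data from the previous sketch:
# params = spectral_gmm_estimate(X, n_components=3, sigma=1.0)
```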