Max Vladymyrov / Максим Владимиров

Software Engineer
Google Inc.
1600 Amphitheatre Pkwy, Mountain View, CA

Email: vladymyrov [at] gmail.com

About

I am a software engineer at Google Inc. Before that, I spent almost two years as a Research Scientist at Yahoo Labs. Before that, I did my PhD at UC Merced on problems of large-scale dimensionality reduction under the supervision of Miguel Á. Carreira-Perpiñán. I received two MS degrees, in Computer Science and in International Economic Relations, and a BS in Applied Math, all from Kharkiv National University in Ukraine. You can find my CV here.

My work focuses primarily on problems of search science, such as relevance, query classification, and query rewriting. My long-term research goal is to develop new techniques for large-scale machine learning, enabling researchers to learn the structure of datasets with a million or more observations. I am also interested in nonlinear optimization, feature extraction, deep learning, dimensionality reduction, and metric learning.

Publications

  • Vladymyrov, M. and Carreira-Perpiñán, M. Á. (2016): "The variational Nyström method for large-scale spectral problems".
    33rd International Conference on Machine Learning (ICML 2016), pp. 211-220.
    [external link] [paper preprint] [supplementary] [reviews] [poster] [slides]
    ▸ This paper addresses the problem of fast approximate spectral methods and presents a variant of the Nyström algorithm that is better justified in that context.
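    ▸ As background, the plain Nyström method that this work builds on approximates the leading eigenpairs of a large kernel matrix from a small set of landmark columns. A minimal sketch in Python/NumPy follows (a hedged illustration with a Gaussian kernel, not the variational variant from the paper; function and parameter names are my own):

```python
import numpy as np

def nystrom_eigs(X, landmarks_idx, k, gamma=1.0):
    """Approximate the top-k eigenpairs of the Gaussian kernel matrix of X
    using only the columns for the chosen landmarks: K ~ C W^{-1} C^T,
    where C = K[:, landmarks] and W = K[landmarks][:, landmarks]."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    L = X[landmarks_idx]
    C = kernel(X, L)                            # n x m cross-kernel
    W = kernel(L, L)                            # m x m landmark kernel
    # eigendecompose only the small landmark block
    evals, evecs = np.linalg.eigh(W)
    evals, evecs = evals[::-1], evecs[:, ::-1]  # descending order
    evals_k, evecs_k = evals[:k], evecs[:, :k]
    # extend the landmark eigenvectors to all n points and rescale
    U = C @ evecs_k / evals_k                   # n x k approximate eigenvectors
    lam = evals_k * (len(X) / len(L))           # rescale eigenvalues to full size
    return lam, U
```

    The point of the approximation is that the m × m eigenproblem replaces the n × n one, so the dominant cost falls from cubic in n to roughly O(nm + m³) for m ≪ n landmarks.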

  • Carreira-Perpiñán, M. Á. and Vladymyrov, M. (2015): "A fast, universal algorithm to learn parametric nonlinear embeddings".
    29th Annual Conference on Neural Information Processing Systems (NIPS 2015), pp. 253-261.
    [external link] [paper preprint] [supplementary] [reviews] [poster] [Matlab code]
    ▸ This paper uses the method of auxiliary coordinates (MAC) to learn an optimal mapping (such as a linear map or a neural net) for a nonlinear embedding (such as the elastic embedding or t-SNE).
    ▸ This work was also presented at BayLearn 2014 [extended abstract] [poster]

  • Vladymyrov, M. (2014): "Large-scale methods for nonlinear manifold learning",
    PhD thesis, Electrical Engineering and Computer Science, University of California, Merced.
    [external link] [paper] [defense slides]

  • Vladymyrov, M. and Carreira-Perpiñán, M. Á. (2014): "Linear-time training of nonlinear low-dimensional embeddings".
    17th International Conference on Artificial Intelligence and Statistics (AISTATS 2014), pp. 968-977.
    [external link] [paper preprint] [supplementary] [poster] [Matlab code]
    ▸ This paper applies fast multipole methods and the Barnes-Hut approximation to nonlinear embedding methods, making it possible to quickly embed datasets of more than a million points.
    ▸ Check out the animation of optimizing one million MNIST digits, where every marker corresponds to one MNIST digit. Here is the same animation, subsampled and visualized using the actual images of the digits.
    ▸ This work was also presented at BayLearn 2013 [extended abstract] [poster]

  • Vladymyrov, M. and Carreira-Perpiñán, M. Á. (2013): "Locally linear landmarks for large-scale manifold learning".
    24th European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD 2013), pp. 256-271.
    [external link] [paper preprint] [poster] [slides] [Matlab code]
    ▸ This paper defines the locally linear landmarks (LLL) algorithm, a method to scale up spectral methods (such as Kernel PCA, Laplacian Eigenmaps, and Spectral Clustering) using landmarks selected from the data.
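    ▸ The key ingredient of landmark-based scaling is expressing each data point as a local linear combination of nearby landmarks, in the style of LLE reconstruction weights. The sketch below illustrates that general idea only, not the paper's exact formulation; the function name and regularization scheme are my own:

```python
import numpy as np

def lll_weights(X, L, n_neighbors=5, reg=1e-3):
    """Locally linear reconstruction weights of each point over its nearest
    landmarks, so that X ~ A @ L. Rows of A sum to 1 (affine weights)."""
    n, m = len(X), len(L)
    A = np.zeros((n, m))
    for i, x in enumerate(X):
        # nearest landmarks to x
        idx = np.argsort(((L - x) ** 2).sum(1))[:n_neighbors]
        G = L[idx] - x                    # shift the neighborhood to the origin
        C = G @ G.T                       # local Gram matrix (PSD, maybe singular)
        C += reg * np.trace(C) * np.eye(len(idx))   # regularize for stability
        # constrained minimizer of ||x - sum_j w_j l_j||^2 with sum_j w_j = 1
        w = np.linalg.solve(C, np.ones(len(idx)))
        A[i, idx] = w / w.sum()
    return A
```

    With the weight matrix A in hand, the large spectral problem can be solved over the landmarks only and the solution extended to all points through A.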
    ▸ This work was also presented at Spectral Learning Workshop at ICML 2013 [paper preprint] [poster]

  • Vladymyrov, M. and Carreira-Perpiñán, M. Á. (2013): "Entropic affinities: properties and efficient numerical computation".
    30th International Conference on Machine Learning (ICML 2013), pp. 477-485.
    [external link] [paper preprint] [supplementary] [slides] [video] [poster] [Matlab code]
    ▸ This paper introduces entropic affinities, a fast method to compute a variable-bandwidth affinity matrix. Adapting the bandwidth to each point should give better results than fixing the bandwidth σ to a single value.
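    ▸ The core computation is choosing, for each point, a bandwidth whose conditional distribution over neighbors has a prescribed perplexity K. A simple bisection sketch follows (the paper develops faster, better-bounded root-finding; the function name and bracket values here are my own illustration):

```python
import numpy as np

def entropic_affinity(dist2, target_perplexity, tol=1e-5, max_iter=100):
    """For one point: find the Gaussian precision beta = 1/(2*sigma^2) whose
    conditional distribution over neighbors has the target perplexity exp(H).

    dist2: 1-D array of squared distances to the other points.
    Returns the normalized affinities p and the precision beta.
    """
    lo, hi = 1e-20, 1e20               # bisection bracket for beta
    log_k = np.log(target_perplexity)
    for _ in range(max_iter):
        beta = np.sqrt(lo * hi)        # geometric midpoint (beta spans decades)
        p = np.exp(-beta * (dist2 - dist2.min()))   # shift for stability
        p /= p.sum()
        h = -np.sum(p * np.log(p + 1e-300))         # Shannon entropy, in nats
        if abs(h - log_k) < tol:
            break
        if h > log_k:                  # distribution too flat: sharpen it
            lo = beta
        else:                          # too peaked: widen it
            hi = beta
    return p, beta
```

    Since the entropy decreases monotonically in beta, the bisection is guaranteed to home in on the unique bandwidth matching the target perplexity.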

  • Vladymyrov, M. and Carreira-Perpiñán, M. Á. (2012): "Partial-Hessian strategies for fast learning of nonlinear embeddings".
    29th International Conference on Machine Learning (ICML 2012), pp. 345-352.
    [external link] [paper preprint] [supplementary] [slides] [video] [poster] [Matlab code]
    ▸ This paper introduces the spectral direction, a fast optimization method for nonlinear embedding methods. To my knowledge, this code is the fastest available for EE, SNE, and t-SNE.
    ▸ This work was also presented at BayLearn 2012 [extended abstract] [slides]

Social media

[Flickr] [Instagram] [LinkedIn]