Schedule of Workshop 2: Manifold Learning

Monday, May 30

Morning: Meet-up at HIM
14:00-14:15 Opening
14:15-15:15 John A. Lee: Unsupervised Dimensionality Reduction: from PCA to recent nonlinear techniques
15:15-16:00 Cake
16:15-17:00 Alexander Hullmann: The Generative Topographic Mapping for dimensionality reduction and data analysis

Tuesday, May 31

09:30-10:30 Neil Lawrence: A unifying probabilistic perspective on spectral approaches to dimensionality reduction
10:30-11:15 Coffee break
11:15-12:00 Alexander Paprotny: An Asymptotic Convergence Result for Maximum Variance Unfolding Based on an Interpretation as a Regularized Shortest Path Problem
12:00-14:15 Lunch break
14:15-15:15 Dan Kushnir: Anisotropic Diffusion Maps with Applications to Inverse Problems
15:15-16:00 Cake
16:00-16:45 Rodrigo Iza Teran: Diffusion Maps for Finite-Element Simulation Data

Wednesday, June 1

09:30-10:30 Matthias Hein: Nonlinear Eigenproblems in Machine Learning
10:30-11:15 Coffee break
11:15-12:00 Christian Rieger: Sampling Inequalities and Manifold Regularization
12:00-12:45 Daniel Wissel: Fast Gauss Transforms for high-dimensional problems
12:45- Lunch break
Afternoon: Social program

Thursday, June 2

09:30-10:30 Alexander Gorban: Principal graphs and topological grammars for data approximation
10:30-11:15 Coffee break
11:15-12:00 Mijail Guillemard: New Perspectives in Signal Processing Combining Dimensionality Reduction and Persistent Homology
12:00-14:15 Lunch break
14:15-15:15 Michael Kirby: Geometry and the Analysis of Massive Data Sets
15:15-16:00 Cake
16:00-16:45 Felix Krahmer: New and improved Johnson-Lindenstrauss embeddings via the Restricted Isometry Property

Friday, June 3

09:30-10:30 Zhenyue Zhang: Spectral Analysis of Alignment Matrices in Manifold Learning
10:30-11:15 Coffee break
11:15-12:15 Xavier Pennec: Current Issues in Statistical Analysis on Manifolds for Computational Anatomy
12:15- Closing comments

Download the abstracts:

ManifoldLearning_Prog.pdf