Conference Agenda

Overview and details of the sessions of this conference.

 
Session Overview
Session
MS171, part 1: Grassmann and flag manifolds in data analysis
Time:
Saturday, 13/Jul/2019:
10:00am - 12:00pm

Location: Unitobler, F007
30 seats, 59 m²

Presentations
10:00am - 12:00pm

Grassmann and flag manifolds in data analysis

Chair(s): Chris Peterson (Colorado State University, United States of America), Michael Kirby (Colorado State University), Javier Alvarez-Vizoso (Max-Planck Institute for Solar System Research in Göttingen)

A number of applications in large-scale geometric data analysis can be expressed in terms of an optimization problem on a Grassmann or flag manifold. The solution of the optimization problem helps one understand the structure underlying a data set, for purposes such as classification, feature selection, and anomaly detection.

For example, given a collection of points on a Grassmann manifold, one could seek a Schubert variety of best fit, which corresponds to minimizing some function on the flag variety parameterizing the given class of Schubert varieties.

A number of algorithms that exist for points in a linear space, such as clustering, endmember detection, and self-organizing maps, have analogues for points on a Grassmann or flag manifold.

The purpose of this minisymposium is to bring together researchers who share a common interest in algorithms and techniques involving Grassmann and flag varieties applied to problems in data analysis.
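
As a concrete illustration of the shared setting, the following Python/NumPy sketch (illustrative only, not taken from the session materials) shows the basic ingredient most such algorithms build on: a point on a Grassmann manifold is represented by an orthonormal basis of the subspace, and two points are compared through their principal angles.

    import numpy as np

    def principal_angles(U, V):
        # Principal angles between the subspaces spanned by the
        # orthonormal columns of U and V (both n x k): the singular
        # values of U^T V are the cosines of the angles.
        s = np.linalg.svd(U.T @ V, compute_uv=False)
        return np.arccos(np.clip(s, -1.0, 1.0))

    def chordal_distance(U, V):
        # Chordal distance on the Grassmannian: the 2-norm of sin(theta).
        return np.linalg.norm(np.sin(principal_angles(U, V)))

    # Example: two random 3-dimensional subspaces of R^10.
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((10, 3)))
    V, _ = np.linalg.qr(rng.standard_normal((10, 3)))
    print(chordal_distance(U, V))

With such a distance in hand, linear-space algorithms like k-means clustering transfer to subspace-valued data essentially unchanged.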

 

(25 minutes for each presentation, including questions, followed by a 5-minute break; in case of x<4 talks, the first x slots are used unless indicated otherwise)

 

PCA Integral Invariants for Manifold Learning

Javier Alvarez-Vizoso
Max-Planck Institute for Solar System Research in Göttingen

Local integral invariants at scale based on principal component analysis have recently been shown to provide estimators of curvature information at every point of a manifold. They can thus be applied to perform manifold learning from point clouds sampled from embedded Riemannian manifolds, and to inform optimization and geometry-processing methods in arbitrary dimension, e.g., feature detection at scale. In particular, regular curves in Euclidean space are completely characterized up to rigid motion by the eigenvalue decomposition (EVD) of the PCA covariance matrix at every point, which reproduces the Frenet-Serret apparatus asymptotically with scale. We will also present the general result that establishes a dictionary, in the limit, between these statistical integral invariants and classical differential-geometric curvature, in the form of a generalized Darboux-Ricci frame, yielding an algorithm to estimate the Riemann curvature tensor for embedded manifolds of arbitrary dimension.
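
As a rough illustration of the curve case, here is a Python/NumPy sketch that estimates the curvature of a planar curve from the eigenvalues of the local PCA covariance at scale. The asymptotic constants come from a direct second-order Taylor expansion for a unit-speed planar curve (lambda_1 ~ eps^2/3, lambda_2 ~ kappa^2 eps^4/45) and are stated here as an assumption; the Darboux-Ricci result in the talk is far more general.

    import numpy as np

    def local_pca_curvature(points, center, eps):
        # Estimate curvature at `center` from the eigenvalues of the
        # PCA covariance of the points within distance eps (planar case).
        nbrs = points[np.linalg.norm(points - center, axis=1) < eps]
        cov = np.cov(nbrs.T)
        lam = np.sort(np.linalg.eigvalsh(cov))[::-1]
        # Taylor expansion for a unit-speed curve: lam[0] ~ eps^2 / 3,
        # lam[1] ~ kappa^2 eps^4 / 45, hence the estimator below.
        return np.sqrt(45.0 * lam[1]) / eps**2

    # Example: points on a circle of radius 2 (true curvature 0.5).
    t = np.linspace(0, 2 * np.pi, 20000)
    pts = 2.0 * np.column_stack([np.cos(t), np.sin(t)])
    print(local_pca_curvature(pts, pts[0], eps=0.2))  # approx 0.5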

 

Subspace Averaging in Multi-Sensor Array Processing

Ignacio Santamaria1, Louis Scharf2, Vaibhav Garg1, David Ramirez3
1Universidad de Cantabria, 2Colorado State University, 3University Carlos III of Madrid

In this talk we address the problem of averaging on the Grassmann manifold, with special emphasis on estimating the dimension of the average. The solution to this problem provides a simple order-fitting rule based on thresholding the eigenvalues of the average projection matrix; it is therefore free of the penalty terms and tuning parameters commonly used by information-theoretic criteria for model-order estimation, such as the minimum description length (MDL) criterion. The proposed rule appears to be particularly well suited to problems involving high-dimensional data and low sample support, such as determining the number of sources with a large array of sensors: the so-called source enumeration problem. The talk will discuss subspace averaging (SA) for source enumeration under the challenging conditions of:

i) large uniform arrays with few snapshots (the small sample regime), and

ii) non-white or spatially correlated noise with arbitrary correlation.

As illustrated by some simulation examples, SA provides a very robust method of enumerating sources in these challenging scenarios.
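
To convey the flavor of such an eigenvalue-thresholding rule, here is a Python/NumPy sketch (an illustration, not the exact rule analyzed in the talk): average the projection matrices of the given subspaces, then count the eigenvalues of the average that exceed a threshold, set here to 1/2 on the grounds that an eigenvalue above 1/2 marks a direction contained in the majority of the subspaces.

    import numpy as np

    def subspace_average(bases, threshold=0.5):
        # `bases` is a list of n x k_i matrices with orthonormal columns.
        # Average the projection matrices and estimate the dimension of
        # the average by thresholding eigenvalues of the mean projection.
        P_bar = np.mean([U @ U.T for U in bases], axis=0)
        w, Q = np.linalg.eigh(P_bar)       # eigenvalues of P_bar lie in [0, 1]
        order = np.argsort(w)[::-1]
        w, Q = w[order], Q[:, order]
        d = int(np.sum(w > threshold))     # order-fitting rule: no penalty terms
        return Q[:, :d], w

    # Example: ten noisy copies of one 2-dimensional subspace of R^8.
    rng = np.random.default_rng(1)
    base, _ = np.linalg.qr(rng.standard_normal((8, 2)))
    bases = [np.linalg.qr(base + 0.1 * rng.standard_normal((8, 2)))[0]
             for _ in range(10)]
    U_avg, eigvals = subspace_average(bases)
    print(U_avg.shape[1], np.round(eigvals, 2))   # estimated dimension: 2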

 

Variations on Multidimensional Scaling for non-Euclidean Distance Matrices

Mark Blumstein
Colorado State University

Classical multidimensional scaling takes as input a distance matrix and extracts a configuration of points in a low-dimensional Euclidean space whose Euclidean distances best approximate the input data. In this talk we put a twist on the classical algorithm by changing the geometry of the embedding space. Specifically, we show that pseudo-Euclidean coordinates are the natural choice when working with non-Euclidean distance data. Examples are furnished by Lie groups and homogeneous manifolds, which display characteristic signatures in pseudo-Euclidean space.
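
A minimal Python/NumPy sketch of the pseudo-Euclidean variant described above (an illustration using the standard double-centering construction, not the speaker's implementation): proceed as in classical MDS, but keep the largest-magnitude eigenvalues of either sign and record the signature.

    import numpy as np

    def pseudo_euclidean_mds(D, dim):
        # Embed a distance matrix D into a dim-dimensional pseudo-Euclidean
        # space. Returns coordinates X and the signs of the kept eigenvalues;
        # the indefinite inner product is <x, y> = sum_i sign_i * x_i * y_i.
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        w, Q = np.linalg.eigh(B)
        idx = np.argsort(np.abs(w))[::-1][:dim]  # largest magnitude, either sign
        w, Q = w[idx], Q[:, idx]
        X = Q * np.sqrt(np.abs(w))
        return X, np.sign(w)

    # Example: arc-length distances on a circle are not Euclidean-embeddable.
    theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
    D = np.abs(theta[:, None] - theta[None, :])
    D = np.minimum(D, 2 * np.pi - D)
    X, signature = pseudo_euclidean_mds(D, 3)
    print(signature)   # a -1 in the signature reveals non-Euclidean structure

When the input distances are Euclidean, B is positive semidefinite and the routine reduces to classical MDS; negative eigenvalues are exactly what the pseudo-Euclidean coordinates absorb.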