no code implementations • 11 Apr 2024 • Brian Bell, Michael Geyer, David Glickenstein, Keaton Hamm, Carlos Scheidegger, Amanda Fernandez, Juston Moore
This article proposes a new framework for studying adversarial examples that does not depend directly on the distance to the decision boundary.
no code implementations • 14 Nov 2023 • Keaton Hamm, Caroline Moosmüller, Bernhard Schmitzer, Matthew Thorpe
This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures on a compact and convex subset of $\mathbb{R}^d$, metrized with the Wasserstein-2 distance $W$.
no code implementations • 13 Oct 2023 • Keaton Hamm, Varun Khurana
We consider structured approximation of measures in Wasserstein space $W_p(\mathbb{R}^d)$ for $p\in[1,\infty)$ by discrete and piecewise constant measures based on a scaled Voronoi partition of $\mathbb{R}^d$.
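The simplest instance of such a scaled Voronoi partition is the cubical lattice $h\mathbb{Z}^d$, whose Voronoi cells are cubes of side $h$. A minimal NumPy sketch of the resulting discrete approximation of an empirical measure (snapping each cell's mass to the cell center; the function name and the restriction to lattice partitions are illustrative choices, not the paper's general construction):

```python
import numpy as np

def voronoi_quantize(samples, h):
    """Discrete approximation of an empirical measure on the lattice h*Z^d.

    The Voronoi cells of the lattice are cubes of side h; each sample is
    snapped to its nearest lattice point, and the returned discrete measure
    places each cell's empirical mass on that cell's center.
    """
    centers = np.round(samples / h) * h          # nearest lattice point per sample
    atoms, counts = np.unique(centers, axis=0, return_counts=True)
    return atoms, counts / len(samples)          # support points and their weights
```

Refining the partition (taking $h \to 0$) drives the Wasserstein distance between the empirical measure and its quantization to zero, since no sample moves farther than $h\sqrt{d}/2$.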
no code implementations • 5 Oct 2023 • Keaton Hamm, Andrzej Korzeniowski
We expound on some known lower bounds of the quadratic Wasserstein distance between random vectors in $\mathbb{R}^n$ with an emphasis on affine transformations that have been used in manifold learning of data in Wasserstein space.
no code implementations • 21 Feb 2023 • Keaton Hamm, Zhaoying Lu, Wenbo Ouyang, Hao Helen Zhang
To improve the standard Nyström approximation, ensemble Nyström algorithms compute a mixture of Nyström approximations which are generated independently based on column resampling.
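A minimal NumPy sketch of this idea, using uniform column sampling and uniform mixture weights (both simplifying assumptions; the paper analyzes more refined sampling and weighting schemes):

```python
import numpy as np

def nystrom(K, cols):
    """Single Nystrom approximation of a PSD kernel matrix from landmark columns."""
    C = K[:, cols]                       # sampled columns
    W = K[np.ix_(cols, cols)]            # intersection block
    return C @ np.linalg.pinv(W) @ C.T   # K ~ C W^+ C^T

def ensemble_nystrom(K, n_experts=5, n_cols=20, seed=0):
    """Uniform mixture of independently resampled Nystrom approximations."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    approx = np.zeros_like(K)
    for _ in range(n_experts):
        cols = rng.choice(n, size=n_cols, replace=False)  # fresh column sample
        approx += nystrom(K, cols)
    return approx / n_experts
```

When $K$ is PSD with rank at most the number of sampled columns, each expert (and hence the mixture) reconstructs $K$ exactly; in general, averaging reduces the variance of the approximation error across resamplings.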
no code implementations • 14 Feb 2023 • Alexander Cloninger, Keaton Hamm, Varun Khurana, Caroline Moosmüller
We introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.
no code implementations • 17 Jun 2022 • Keaton Hamm, Mohamed Meskini, HanQin Cai
This algorithm has the same computational complexity as Iterated Robust CUR, which is currently state-of-the-art, but is more robust to outliers.
3 code implementations • 13 Apr 2022 • Keaton Hamm, Nick Henscheid, Shujie Kang
In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications.
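The Wassmap pipeline combines pairwise Wasserstein distances with a multidimensional scaling step. A hedged sketch for 1D empirical measures, where $W_2$ reduces to matching sorted samples (the helper names and the restriction to 1D equal-size samples are simplifications for illustration, not the paper's imaging setting):

```python
import numpy as np

def w2_1d(a, b):
    """Exact W2 between two 1D empirical measures with equally many samples:
    sort both samples and match order statistics."""
    return np.sqrt(np.mean((np.sort(a) - np.sort(b)) ** 2))

def classical_mds(D2, dim=2):
    """Classical MDS: double-center the squared distance matrix, eigen-decompose."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]                    # top eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

def wassmap_1d(samples, dim=2):
    """Embed a family of 1D measures via pairwise squared W2 distances + MDS."""
    n = len(samples)
    D2 = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D2[i, j] = D2[j, i] = w2_1d(samples[i], samples[j]) ** 2
    return classical_mds(D2, dim)
```

For a family of translates of a fixed measure, $W_2$ between the translates equals the distance between the translation parameters, so the embedding recovers the parameter geometry exactly.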
no code implementations • 18 Aug 2021 • Reyan Ahmed, Md Asadullah Turja, Faryad Darabi Sahneh, Mithun Ghosh, Keaton Hamm, Stephen Kobourov
Graph neural networks have been successful in many learning problems and real-world applications.
1 code implementation • 22 Jun 2021 • Reeshad Arian, Keaton Hamm
This article explores subspace clustering algorithms using CUR decompositions, and examines the effect of various hyperparameters in these algorithms on clustering performance on two real-world benchmark datasets, the Hopkins155 motion segmentation dataset and the Yale face dataset.
1 code implementation • 19 Mar 2021 • HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
Low rank tensor approximation is a fundamental tool in modern machine learning and data science.
no code implementations • 11 Feb 2021 • Reyan Ahmed, Greg Bodwin, Faryad Darabi Sahneh, Keaton Hamm, Stephen Kobourov, Richard Spence
In this paper, we consider a multi-level version of the subsetwise spanner in weighted graphs, where the vertices in $S$ have varying levels of priority or quality-of-service (QoS) requirements, and the goal is to compute a nested sequence of spanners with the minimum total number of edges.
no code implementations • 5 Jan 2021 • HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
Additionally, we consider hybrid randomized and deterministic sampling methods which produce a compact CUR decomposition of a given matrix, and apply this to video sequences to produce canonical frames thereof.
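A minimal sketch of the purely randomized end of this spectrum, using uniform row/column sampling (the paper's hybrid and deterministic schemes are more refined than this):

```python
import numpy as np

def cur_decomposition(A, k, seed=0):
    """CUR approximation from k uniformly sampled rows and columns.

    U is the pseudo-inverse of the row/column intersection; when the
    intersection has the same rank as A, then C @ U @ R recovers A exactly.
    """
    rng = np.random.default_rng(seed)
    rows = rng.choice(A.shape[0], size=k, replace=False)
    cols = rng.choice(A.shape[1], size=k, replace=False)
    C = A[:, cols]                             # sampled columns
    R = A[rows, :]                             # sampled rows
    U = np.linalg.pinv(A[np.ix_(rows, cols)])  # pseudo-inverse of intersection
    return C, U, R
```

For a rank-$r$ matrix, sampling $k > r$ rows and columns almost surely yields a rank-$r$ intersection, and the decomposition is exact.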
1 code implementation • 14 Oct 2020 • HanQin Cai, Keaton Hamm, Longxiu Huang, Jiaqi Li, Tao Wang
Robust principal component analysis (RPCA) is a widely used tool for dimension reduction.
no code implementations • 22 Mar 2019 • Keaton Hamm, Longxiu Huang
This article discusses a useful tool in dimensionality reduction and low-rank matrix approximation called the CUR decomposition.
no code implementations • 11 Nov 2017 • Akram Aldroubi, Keaton Hamm, Ahmet Bugra Koku, Ali Sekmen
An algorithm based on the theoretical construction of similarity matrices is presented, along with experiments on synthetic and real data that test the method.