no code implementations • 1 May 2024 • Liran Nochumsohn, Omri Azencot
Data augmentation serves as a popular regularization technique for combating overfitting in neural networks.
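To make the idea concrete, here is a minimal sketch of two common time-series augmentations (jittering and scaling); the function names and noise levels are illustrative and are not the augmentation policy studied in the paper:

```python
import numpy as np

def jitter(x, sigma=0.03):
    # Add small Gaussian noise at every time step (a common augmentation).
    return x + np.random.normal(0.0, sigma, size=x.shape)

def scale(x, sigma=0.1):
    # Rescale each channel by a random factor drawn around 1.0.
    factors = np.random.normal(1.0, sigma, size=(1, x.shape[1]))
    return x * factors

# Usage: augment a (time, channels) series before each training step.
series = np.sin(np.linspace(0, 10, 200))[:, None]
augmented = scale(jitter(series))
```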
no code implementations • 6 Feb 2024 • Nimrod Berman, Eitan Kosman, Dotan Di Castro, Omri Azencot
Graph generation is integral to various engineering and scientific disciplines.
1 code implementation • 4 Oct 2023 • Ilan Naiman, N. Benjamin Erichson, Pu Ren, Michael W. Mahoney, Omri Azencot
In this work, we introduce Koopman VAE (KoVAE), a new generative framework based on a novel design of the model prior that can be optimized for either regular or irregular training data.
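A minimal sketch of the central ingredient, a prior in which the latent sequence evolves under a learned linear (Koopman-style) map; this is a simplified PyTorch illustration with invented names, and it omits the paper's handling of irregularly sampled data:

```python
import math
import torch
import torch.nn as nn

class LinearLatentPrior(nn.Module):
    """Prior over a latent sequence z_1..z_T whose transitions follow a
    learned linear map: z_{t+1} ~ N(A z_t, sigma^2). Sketch only."""
    def __init__(self, dim):
        super().__init__()
        self.A = nn.Parameter(torch.eye(dim) + 0.01 * torch.randn(dim, dim))
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def log_prob(self, z):
        # z: (batch, T, dim); score each transition under the linear prior.
        pred = z[:, :-1] @ self.A.T
        resid = z[:, 1:] - pred
        sigma = self.log_sigma.exp()
        ll = -0.5 * ((resid / sigma) ** 2
                     + 2 * self.log_sigma
                     + math.log(2 * math.pi))
        return ll.sum(dim=(1, 2))
```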
1 code implementation • 31 May 2023 • Ilya Kaufman, Omri Azencot
In contrast, this behavior does not appear in untrained networks, where the curvature flattens.
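As a rough illustration of what curvature of latent representations can mean in practice, here is a crude local-PCA flatness proxy for a point cloud of latent codes; this is not the estimator used in the paper:

```python
import numpy as np

def flatness_proxy(Z, k=20, d=2):
    # For each point, PCA its k nearest neighbors and measure the variance
    # left outside the top-d principal directions. Values near zero mean
    # the cloud is locally flat; larger values indicate curvature.
    resid = []
    for i in range(Z.shape[0]):
        dists = np.linalg.norm(Z - Z[i], axis=1)
        nbrs = Z[np.argsort(dists)[1:k + 1]]
        nbrs = nbrs - nbrs.mean(axis=0)
        energy = np.linalg.svd(nbrs, compute_uv=False) ** 2
        resid.append(energy[d:].sum() / energy.sum())
    return float(np.mean(resid))
```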
1 code implementation • 25 May 2023 • Ilan Naiman, Nimrod Berman, Omri Azencot
Unsupervised disentanglement is a long-standing challenge in representation learning.
1 code implementation • 30 Mar 2023 • Nimrod Berman, Ilan Naiman, Omri Azencot
Disentangling complex data into its latent factors of variation is a fundamental task in representation learning.
no code implementations • 23 Dec 2022 • Jack W. Miller, Charles O'Neill, Navid C. Constantinou, Omri Azencot
In addition, we suggest an "eigenloss" penalty scheme, which penalizes the eigenvalues of the Koopman operator during training.
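One plausible instance of such a penalty, sketched in PyTorch: compute the eigenvalues of the learned Koopman matrix and penalize magnitudes above one, nudging the dynamics toward stability; the exact form used in the paper may differ:

```python
import torch

def eigenloss(K, weight=1.0):
    # Penalize eigenvalues of the Koopman matrix K whose magnitude exceeds 1
    # (unstable modes). torch.linalg.eigvals is differentiable for matrices
    # with distinct eigenvalues, so this can be added to the training loss.
    mags = torch.linalg.eigvals(K).abs()
    return weight * torch.relu(mags - 1.0).sum()

K = 0.5 * torch.randn(8, 8)
loss = eigenloss(K)  # added to the training objective
```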
no code implementations • 18 Feb 2021 • Omri Azencot, N. Benjamin Erichson, Mirela Ben-Chen, Michael W. Mahoney
In this work, we employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
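A standard construction that makes the geometric picture concrete: keep the recurrent matrix exactly orthogonal by taking the matrix exponential of a skew-symmetric parameter (the exponential map from the Lie algebra onto the orthogonal group). This sketch illustrates the viewpoint rather than reproducing the paper's models:

```python
import torch
import torch.nn as nn

class OrthogonalRNNCell(nn.Module):
    """Recurrent cell whose hidden-to-hidden matrix is exactly orthogonal:
    W = exp(S) with S skew-symmetric, so W @ W.T = I by construction."""
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.raw = nn.Parameter(0.01 * torch.randn(hidden_dim, hidden_dim))
        self.U = nn.Linear(input_dim, hidden_dim)

    def forward(self, x, h):
        S = self.raw - self.raw.T      # skew-symmetric generator
        W = torch.matrix_exp(S)        # orthogonal recurrent matrix
        return torch.tanh(h @ W.T + self.U(x))
```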
1 code implementation • 15 Feb 2021 • Ilan Naiman, Omri Azencot
In contrast, we propose Koopman Analysis of Neural Networks (KANN), an operator-theoretic approach rooted in Koopman theory for analyzing trained neural networks.
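At its core, this kind of analysis fits a linear operator to snapshots of network states by least squares, in the spirit of dynamic mode decomposition; a minimal NumPy sketch with toy data, not the paper's experiments:

```python
import numpy as np

def fit_koopman(X, Y):
    # Least-squares operator K with Y ≈ K X; columns of X are states,
    # columns of Y the corresponding states one step later.
    return Y @ np.linalg.pinv(X)

# Toy usage: snapshots of hidden activations before/after a network step.
X = np.random.randn(16, 200)
Y = 0.9 * X + 0.01 * np.random.randn(*X.shape)
K = fit_koopman(X, Y)
spectrum = np.linalg.eigvals(K)  # eigenvalues summarize the dynamics
```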
no code implementations • 3 Jul 2020 • Ido Cohen, Omri Azencot, Pavel Lifshitz, Guy Gilboa
Definitions for spectrum and filtering are given, and a Parseval-type identity is shown.
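For orientation, the classical Parseval identity that such "Parseval-type" results generalize, stated here for Fourier series (the paper's identity concerns nonlinear spectral decompositions):

$$\frac{1}{2\pi}\int_{-\pi}^{\pi} |f(x)|^2\,dx = \sum_{n=-\infty}^{\infty} |c_n|^2, \qquad c_n = \frac{1}{2\pi}\int_{-\pi}^{\pi} f(x)\,e^{-inx}\,dx.$$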
1 code implementation • ICLR 2021 • N. Benjamin Erichson, Omri Azencot, Alejandro Queiruga, Liam Hodgkinson, Michael W. Mahoney
Viewing recurrent neural networks (RNNs) as continuous-time dynamical systems, we propose a recurrent unit that describes the hidden state's evolution with two parts: a well-understood linear component plus a Lipschitz nonlinearity.
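A simplified Euler discretization of such a unit; the paper's specific symmetric/skew-symmetric parameterization of A and W, which controls the Lipschitz constant, is omitted here and the names are illustrative:

```python
import torch
import torch.nn as nn

class LipschitzRNNCell(nn.Module):
    """Euler step of  h' = A h + tanh(W h + U x + b)  (sketch only)."""
    def __init__(self, input_dim, hidden_dim, dt=0.1):
        super().__init__()
        self.A = nn.Parameter(0.01 * torch.randn(hidden_dim, hidden_dim))
        self.W = nn.Parameter(0.01 * torch.randn(hidden_dim, hidden_dim))
        self.U = nn.Linear(input_dim, hidden_dim)
        self.dt = dt

    def forward(self, x, h):
        dh = h @ self.A.T + torch.tanh(h @ self.W.T + self.U(x))
        return h + self.dt * dh
```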
1 code implementation • ICML 2020 • Omri Azencot, N. Benjamin Erichson, Vanessa Lin, Michael W. Mahoney
Recurrent neural networks are widely used on time-series data, yet such models often ignore the underlying physical structure of these sequences.
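One way to bake such structure in, sketched here: a Koopman autoencoder that advances latents with a linear map and, following the consistency idea, learns forward and backward operators whose composition should be near the identity. Names and the exact losses are illustrative, not the paper's code:

```python
import torch
import torch.nn as nn

class ConsistentKoopmanAE(nn.Module):
    """Encode, advance latents linearly, decode; `consistency` penalizes
    disagreement between forward (C) and backward (D) operators."""
    def __init__(self, data_dim, latent_dim):
        super().__init__()
        self.enc = nn.Linear(data_dim, latent_dim)
        self.dec = nn.Linear(latent_dim, data_dim)
        self.C = nn.Parameter(torch.eye(latent_dim))  # forward dynamics
        self.D = nn.Parameter(torch.eye(latent_dim))  # backward dynamics

    def forward(self, x_t):
        z_t = self.enc(x_t)
        x_next = self.dec(z_t @ self.C.T)  # one-step forecast
        eye = torch.eye(self.C.shape[0])
        consistency = ((self.D @ self.C - eye) ** 2).sum()
        return x_next, consistency
```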