no code implementations • 25 May 2024 • Anthony Gruber, Kookjin Lee, Haksoo Lim, Noseong Park, Nathaniel Trask
Metriplectic systems are learned from data in a way that scales quadratically in both the size of the state and the rank of the metriplectic data.
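The metriplectic structure referenced above can be illustrated in a few lines of NumPy. This is a toy sketch of the general formalism, not the paper's learning algorithm; the energy/entropy functions and the matrices below are hypothetical choices that satisfy the standard degeneracy conditions:

```python
import numpy as np

# Metriplectic dynamics (illustrative toy, not the paper's method):
# dx/dt = L @ grad_E(x) + M @ grad_S(x), with L skew-symmetric,
# M symmetric positive semidefinite, and the degeneracy conditions
# L @ grad_S = 0 and M @ grad_E = 0, so energy is conserved while
# entropy is nondecreasing.

def metriplectic_rhs(x, L, M, grad_E, grad_S):
    return L @ grad_E(x) + M @ grad_S(x)

# Hypothetical example in R^3: E(x) = x0, S(x) = x2.
grad_E = lambda x: np.array([1.0, 0.0, 0.0])
grad_S = lambda x: np.array([0.0, 0.0, 1.0])

# L skew-symmetric, chosen so that L @ grad_S = 0.
L = np.array([[ 0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [ 0.0, 0.0, 0.0]])
# M symmetric PSD, chosen so that M @ grad_E = 0.
M = np.array([[0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

x = np.array([1.0, 2.0, 3.0])
dx = metriplectic_rhs(x, L, M, grad_E, grad_S)
# Energy rate grad_E . dx vanishes; entropy rate grad_S . dx is nonnegative.
```

Learning such a system from data amounts to parameterizing L and M so that the skewness, positivity, and degeneracy conditions hold by construction.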
no code implementations • 7 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park
We propose graph-filter-based self-attention (GFSA) to learn a general yet effective replacement for standard self-attention, whose complexity is only slightly larger than that of the original self-attention mechanism.
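The graph-filter idea can be sketched roughly as follows: treat the row-stochastic attention matrix as a graph filter and replace it with a low-order matrix polynomial. This is an independent illustration, not the authors' code; the coefficients `w0`, `w1`, `wK` and the polynomial order are placeholders:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def gfsa_sketch(Q, K, V, w0=0.3, w1=0.5, wK=0.2, K_power=2):
    """Graph-filter view of self-attention (illustrative sketch).

    Standard self-attention outputs A @ V with A = softmax(Q K^T / sqrt(d)).
    The graph-filter view replaces A by a low-order matrix polynomial
    w0*I + w1*A + wK*A^K, at the cost of a few extra matrix products.
    """
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))   # row-stochastic attention matrix
    H = (w0 * np.eye(A.shape[0])
         + w1 * A
         + wK * np.linalg.matrix_power(A, K_power))
    return H @ V

rng = np.random.default_rng(0)
n, d = 5, 4
Q, K, V = rng.standard_normal((3, n, d))
out = gfsa_sketch(Q, K, V)   # same shape as ordinary attention output
```

Setting `w0 = wK = 0` and `w1 = 1` recovers plain self-attention, which is why the filter is a strict generalization.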
no code implementations • 27 Oct 2023 • Elise Walker, Jonas A. Actor, Carianne Martinez, Nathaniel Trask
Causal representation learning algorithms discover lower-dimensional representations of data that admit a decipherable interpretation of cause and effect. Because achieving such interpretable representations is challenging, many causal learning algorithms incorporate prior information, such as (linear) structural causal models, interventional data, or weak supervision.
1 code implementation • NeurIPS 2023 • Anthony Gruber, Kookjin Lee, Nathaniel Trask
Recent works have shown that physics-inspired architectures allow the training of deep graph neural networks (GNNs) without oversmoothing.
no code implementations • 6 Oct 2022 • Tiffany Fan, Nathaniel Trask, Marta D'Elia, Eric Darve
We explore the probabilistic partition of unity network (PPOU-Net) model in the context of high-dimensional regression problems and propose a general framework focusing on adaptive dimensionality reduction.
no code implementations • 1 Oct 2022 • Kookjin Lee, Nathaniel Trask
In this study, we propose parameter-varying neural ordinary differential equations (NODEs) where the evolution of model parameters is represented by partition-of-unity networks (POUNets), a mixture of experts architecture.
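A minimal sketch of the parameter-varying idea: model parameters evolve in time as a partition-of-unity blend of per-expert parameter vectors. Here simple softmax bumps stand in for a learned POUNet, and all functions and values are hypothetical, not the paper's architecture:

```python
import numpy as np

def pou_weights(t, centers, beta=10.0):
    """Softmax bump functions over time: nonnegative and summing to one,
    a simple stand-in for a learned partition-of-unity network."""
    logits = -beta * (t - centers) ** 2
    e = np.exp(logits - logits.max())
    return e / e.sum()

def time_varying_params(t, centers, expert_params):
    """theta(t) = sum_j phi_j(t) * theta_j: per-expert parameter vectors
    blended by the partition of unity (mixture-of-experts structure)."""
    w = pou_weights(t, centers)
    return w @ expert_params           # (n_experts,) @ (n_experts, n_params)

def node_rhs(x, t, centers, expert_params):
    """Toy linear ODE dx/dt = a(t) * x with a scalar time-varying parameter."""
    a = time_varying_params(t, centers, expert_params)[0]
    return a * x

centers = np.array([0.0, 0.5, 1.0])
expert_params = np.array([[-1.0], [0.0], [1.0]])   # one scalar per expert

# Forward Euler integration of the parameter-varying NODE on t in [0, 1).
x, dt = 1.0, 0.01
for k in range(100):
    x += dt * node_rhs(x, k * dt, centers, expert_params)
```

The partition-of-unity property (weights nonnegative, summing to one) is what makes the parameter trajectory a convex blend of the experts at every time.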
no code implementations • 16 May 2022 • Khemraj Shukla, Mengjia Xu, Nathaniel Trask, George Em Karniadakis
For more complex systems, systems of systems, and unstructured data, graph neural networks (GNNs) present some distinct advantages. Here we review how physics-informed learning can be accomplished with GNNs that use graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs).
BIG-bench Machine Learning • Physics-informed Machine Learning
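The graph exterior calculus that PIGNs build on can be illustrated with the signed incidence matrix of a small graph. This is a generic discrete-calculus sketch, not the PIGN implementation; the graph below is hypothetical:

```python
import numpy as np

# Discrete exterior calculus on a graph: the signed node-edge incidence
# matrix D acts as a discrete gradient on node fields, -D.T as a
# divergence on edge fields, and D.T @ D is the graph Laplacian.
# Learnable layers can then be assembled from these fixed operators.

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # small directed graph
n_nodes = 4
D = np.zeros((len(edges), n_nodes))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = -1.0, 1.0            # edge e runs from node i to node j

grad = lambda u: D @ u                       # node field -> edge field
div = lambda f: -D.T @ f                     # edge field -> node field
laplacian = D.T @ D                          # discrete Laplace operator

u = np.array([1.0, 2.0, 4.0, 8.0])
# The gradient of a constant field vanishes, and the Laplacian annihilates
# constants: the discrete analogues of the continuum identities.
```

Because these operators satisfy exact discrete calculus identities (e.g. the Laplacian is symmetric and annihilates constants), architectures built from them inherit the corresponding physical structure.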
no code implementations • 7 Feb 2022 • Nathaniel Trask, Carianne Martinez, Kookjin Lee, Brad Boyce
We introduce physics-informed multimodal autoencoders (PIMA), a variational inference framework for discovering shared information in multimodal scientific datasets representative of high-throughput testing.
no code implementations • 26 Oct 2021 • Jonas A. Actor, Andy Huang, Nathaniel Trask
Using neural networks to solve variational problems, and other scientific machine learning tasks, has been limited by a lack of consistency and an inability to exactly integrate expressions involving neural network architectures.
no code implementations • 11 Sep 2021 • Kookjin Lee, Nathaniel Trask, Panos Stinis
Discovery of dynamical systems from data forms the foundation for data-driven modeling; recently, structure-preserving geometric perspectives have been shown to provide improved forecasting, stability, and physical realizability guarantees.
no code implementations • 5 Jan 2021 • Yue Yu, Huaiqian You, Nathaniel Trask
In the absence of fracture, when a corresponding classical continuum mechanics model exists, our improvements provide asymptotically compatible convergence to corresponding local solutions, eliminating surface effects and issues with traction loading which have historically plagued peridynamic discretizations.
Numerical Analysis • Computational Engineering, Finance, and Science • Analysis of PDEs
no code implementations • 22 Dec 2020 • Nathaniel Trask, Andy Huang, Xiaozhe Hu
To enforce physics strongly, we turn to the exterior calculus framework underpinning combinatorial Hodge theory and physics-compatible discretization of partial differential equations (PDEs).
Numerical Analysis • Mathematical Physics
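The combinatorial Hodge structure mentioned above can be illustrated on a small graph: any edge flow splits into an exact (gradient) component plus a divergence-free remainder, and it is this exact splitting that physics-compatible discretizations exploit. An illustrative sketch with a hypothetical graph and flow:

```python
import numpy as np

# Combinatorial Hodge sketch: decompose an edge flow f into a gradient
# part D @ u (exact component) and a divergence-free remainder, via the
# least-squares potential u.

edges = [(0, 1), (1, 2), (2, 0)]            # a single 3-cycle
n_nodes = 3
D = np.zeros((len(edges), n_nodes))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = -1.0, 1.0

f = np.array([1.0, 2.0, 3.0])               # arbitrary edge flow
u, *_ = np.linalg.lstsq(D, f, rcond=None)   # potential minimizing |D u - f|
f_grad = D @ u                               # exact (gradient) component
f_rem = f - f_grad                           # divergence-free remainder
# By the normal equations, D.T @ f_rem == 0 up to round-off: the remainder
# carries no net divergence at any node (here it is the pure cyclic flow).
```

On this 3-cycle the remainder is the constant circulation around the loop, the part of the flow no potential can explain.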
no code implementations • 17 May 2020 • Huaiqian You, Yue Yu, Nathaniel Trask, Mamikon Gulian, Marta D'Elia
A key challenge to nonlocal models is the analytical complexity of deriving them from first principles, and frequently their use is justified a posteriori.
2 code implementations • 7 Sep 2019 • Nathaniel Trask, Ravi G. Patel, Ben J. Gross, Paul J. Atzberger
Data fields sampled on irregularly spaced points arise in many applications in the sciences and engineering.
1 code implementation • 26 Jun 2018 • Jakob M. Maljaars, Robert Jan Labeur, Nathaniel Trask, Deborah Sulsky
By combining concepts from particle-in-cell (PIC) and hybridized discontinuous Galerkin (HDG) methods, we present a particle-mesh scheme which allows for diffusion-free advection, satisfies mass and momentum conservation principles in a local sense, and allows the extension to high-order spatial accuracy.
Numerical Analysis