1 code implementation • 10 May 2024 • Rom N. Parnichkun, Stefano Massaroli, Alessandro Moro, Jimmy T. H. Smith, Ramin Hasani, Mathias Lechner, Qi An, Christopher Ré, Hajime Asama, Stefano Ermon, Taiji Suzuki, Atsushi Yamashita, Michael Poli
We approach the design of a state-space model for deep learning applications through its dual representation, the transfer function, and uncover a highly efficient, sequence-parallel inference algorithm that is state-free: unlike other proposed algorithms, state-free inference incurs no significant memory or computational cost as the state size increases.
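The core idea can be sketched in a few lines: a single-input single-output SSM is equivalent to a rational transfer function, which can be sampled on the unit circle and applied to the input spectrum by FFT. This is a minimal illustration of the state-free principle, not the paper's actual algorithm; the function name and coefficient conventions here are assumptions.

```python
import numpy as np

def state_free_ssm(u, a, b):
    """Apply a SISO state-space model to sequence u through its rational
    transfer function H = B(z^-1) / A(z^-1), sampled on the unit circle.
    Cost depends only on the sequence length L, never on the state size
    (the polynomial degree) -- the "state-free" idea.
    a: denominator coefficients [1, a1, ..., an]; b: numerator [b0, ..., bm].
    """
    L = len(u)
    N = 2 * L                                    # zero-pad to avoid circular wrap
    zinv = np.exp(-2j * np.pi * np.arange(N) / N)
    # Evaluate numerator and denominator polynomials in z^-1 at the FFT nodes.
    H = np.polyval(b[::-1], zinv) / np.polyval(a[::-1], zinv)
    y = np.fft.ifft(H * np.fft.fft(u, N))[:L]
    return y.real
```

For a stable system the time-aliasing introduced by sampling the spectrum at 2L points is exponentially small, so the output matches the exact recurrence to numerical precision.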
2 code implementations • 9 Aug 2022 • Jimmy T. H. Smith, Andrew Warrington, Scott W. Linderman
Models using structured state space sequence (S4) layers have achieved state-of-the-art performance on long-range sequence modeling tasks.
Ranked #3 on Long-range modeling on LRA
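A diagonal state-space layer of the kind this line of work builds on can be sketched as a simple sequential recurrence; the paper additionally parallelizes this scan across the sequence, which is omitted here. The function name and shape conventions are assumptions for illustration.

```python
import numpy as np

def diagonal_ssm(u, lam, B, C):
    """Sequential sketch of a diagonal state-space layer:
    x[k] = diag(lam) x[k-1] + B u[k],  y[k] = Re(C x[k]).
    lam: (P,) complex diagonal of the state matrix.
    B: (P, H) and C: (H, P) complex input/output maps; u: (L, H) real.
    """
    L, H = u.shape
    x = np.zeros(lam.shape[0], dtype=complex)
    y = np.empty((L, H))
    for k in range(L):
        x = lam * x + B @ u[k]        # elementwise state update: diagonal A
        y[k] = (C @ x).real
    return y
```

Because the state matrix is diagonal, each step costs O(P) per state dimension rather than O(P^2), and the linear recurrence admits an associative parallel scan.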
1 code implementation • NeurIPS 2021 • Jimmy T. H. Smith, Scott W. Linderman, David Sussillo
The results are a trained SLDS variant that closely approximates the RNN; an auxiliary function that produces a fixed point for each point in state space; and a trained nonlinear RNN whose dynamics have been regularized so that, where possible, its first-order terms perform the computation.
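Finding the fixed points around which such a linearization is taken is typically done by minimizing the speed q(h) = ||F(h) - h||^2 of the dynamics. A minimal sketch, assuming a hypothetical tanh RNN as a stand-in for the paper's trained network:

```python
import numpy as np

def find_fixed_point(W, h0, steps=2000, lr=0.1):
    """Approximate a fixed point of the autonomous RNN F(h) = tanh(W h)
    by gradient descent on q(h) = 0.5 * ||F(h) - h||^2.
    The tanh network is a hypothetical stand-in for illustration.
    """
    h = h0.astype(float).copy()
    for _ in range(steps):
        z = W @ h
        r = np.tanh(z) - h                        # residual F(h) - h
        # Jacobian of the residual: diag(1 - tanh(z)^2) W - I
        J = (1 - np.tanh(z) ** 2)[:, None] * W - np.eye(len(h))
        h -= lr * (J.T @ r)                       # gradient of q is J^T r
    return h
```

Linearizing F around each fixed point found this way yields the local linear systems that an SLDS-style approximation stitches together.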