no code implementations • 15 Feb 2024 • Arjun Karuvally, Terrence J. Sejnowski, Hava T. Siegelmann
Traveling waves are a fundamental phenomenon in the brain, playing a crucial role in short-term information storage.
no code implementations • 3 Oct 2023 • Arjun Karuvally, Peter DelMastro, Hava T. Siegelmann
Utilizing the EMT, we formulate a mathematically rigorous circuit that facilitates variable binding in these tasks.
no code implementations • 12 Jun 2023 • Peter DelMastro, Rushiv Arora, Edward Rietman, Hava T. Siegelmann
In this way, we demonstrate how dynamical systems theory can provide insights into not only the learned representations of these models, but also the dynamics of the learning process itself.
no code implementations • 11 Dec 2022 • Arjun Karuvally, Terrence J. Sejnowski, Hava T. Siegelmann
We introduce a new class of General Sequential Episodic Memory Models (GSEMM) that, in the adiabatic limit, exhibit a temporally changing energy surface, leading to a series of meta-stable states that correspond to sequential episodic memories.
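The flavor of such sequential memory dynamics can be sketched with a classic asymmetric-Hopfield construction (this is an illustrative stand-in, not the paper's GSEMM model; the pattern count, size, and coupling strength `lam` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three random bipolar patterns forming an episodic sequence.
N = 64
patterns = rng.choice([-1.0, 1.0], size=(3, N))

# A symmetric Hopfield term stabilizes each memory; an asymmetric
# term pushes the state from pattern k toward its successor k+1.
W_sym = patterns.T @ patterns / N
W_seq = patterns[[1, 2, 0]].T @ patterns / N  # cyclic successor map

state = patterns[0].copy()
lam = 2.0  # strength of the sequence-driving term (arbitrary choice)
for _ in range(5):
    h = W_sym @ state + lam * (W_seq @ state)
    state = np.where(h >= 0, 1.0, -1.0)

# Overlap of the final state with each stored pattern; the state has
# moved along the sequence rather than staying at pattern 0.
overlaps = patterns @ state / N
```

Run alone, the symmetric term would make each pattern a fixed point; the asymmetric term turns those fixed points into a chain of transiently visited states, which is the qualitative behavior the snippet above describes.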
1 code implementation • 4 Apr 2022 • Adam Kohan, Edward A. Rietman, Hava T. Siegelmann
To further support relevance to biological and hardware learning, we use sigprop to train continuous time neural networks with Hebbian updates, and train spiking neural networks with only the voltage or with biologically and hardware compatible surrogate functions.
no code implementations • 1 Apr 2021 • Tyler L. Hayes, Giri P. Krishnan, Maxim Bazhenov, Hava T. Siegelmann, Terrence J. Sejnowski, Christopher Kanan
Replay is the reactivation of one or more neural patterns similar to the activation patterns experienced during past waking experiences.
1 code implementation • 5 Sep 2019 • Daniel J. Saunders, Cooper Sigrist, Kenneth Chaney, Robert Kozma, Hava T. Siegelmann
To our knowledge, this is the first general-purpose implementation of mini-batch processing in a spiking neural network simulator, which works with arbitrary neuron and synapse models.
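A minimal sketch of what mini-batch processing buys in this setting: giving the state a leading batch dimension lets one update step advance every example at once. The leaky integrate-and-fire constants below are illustrative, and this plain-NumPy function is not the simulator's actual API:

```python
import numpy as np

def lif_step(v, input_current, v_rest=-65.0, v_thresh=-52.0,
             v_reset=-65.0, tau=100.0, dt=1.0):
    """One Euler step of leaky integrate-and-fire dynamics.

    `v` and `input_current` carry a leading batch dimension, so a
    single call advances every example in the mini-batch at once.
    """
    v = v + (dt / tau) * (v_rest - v) + input_current
    spikes = v >= v_thresh
    v = np.where(spikes, v_reset, v)  # reset the neurons that fired
    return v, spikes

batch, neurons = 32, 100
rng = np.random.default_rng(1)
v = np.full((batch, neurons), -65.0)
total_spikes = 0
for _ in range(50):
    v, spikes = lif_step(v, rng.uniform(0.0, 2.0, size=(batch, neurons)))
    total_spikes += int(spikes.sum())
```

Because the update is pure elementwise arithmetic, the same code handles a batch of 1 or 1000 without change, which is what makes batched SNN simulation amenable to vectorized (and GPU) execution.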
no code implementations • 12 Apr 2019 • Daniel J. Saunders, Devdhar Patel, Hananel Hazan, Hava T. Siegelmann, Robert Kozma
In recent years, Spiking Neural Networks (SNNs) have demonstrated great success on a variety of machine learning tasks.
no code implementations • 24 Aug 2018 • Daniel J. Saunders, Hava T. Siegelmann, Robert Kozma, Miklós Ruszinkó
Spiking neural networks are motivated by principles of neural systems and may possess unexplored advantages in the context of machine learning.
no code implementations • 9 Aug 2018 • Adam A. Kohan, Edward A. Rietman, Hava T. Siegelmann
This mechanism, Error Forward-Propagation, is a plausible basis for how error feedback occurs deep in the brain independent of and yet in support of the functionality underlying intricate network architectures.
no code implementations • 24 Jul 2018 • Hananel Hazan, Daniel J. Saunders, Darpan T. Sanghavi, Hava T. Siegelmann, Robert Kozma
We present a system that hybridizes self-organizing map (SOM) properties with spiking neural networks (SNNs), retaining many of the features of SOMs.
1 code implementation • 4 Jun 2018 • Hananel Hazan, Daniel J. Saunders, Hassaan Khan, Darpan T. Sanghavi, Hava T. Siegelmann, Robert Kozma
In this paper, we describe a new Python package for the simulation of spiking neural networks, specifically geared towards machine learning and reinforcement learning.
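As an illustration of the kind of local, biologically inspired learning rule such simulators expose, here is a minimal pair-based spike-timing-dependent plasticity (STDP) update in plain NumPy. The function name and all parameter values are hypothetical and this is not the package's API, just a sketch of the mechanism:

```python
import numpy as np

def stdp_update(w, pre_spikes, post_spikes, pre_trace, post_trace,
                a_plus=0.01, a_minus=0.012, decay=0.95):
    """Pair-based STDP with exponentially decaying eligibility traces.

    Potentiates a synapse when its postsynaptic neuron fires while the
    presynaptic trace is still high (pre-before-post), and depresses it
    in the reverse order. Parameter values are illustrative.
    """
    pre_trace = decay * pre_trace + pre_spikes
    post_trace = decay * post_trace + post_spikes
    # Outer products pair every presynaptic neuron with every postsynaptic one.
    w = w + a_plus * np.outer(pre_trace, post_spikes)
    w = w - a_minus * np.outer(pre_spikes, post_trace)
    return np.clip(w, 0.0, 1.0), pre_trace, post_trace

rng = np.random.default_rng(3)
n_pre, n_post = 20, 10
w = np.full((n_pre, n_post), 0.5)
pre_trace, post_trace = np.zeros(n_pre), np.zeros(n_post)
for _ in range(100):
    pre = (rng.random(n_pre) < 0.1).astype(float)    # random input spikes
    post = (rng.random(n_post) < 0.1).astype(float)  # random output spikes
    w, pre_trace, post_trace = stdp_update(w, pre, post, pre_trace, post_trace)
```

The clipping keeps weights in a bounded range, a common stabilization choice in STDP simulations.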
1 code implementation • JMLR 2001 • Asa Ben-Hur, David Horn, Hava T. Siegelmann, Vladimir Vapnik
We present a novel clustering method based on the support vector machine approach.
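The core idea can be sketched as follows: data points are mapped to a kernel feature space and enclosed in a minimal sphere, and points whose feature-space distance to the sphere's center is small lie inside a cluster. The sketch below replaces the paper's quadratic program with crude uniform weights, purely for illustration:

```python
import numpy as np

def kernel_radius(x, data, q=2.0):
    """Squared feature-space distance from x to the uniform-weight centroid
    of `data` under a Gaussian kernel K(a, b) = exp(-q * ||a - b||^2).

    The actual method obtains the point weights from a quadratic program;
    uniform weights are a stand-in that still ranks points in dense regions
    as lying closer to the center of the enclosing sphere.
    """
    k_x = np.exp(-q * np.sum((data - x) ** 2, axis=1))
    k_dd = np.exp(-q * np.sum((data[:, None] - data[None, :]) ** 2, axis=2))
    return 1.0 - 2.0 * k_x.mean() + k_dd.mean()

rng = np.random.default_rng(2)
cluster = rng.normal(0.0, 0.3, size=(40, 2))  # one dense cluster at the origin
r_in = kernel_radius(np.zeros(2), cluster)
r_out = kernel_radius(np.array([3.0, 3.0]), cluster)  # far from the cluster
```

A point near the dense cluster gets a smaller radius than a distant outlier, which is the signal the clustering method thresholds to trace cluster boundaries.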