no code implementations • 30 Dec 2023 • Anders Lansner, Naresh B Ravichandran, Pawel Herman
In this paper we characterize these different aspects of associative memory performance and benchmark six different learning rules on storage capacity and prototype extraction.
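A classical baseline among such learning rules is the Hebbian outer-product rule in a Hopfield-style attractor network, whose storage capacity can be probed empirically. The sketch below is illustrative only and is not one of the six rules benchmarked in the paper; the network size, pattern count, and noise level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200   # number of binary (+/-1) units
P = 10    # patterns to store, well below the classical ~0.14*N capacity limit

patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T, no self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(x, steps=10):
    """Synchronous sign updates until a fixed point (or step limit) is reached."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1       # break ties deterministically
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Corrupt 10% of one stored pattern's bits and check that recall restores it
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
overlap = recall(probe) @ patterns[0] / N   # 1.0 means perfect retrieval
```

Measuring how `overlap` degrades as `P` grows is one simple way to estimate storage capacity for a given rule.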
no code implementations • 5 May 2023 • Naresh Ravichandran, Anders Lansner, Pawel Herman
We introduce a novel spiking neural network model that learns distributed internal representations from data in an unsupervised manner.
no code implementations • 13 Apr 2023 • Anders Lansner, Florian Fiebig, Pawel Herman
Theories and models of working memory (WM) have, since at least the mid-1990s, been dominated by the persistent activity hypothesis.
no code implementations • 30 Jun 2022 • Naresh Balaji Ravichandran, Anders Lansner, Pawel Herman
We approach this problem by combining a recurrent attractor network with a feedforward network that learns distributed representations using an unsupervised Hebbian-Bayesian learning rule.
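A Hebbian-Bayesian rule of the kind referred to here (in the BCPNN tradition) can be summarized as estimating weights from co-activation statistics, w_ij = log(P(x_i, x_j) / (P(x_i) P(x_j))), with bias b_j = log P(x_j). The following is a minimal batch sketch of that idea; the sparse random patterns, the epsilon smoothing, and the batch (rather than incremental) probability estimates are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 50, 5
X = (rng.random((P, N)) < 0.2).astype(float)   # sparse binary patterns

# Batch estimates of unit and pairwise activation probabilities,
# with a small epsilon to avoid log(0) (illustrative smoothing choice)
eps = 1e-4
p_i = (X.sum(axis=0) + eps) / (P + eps)
p_ij = (X.T @ X + eps) / (P + eps)

# Hebbian-Bayesian weights and biases as log-odds of co-activation
W = np.log(p_ij / np.outer(p_i, p_i))
b = np.log(p_i)

def support(x):
    """Log-support each unit receives given the input pattern x."""
    return b + x @ W

# Units belonging to a stored pattern should receive higher support
# than units that were inactive in it
s = support(X[0])
```

In the full model, the probabilities would be tracked incrementally with exponentially decaying traces rather than computed in batch, which is what makes the rule usable online.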
no code implementations • 29 Jun 2021 • Naresh Balaji Ravichandran, Anders Lansner, Pawel Herman
Learning internal representations from data with no or few labels is useful for machine learning research, as it allows the use of massive amounts of unlabeled data.
1 code implementation • 9 Jun 2021 • Artur Podobas, Martin Svedin, Steven W. D. Chien, Ivy B. Peng, Naresh Balaji Ravichandran, Pawel Herman, Anders Lansner, Stefano Markidis
The modern deep learning method based on backpropagation has surged in popularity and has been used in multiple domains and application areas.
no code implementations • 1 Jan 2021 • Naresh Balaji, Anders Lansner, Pawel Herman
Unsupervised learning of hidden representations has been one of the most vibrant research directions in machine learning in recent years.
no code implementations • 6 May 2020 • Naresh Balaji Ravichandran, Anders Lansner, Pawel Herman
Unsupervised learning of hidden representations has been one of the most vibrant research directions in machine learning in recent years.
no code implementations • 27 Mar 2020 • Naresh Balaji Ravichandran, Anders Lansner, Pawel Herman
Unsupervised learning of hierarchical representations has been one of the most vibrant research directions in deep learning during recent years.
no code implementations • 29 Apr 2014 • Mihai A. Petrovici, Bernhard Vogginger, Paul Müller, Oliver Breitwieser, Mikael Lundqvist, Lyle Muller, Matthias Ehrlich, Alain Destexhe, Anders Lansner, René Schüffny, Johannes Schemmel, Karlheinz Meier
Advancing the size and complexity of neural network models leads to an ever-increasing demand for the computational resources needed to simulate them.