Search Results for author: Elena Agliari

Found 13 papers, 0 papers with code

Parallel Learning by Multitasking Neural Networks

no code implementations · 8 Aug 2023 · Elena Agliari, Andrea Alessandrelli, Adriano Barra, Federico Ricci-Tersenghi

A modern challenge of Artificial Intelligence is learning multiple patterns at once (i.e., parallel learning).

Regularization, early-stopping and dreaming: a Hopfield-like setup to address generalization and overfitting

no code implementations · 1 Aug 2023 · Elena Agliari, Francesco Alemanno, Miriam Aquaro, Alberto Fachechi

In this work we approach attractor neural networks from a machine learning perspective: we look for optimal network parameters by applying gradient descent to a regularized loss function.
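The training recipe in this abstract can be sketched concretely. Below is a minimal toy illustration, not the paper's actual model: a symmetric coupling matrix is learned by gradient descent on a quadratic, L2-regularized loss that pushes each stored pattern towards being a fixed point of the network dynamics. All sizes and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 5                                 # neurons, patterns (illustrative sizes)
xi = rng.choice([-1.0, 1.0], size=(P, N))    # binary patterns to store

J = np.zeros((N, N))                         # symmetric couplings, learned below
lr, lam = 0.01, 0.001                        # learning rate, L2 strength

for _ in range(500):
    fields = xi @ J                          # local fields h_i = sum_j J_ij xi_j
    err = xi - fields                        # each pattern should be a fixed point
    # symmetrized gradient of 0.5*mean||xi - xi J||^2 + 0.5*lam*||J||^2
    grad = -(xi.T @ err + err.T @ xi) / (2 * P) + lam * J
    J -= lr * grad
    np.fill_diagonal(J, 0.0)                 # no self-couplings

# Retrieval check: one update sweep from a cue with ~10% of spins flipped
cue = xi[0] * np.where(rng.random(N) < 0.1, -1, 1)
overlap = np.sign(J @ cue) @ xi[0] / N       # ~1 means the pattern is recovered
```

With the regularization switched off, the minimizer of this loss is the projector onto the pattern subspace; the L2 term keeps the couplings bounded, which is the role the paper's title assigns to regularization and dreaming.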

Dense Hebbian neural networks: a replica symmetric picture of unsupervised learning

no code implementations · 25 Nov 2022 · Elena Agliari, Linda Albanese, Francesco Alemanno, Andrea Alessandrelli, Adriano Barra, Fosca Giannotti, Daniele Lotito, Dino Pedreschi

We consider dense associative neural networks trained without supervision and we investigate their computational capabilities analytically, via a statistical-mechanics approach, and numerically, via Monte Carlo simulations.


Dense Hebbian neural networks: a replica symmetric picture of supervised learning

no code implementations · 25 Nov 2022 · Elena Agliari, Linda Albanese, Francesco Alemanno, Andrea Alessandrelli, Adriano Barra, Fosca Giannotti, Daniele Lotito, Dino Pedreschi

We consider dense associative neural networks trained by a teacher (i.e., with supervision) and we investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations.


Pavlov Learning Machines

no code implementations · 2 Jul 2022 · Elena Agliari, Miriam Aquaro, Adriano Barra, Alberto Fachechi, Chiara Marullo

As is well known, Hebbian learning traces its origin to Pavlov's classical conditioning. However, while the former has been extensively modelled over the past decades (e.g., by the Hopfield model and countless variations on the theme), modelling of the latter has remained largely unaddressed so far; further, a bridge between these two pillars is entirely lacking.

Recurrent neural networks that generalize from examples and optimize by dreaming

no code implementations · 17 Apr 2022 · Miriam Aquaro, Francesco Alemanno, Ido Kanter, Fabrizio Durante, Elena Agliari, Adriano Barra

The gap between the huge volumes of data needed to train artificial neural networks and the relatively small amount of data needed by their biological counterparts is a central puzzle in machine learning.

Supervised Hebbian Learning

no code implementations · 2 Mar 2022 · Francesco Alemanno, Miriam Aquaro, Ido Kanter, Adriano Barra, Elena Agliari

In the neural-network literature, Hebbian learning traditionally refers to the procedure by which the Hopfield model and its generalizations store archetypes (i.e., definite patterns that are experienced just once to form the synaptic matrix).


The emergence of a concept in shallow neural networks

no code implementations · 1 Sep 2021 · Elena Agliari, Francesco Alemanno, Adriano Barra, Giordano De Marzo

We consider restricted Boltzmann machines (RBMs) trained over an unstructured dataset made of blurred copies of definite but unavailable "archetypes", and we show that there exists a critical sample size beyond which the RBM can learn the archetypes, namely the machine can successfully act as a generative model or as a classifier, according to the operational routine.
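The notion of a critical sample size can be illustrated with a back-of-the-envelope experiment; this is a hedged sketch, not the paper's RBM analysis, and all names and parameters are illustrative: generate blurred copies of a hidden archetype and check when a simple majority vote over the sample recovers it.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100                                      # archetype length (illustrative)
archetype = rng.choice([-1, 1], size=N)      # hidden, "unavailable" pattern

def blurred_copies(M, r=0.6):
    """M noisy examples: each bit agrees with the archetype w.p. (1+r)/2."""
    keep = rng.random((M, N)) < (1 + r) / 2
    return np.where(keep, archetype, -archetype)

# With few examples a bitwise majority vote is noisy; with many examples it
# recovers the archetype almost exactly -- a crude stand-in for the critical
# sample size beyond which learning succeeds.
few, many = blurred_copies(3), blurred_copies(300)
m_few = np.mean(np.sign(few.sum(axis=0)) == archetype)   # fraction of bits recovered
m_many = np.mean(np.sign(many.sum(axis=0)) == archetype)
```

The qualitative point survives the simplification: below some sample size the dataset does not pin down the archetype; above it, the signal dominates the blur.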

The relativistic Hopfield model with correlated patterns

no code implementations · 10 Mar 2021 · Elena Agliari, Alberto Fachechi, Chiara Marullo

In this work we introduce and investigate the properties of the "relativistic" Hopfield model endowed with temporally correlated patterns.

Mathematical Physics

A statistical-inference approach to reconstruct inter-cellular interactions in cell-migration experiments

no code implementations · 4 Dec 2019 · Elena Agliari, Pablo J. Sáez, Adriano Barra, Matthieu Piel, Pablo Vargas, Michele Castellana

In the first experiment, cells migrate in a wound-healing model: applied to this experiment, the inference method predicts the existence of cell-cell interactions, correctly mirroring the strong intercellular contacts present in the experiment.

Neural networks with redundant representation: detecting the undetectable

no code implementations · 28 Nov 2019 · Elena Agliari, Francesco Alemanno, Adriano Barra, Martino Centonze, Alberto Fachechi

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4.
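For readers unfamiliar with dense associative memories, the order-P energy function mentioned in this abstract can be sketched as follows. This is a minimal illustration of a generic degree-4 dense memory, not the paper's Sejnowski-machine construction; sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P = 30, 5
xi = rng.choice([-1.0, 1.0], size=(P, N))    # stored patterns

def energy(s, n=4):
    """Dense associative memory of order n: E(s) = -sum_mu (xi_mu . s / N)^n."""
    return -np.sum((xi @ s / N) ** n)

# Stored patterns sit at much deeper energy minima than random configurations;
# raising the interaction order n sharpens this separation, which is what lets
# dense networks store many more patterns than the pairwise (n=2) Hopfield model.
e_pattern = energy(xi[0])
e_random = energy(rng.choice([-1.0, 1.0], size=N))
```

At a stored pattern the corresponding overlap equals 1, so the energy is close to -1; at a random state every overlap is O(1/sqrt(N)) and the energy is nearly zero.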

Dreaming neural networks: rigorous results

no code implementations · 21 Dec 2018 · Elena Agliari, Francesco Alemanno, Adriano Barra, Alberto Fachechi

Recently a daily routine for associative neural networks has been proposed: the network Hebbian-learns during the awake state (thus behaving as a standard Hopfield model); then, during its sleep state, it optimizes information storage by consolidating pure patterns and removing spurious ones, which forces the synaptic matrix to collapse to the projector matrix (ultimately approaching the Kanter-Sompolinsky model).
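The wake/sleep dichotomy described here can be made concrete with a toy comparison of the two endpoints of the routine, Hebbian couplings versus projector couplings. This is an illustrative sketch under the usual random-pattern assumptions, with arbitrary sizes.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 40, 8
xi = rng.choice([-1.0, 1.0], size=(P, N))    # random patterns

# "Awake" (Hebbian / standard Hopfield) couplings: J_ij = (1/N) sum_mu xi^mu_i xi^mu_j
J_hebb = (xi.T @ xi) / N
np.fill_diagonal(J_hebb, 0.0)

# Fully "slept" couplings: projector onto the pattern subspace
# (the Kanter-Sompolinsky limit): J = Xi^T (Xi Xi^T)^{-1} Xi
C = xi @ xi.T / N                            # pattern correlation matrix
J_proj = xi.T @ np.linalg.inv(C) @ xi / N
np.fill_diagonal(J_proj, 0.0)

# Minimal local-field alignment over all stored bits: a positive value means
# every pattern is dynamically stable under single-spin updates.
stab_hebb = np.min(xi * (xi @ J_hebb))
stab_proj = np.min(xi * (xi @ J_proj))
```

At this load ($\alpha = P/N = 0.2$, above the Hopfield capacity) the Hebbian couplings typically leave some bits unstable, while the projector couplings stabilize every stored pattern, which is the benefit sleeping is meant to deliver.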


Dreaming neural networks: forgetting spurious memories and reinforcing pure ones

no code implementations · 29 Oct 2018 · Alberto Fachechi, Elena Agliari, Adriano Barra

The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is $\alpha \sim 0.14$, far from the theoretical bound for symmetric networks, i.e. $\alpha = 1$.
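The $\alpha \sim 0.14$ capacity threshold can be probed numerically with a quick finite-size experiment (illustrative only, not the paper's analysis): measure the fraction of bits that flip after one update when a Hebbian network is cued with a stored pattern, at loads below and above the threshold.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200                                       # network size (illustrative)

def retrieval_error(alpha):
    """Fraction of unstable bits after one update of a Hebbian Hopfield net."""
    P = max(1, int(alpha * N))
    xi = rng.choice([-1.0, 1.0], size=(P, N))
    J = (xi.T @ xi) / N                       # Hebbian couplings
    np.fill_diagonal(J, 0.0)                  # no self-couplings
    return np.mean(np.sign(xi @ J) != xi)     # one synchronous update per pattern

err_low = retrieval_error(0.05)               # load below alpha_c ~ 0.14
err_high = retrieval_error(0.5)               # load well above alpha_c
```

Below the threshold essentially no bits flip; well above it a sizeable fraction do, signalling the breakdown of retrieval that motivates the unlearning mechanisms studied in these papers.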
