Multiple Sequential Learning Tasks Represented in Recurrent Neural Networks

The brain can flexibly perform a variety of sequential learning tasks, including music, language, and mathematics, but the underlying mechanism has not been elucidated by traditional experimental and modeling studies, which were designed for only one task at a time. From a computational perspective, we hypothesize that the working mechanism of a multitask model can offer insight into that of the brain. We therefore trained a single recurrent neural network to perform eight sequential learning tasks that depend on working memory, structure extraction, categorization, and other cognitive processes. After training, the model acquires sophisticated strategies for holding and erasing information, allowing it to perform multiple tasks simultaneously. More interestingly, the model learns to reuse neurons to encode similar task features. We hope this work provides a computational platform for investigating the neural representations underlying cognitive sequential learning.
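Since no code is linked for this paper, the snippet below is only a minimal sketch of the kind of multitask training setup the abstract describes: one shared recurrent network receives a task-identity cue alongside each stimulus, so the same units can be reused across tasks. The architecture, dimensions, cue encoding, and random placeholder data are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code) of one RNN trained on
# several sequential tasks. All sizes below are hypothetical.
import torch
import torch.nn as nn

NUM_TASKS = 8        # the paper trains on eight sequential learning tasks
INPUT_DIM = 4        # hypothetical per-step stimulus dimension
HIDDEN_DIM = 128     # hypothetical recurrent layer size
OUTPUT_DIM = 4       # hypothetical per-step target dimension
SEQ_LEN, BATCH = 20, 32

class MultitaskRNN(nn.Module):
    """A single shared recurrent network; a one-hot task cue appended to
    the input tells it which task to perform on the current trial."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(INPUT_DIM + NUM_TASKS, HIDDEN_DIM, batch_first=True)
        self.readout = nn.Linear(HIDDEN_DIM, OUTPUT_DIM)

    def forward(self, x, task_id):
        # Append the one-hot task cue to the stimulus at every time step.
        cue = torch.zeros(x.size(0), x.size(1), NUM_TASKS)
        cue[:, :, task_id] = 1.0
        h, _ = self.rnn(torch.cat([x, cue], dim=-1))
        return self.readout(h)

model = MultitaskRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):
    # Interleave tasks by sampling a random task identity each batch.
    task_id = torch.randint(NUM_TASKS, (1,)).item()
    x = torch.randn(BATCH, SEQ_LEN, INPUT_DIM)    # placeholder stimuli
    y = torch.randn(BATCH, SEQ_LEN, OUTPUT_DIM)   # placeholder targets
    opt.zero_grad()
    loss_fn(model(x, task_id), y).backward()
    opt.step()
```

In a real experiment the placeholder tensors would be replaced by trial generators for the individual tasks (e.g., working-memory or categorization trials), and the trained network's hidden activity could then be analyzed for the neuron-reuse patterns the abstract reports.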
