Jan. 20, 2024, 6:22 p.m. | /u/APaperADay

r/MachineLearning | www.reddit.com

**Paper**: [https://www.nature.com/articles/s41562-023-01799-z](https://www.nature.com/articles/s41562-023-01799-z)

**Preprint version(s)**: [https://www.biorxiv.org/content/10.1101/2023.01.19.524711](https://www.biorxiv.org/content/10.1101/2023.01.19.524711)

**Code**: [https://github.com/ellie-as/generative-memory](https://github.com/ellie-as/generative-memory)

**Abstract**:

>Episodic memories are (re)constructed, share neural substrates with imagination, combine unique features with schema-based predictions and show schema-based distortions that increase with consolidation. Here we present a computational model in which hippocampal replay (from an autoassociative network) trains generative models (variational autoencoders) to (re)create sensory experiences from latent variable representations in entorhinal, medial prefrontal and anterolateral temporal cortices via the hippocampal formation. Simulations show effects of memory age and hippocampal lesions in …
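
For a concrete picture of the architecture the abstract describes, here is a minimal, illustrative sketch (not the authors' implementation; see the linked repository for that). It assumes a Hopfield-style autoassociative network that stores patterns and "replays" them via pattern completion from noisy cues, and a small variational autoencoder trained on the replayed patterns to reconstruct them from a latent representation. All class names, sizes and hyperparameters below are assumptions made for illustration.

```python
# Minimal sketch: autoassociative replay trains a generative model (VAE).
# Illustrative only; not the authors' code.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

class HopfieldMemory:
    """Hopfield-style autoassociative memory over +/-1 patterns."""
    def __init__(self, n_units):
        self.n = n_units
        self.W = np.zeros((n_units, n_units))

    def store(self, patterns):
        # One-shot Hebbian storage
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0)
        self.W /= self.n

    def replay(self, cue, steps=20):
        # Pattern completion from a noisy cue by iterated thresholding
        s = cue.copy()
        for _ in range(steps):
            s = np.sign(self.W @ s)
            s[s == 0] = 1.0
        return s

class VAE(nn.Module):
    """Small variational autoencoder trained on replayed patterns."""
    def __init__(self, n_in, n_latent=8):
        super().__init__()
        self.enc = nn.Linear(n_in, 64)
        self.mu = nn.Linear(64, n_latent)
        self.logvar = nn.Linear(64, n_latent)
        self.dec = nn.Sequential(nn.Linear(n_latent, 64), nn.ReLU(), nn.Linear(64, n_in))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation
        return self.dec(z), mu, logvar

rng = np.random.default_rng(0)
n_units, n_patterns = 64, 5
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_units))

memory = HopfieldMemory(n_units)
memory.store(patterns)

vae = VAE(n_units)
opt = torch.optim.Adam(vae.parameters(), lr=1e-3)

for step in range(2000):
    # "Replay": cue the memory with a corrupted stored pattern, get a completed pattern back
    p = patterns[rng.integers(n_patterns)].copy()
    flip = rng.random(n_units) < 0.2
    p[flip] *= -1
    replayed = torch.tensor(memory.replay(p), dtype=torch.float32).unsqueeze(0)

    # Train the generative model on the replayed "experience"
    recon, mu, logvar = vae(replayed)
    recon_loss = F.mse_loss(recon, replayed)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = recon_loss + 0.1 * kl
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, decoding from the VAE's latent space stands in for schema-based (re)construction of a memory, while the Hopfield network supplies the replayed training signal, which is the division of labour the abstract attributes to the hippocampal and cortical systems.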
