April 22, 2024, 10:08 a.m. | /u/AIRI_Institute

Machine Learning | www.reddit.com

The researchers segmented the sequence and added special memory tokens to the input: memory states produced at the output of one segment become inputs for the next. Thus, the whole transformer acts as a recurrent cell, and the memory serves as the recurrent state of the network. This approach was called the Recurrent Memory Transformer (RMT).
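A minimal sketch of that recurrence, not the authors' implementation: a stack of transformer layers is applied to [memory tokens; segment], and the updated memory read off the output is passed to the next segment. The layer sizes, the prepend-only memory placement, and the use of `torch.nn.TransformerEncoder` are assumptions for illustration.

```python
import torch
import torch.nn as nn


class RecurrentMemoryTransformerSketch(nn.Module):
    """Hypothetical sketch of the recurrent-memory idea described above."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2, n_mem_tokens=4):
        super().__init__()
        self.n_mem = n_mem_tokens
        # Learnable initial memory tokens: the "recurrent state" at step 0 (assumed).
        self.init_memory = nn.Parameter(torch.randn(n_mem_tokens, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward_segment(self, segment, memory):
        # segment: (batch, seg_len, d_model); memory: (batch, n_mem, d_model)
        x = torch.cat([memory, segment], dim=1)   # prepend memory tokens to the segment
        y = self.encoder(x)                       # one full transformer pass = one "recurrent cell" step
        new_memory = y[:, : self.n_mem]           # read the updated memory from the output
        segment_out = y[:, self.n_mem :]
        return segment_out, new_memory

    def forward(self, segments):
        # segments: list of (batch, seg_len, d_model) chunks of one long input.
        batch = segments[0].size(0)
        memory = self.init_memory.unsqueeze(0).expand(batch, -1, -1)
        outputs = []
        for seg in segments:
            out, memory = self.forward_segment(seg, memory)  # memory carried across segments
            outputs.append(out)
        return torch.cat(outputs, dim=1), memory


# Toy usage: a long sequence split into three segments of 16 token embeddings each.
model = RecurrentMemoryTransformerSketch()
segments = [torch.randn(2, 16, 64) for _ in range(3)]
out, final_memory = model(segments)
print(out.shape, final_memory.shape)  # (2, 48, 64), (2, 4, 64)
```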

The authors augmented small transformer models like BERT and GPT-2 with this memory and tested them on various question-answering tasks where facts needed for answering …

