May 13, 2024, 4:42 a.m. | Jianyu Zhang, Niklas Nolte, Ranajoy Sadhukhan, Beidi Chen, Léon Bottou

cs.LG updates on arXiv.org

arXiv:2405.06394v1 Announce Type: new
Abstract: Memory Mosaics are networks of associative memories working in concert to carry out a prediction task of interest. Like transformers, memory mosaics possess compositional and in-context learning capabilities. Unlike transformers, memory mosaics achieve these capabilities in comparatively transparent ways. We demonstrate these capabilities on toy examples, and we also show that memory mosaics perform as well as or better than transformers on medium-scale language modeling tasks.
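
To make the central idea of "associative memories working in concert" concrete, here is a minimal sketch. It assumes each memory is a key-value store queried by softmax-weighted similarity, and that the memories' retrievals are simply averaged into a prediction; the class name `AssociativeMemory`, the bandwidth parameter `beta`, and the averaging scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class AssociativeMemory:
    """Toy associative memory: stores (key, value) pairs and retrieves
    a value by softmax-weighted averaging over the stored keys."""

    def __init__(self, beta: float = 1.0):
        self.beta = beta                      # similarity sharpness (illustrative)
        self.keys: list[np.ndarray] = []
        self.values: list[np.ndarray] = []

    def store(self, key: np.ndarray, value: np.ndarray) -> None:
        self.keys.append(key)
        self.values.append(value)

    def retrieve(self, query: np.ndarray) -> np.ndarray:
        K = np.stack(self.keys)               # (n, d) stored keys
        V = np.stack(self.values)             # (n, d_v) stored values
        scores = self.beta * (K @ query)      # similarity of query to each key
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()              # softmax over stored items
        return weights @ V                    # weighted average of values

# Several memories "working in concert": average their retrievals to form
# a prediction (a simplistic stand-in for the paper's combination scheme).
memories = [AssociativeMemory(beta=b) for b in (0.5, 1.0, 2.0)]
rng = np.random.default_rng(0)
for m in memories:
    for _ in range(16):
        k = rng.normal(size=8)
        m.store(k, np.tanh(k))                # toy key -> value relation
query = rng.normal(size=8)
prediction = np.mean([m.retrieve(query) for m in memories], axis=0)
print(prediction.shape)                       # (8,)
```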
