April 18, 2024, noon | code_your_own_AI


The Latest in AI research: The video introduces TransformerFAM (Feedback Attention Memory) by @Google, a novel architecture designed to enhance Transformers by incorporating a feedback mechanism that emulates working memory.
It also introduces Transformer BSWA (Block Sliding Window Attention).

The design builds on Ring Attention by @UCBerkeley.

This design allows the Transformer to maintain awareness of its own latent representations across different blocks of data, improving its ability to process indefinitely long sequences without additional computational …
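The block-wise feedback idea described above can be illustrated with a toy numpy sketch. This is an assumption-laden simplification, not the paper's implementation: each block attends to the previous block (the sliding window), the current block, and a small set of feedback memory vectors; the memory is then updated by attending from the memory to the block, emulating the feedback loop. The function name `fam_bswa`, the random memory initialization, and all sizes are hypothetical choices for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # plain scaled dot-product attention
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores) @ v

def fam_bswa(x, block_size=4, n_mem=2):
    """Toy sketch of Feedback Attention Memory over Block Sliding Window Attention.

    Each block's queries see: the previous block (sliding window),
    the current block, and the feedback memory. The memory is then
    refreshed by attending from the memory vectors to the block,
    so information can persist across an indefinitely long sequence
    at constant per-block cost.
    """
    d = x.shape[-1]
    rng = np.random.default_rng(0)
    # the paper learns the memory initialization; random here for illustration
    mem = rng.standard_normal((n_mem, d)) * 0.02
    outputs = []
    prev_block = np.zeros((0, d))
    for start in range(0, len(x), block_size):
        block = x[start:start + block_size]
        # keys/values: previous block + current block + feedback memory
        kv = np.concatenate([prev_block, block, mem], axis=0)
        outputs.append(attend(block, kv, kv))
        # feedback step: memory attends to the block and its old self
        mem_kv = np.concatenate([block, mem], axis=0)
        mem = attend(mem, mem_kv, mem_kv)
        prev_block = block
    return np.concatenate(outputs, axis=0)
```

Because each step touches only one block, the window, and a fixed-size memory, the per-block compute and memory stay constant regardless of total sequence length, which is the property the video highlights.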

