April 18, 2024, noon | code_your_own_AI

Source: code_your_own_AI (www.youtube.com)

The Latest in AI research: The video introduces TransformerFAM (Feedback Attention Memory) by @Google, a novel architecture designed to enhance Transformers by incorporating a feedback mechanism that emulates working memory.
It also introduces the new Transformer BSWA (Block Sliding Window Attention).

Based on ring attention by @UCBerkeley

This design allows the Transformer to maintain awareness of its own latent representations across different blocks of data, improving its ability to process indefinitely long sequences without additional computational …
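The idea of carrying latent representations across blocks can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the block size, memory size, and the memory-update rule below are illustrative assumptions. Each block attends to itself, a sliding window over the previous block (BSWA-style), and a small feedback memory that is updated after every block (TransformerFAM-style), so the per-step cost stays constant regardless of sequence length.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # plain scaled dot-product attention
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def fam_forward(x, block_size=4, mem_size=2):
    """Process a long sequence block by block.

    Sketch of the mechanism described above: each block attends to the
    feedback memory, the previous block (sliding window), and itself;
    the memory is then refreshed by attending to the block plus its own
    old state, feeding latent information forward to later blocks.
    """
    d = x.shape[-1]
    memory = np.zeros((mem_size, d))       # feedback memory, fixed size
    prev_block = np.zeros((0, d))          # sliding window of one block
    outputs = []
    for start in range(0, len(x), block_size):
        block = x[start:start + block_size]
        # BSWA context: memory + previous block + current block
        context = np.concatenate([memory, prev_block, block], axis=0)
        outputs.append(attend(block, context, context))
        # feedback update (assumed rule): memory attends to [memory, block]
        mem_context = np.concatenate([memory, block], axis=0)
        memory = attend(memory, mem_context, mem_context)
        prev_block = block
    return np.concatenate(outputs, axis=0)
```

Because the context never exceeds `mem_size + 2 * block_size` rows, memory and compute per block are constant no matter how long the input sequence grows, which is the property the video highlights.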

