April 11, 2024, 8:30 p.m. | /u/tfburns

r/MachineLearning (www.reddit.com)

Mechanically, attention appears to perform *hetero-association*: a query retrieves values stored under different keys. In principle, though, it can *mix auto- and hetero-association* together.
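A minimal NumPy sketch of the distinction (my own toy construction; the function names and the `beta` sharpness parameter are illustrative, not from the paper). With keys equal to values, attention cleans up a noisy query toward a stored pattern (auto-association); with values set to a *shifted* copy of the patterns, the same mechanism retrieves a *different* pattern (hetero-association):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V, beta=10.0):
    # Softmax retrieval: compare queries to keys, return weighted values.
    return softmax(beta * Q @ K.T) @ V

rng = np.random.default_rng(0)
patterns = rng.standard_normal((4, 16))
patterns /= np.linalg.norm(patterns, axis=1, keepdims=True)

# Auto-association: keys == values == stored patterns, so a noisy
# query is pulled back toward the nearest stored pattern.
noisy = patterns[2] + 0.1 * rng.standard_normal(16)
recalled = attention(noisy[None, :], patterns, patterns)

# Hetero-association: values are the patterns rolled by one, so
# querying with pattern i retrieves pattern i+1 (a transition).
next_patterns = np.roll(patterns, -1, axis=0)
successor = attention(patterns[2][None, :], patterns, next_patterns)
```

Mixing the two is then just a matter of how the key/value pairs are populated: some pairs map a pattern to itself, others map it to a different pattern.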

Question: What abilities does this permit?
Answer: A lot!

**Finite automata**

By assigning neural activities to image or text data and converting their combinations into auto-associating attractors (states) or hetero-associating quasi-attractors (transitions), we can simulate finite automata.

(see section 3.4 and appendix A12 of the paper linked below)

[ An example of mapping a finite automaton to a 'memory graph'. ](https://i.redd.it/gv1ob2eqswtc1.gif)
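To make the states-and-transitions idea concrete, here is a small NumPy sketch (my own toy construction, not the paper's exact memory-graph machinery): a two-state automaton tracking the parity of `a`s in a word, where each transition is a hetero-associative lookup keyed on the current state vector concatenated with the input-symbol vector. The `beta` sharpness parameter is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def randvec(dim=32):
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy automaton over {a, b} tracking the parity of 'a's seen so far:
# 'a' flips the state, 'b' leaves it alone.
states = {name: randvec() for name in ("even", "odd")}
symbols = {ch: randvec() for ch in ("a", "b")}
transitions = {("even", "a"): "odd",  ("odd", "a"): "even",
               ("even", "b"): "even", ("odd", "b"): "odd"}

# Hetero-associative store: key = [state ; symbol], value = next-state vector.
keys = np.stack([np.concatenate([states[s], symbols[c]])
                 for (s, c) in transitions])
values = np.stack([states[nxt] for nxt in transitions.values()])

def step(state_vec, ch, beta=16.0):
    """One transition: attend over the stored (state, symbol) keys."""
    q = np.concatenate([state_vec, symbols[ch]])
    return softmax(beta * (keys @ q)) @ values

def run(word, start="even"):
    v = states[start]
    for ch in word:
        v = step(v, ch)
    names = list(states)
    return names[int(np.argmax([v @ states[n] for n in names]))]
```

Here the states act as (quasi-)fixed points retrieved by similarity; sharpening `beta` makes each retrieval closer to a hard lookup and the simulation correspondingly more exact.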

**Multi-scale graph representations** …

