Feb. 29, 2024, 10:48 a.m. | /u/SunsetOneSix

Natural Language Processing www.reddit.com

**EMNLP 2023**: [https://aclanthology.org/2023.emnlp-main.772/](https://aclanthology.org/2023.emnlp-main.772/)

**arXiv**: [https://arxiv.org/abs/2305.14952](https://arxiv.org/abs/2305.14952)

**OpenReview**: [https://openreview.net/forum?id=DlQeSfGYfS](https://openreview.net/forum?id=DlQeSfGYfS)

**Abstract**:

>We present a new layer in which dynamic (i.e., input-dependent) **Infinite Impulse Response** (**IIR**) **filters** of order two are used to process the input sequence prior to applying conventional attention. The input is split into chunks, and the coefficients of these filters are determined based on previous chunks to maintain causality. Despite their relatively low order, the causal adaptive filters are shown to focus attention on the relevant sequence elements. The new …
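To make the mechanism concrete, here is a minimal NumPy sketch of the idea from the abstract: a causal order-2 IIR filter whose coefficients for each chunk are predicted from the previous chunk. The function names (`adaptive_iir`, `toy_coeffs`), the identity-filter initialization, and the tanh coefficient parameterization are illustrative assumptions, not the paper's parameterization; in the paper the coefficient predictor is learned and the filtered sequence then feeds into conventional attention.

```python
import numpy as np

def adaptive_iir(x, chunk_len, predict_coeffs):
    """Causal order-2 adaptive IIR filtering (sketch).

    y[t] = b0 * x[t] + a1 * y[t-1] + a2 * y[t-2]

    The sequence is split into chunks; the coefficients used on chunk i
    are predicted from chunk i-1, so every output depends only on past input.
    """
    y = np.zeros_like(x, dtype=float)
    a1, a2, b0 = 0.0, 0.0, 1.0      # identity filter for the first chunk (assumption)
    y1 = y2 = 0.0                   # filter state carried across chunk boundaries
    for start in range(0, len(x), chunk_len):
        chunk = x[start:start + chunk_len]
        for t, xt in enumerate(chunk):
            yt = b0 * xt + a1 * y1 + a2 * y2
            y[start + t] = yt
            y1, y2 = yt, y1
        # Coefficients for the *next* chunk come from the current one,
        # which is what keeps the layer causal.
        a1, a2, b0 = predict_coeffs(chunk)
    return y

def toy_coeffs(chunk):
    """Hypothetical coefficient predictor; the paper learns this mapping.

    tanh bounds the coefficients as a crude stability guard.
    """
    m = float(np.mean(chunk))
    return np.tanh(m), -0.5 * np.tanh(m) ** 2, 1.0

if __name__ == "__main__":
    x = np.sin(np.linspace(0, 8 * np.pi, 256))
    y = adaptive_iir(x, chunk_len=32, predict_coeffs=toy_coeffs)
    print(y[:5])
```

Even at order two, the recursion gives each output an infinite (exponentially decaying) receptive field over past inputs, which is the property the paper exploits to pre-focus attention on relevant sequence elements.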
