Feb. 14, 2024, 5:43 a.m. | Gbètondji J-S Dovonon, Michael M. Bronstein, Matt J. Kusner

cs.LG updates on arXiv.org

Transformer-based models have recently become wildly successful across a diverse set of domains. At the same time, recent work has argued that Transformers are inherently low-pass filters that gradually oversmooth the inputs. This is worrisome as it limits generalization, especially as model depth increases. A natural question is: How can Transformers achieve these successes given this shortcoming? In this work we show that in fact Transformers are not inherently low-pass filters. Instead, whether Transformers oversmooth or not depends on the …
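The oversmoothing claim at issue is the standard observation that a self-attention layer mixes tokens through a row-stochastic (softmax) matrix, i.e. each output token is a convex combination of the input tokens, so repeated layers pull token representations toward a common point. The following is a minimal, self-contained sketch of that effect (not the paper's method or experiments): it applies an untrained dot-product attention mix repeatedly and tracks how far the tokens spread from their mean, which shrinks with depth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n tokens with d-dimensional representations (random, untrained).
n, d = 16, 32
X = rng.normal(size=(n, d))

def attention_mix(X):
    """One dot-product self-attention mixing step.

    The softmax scores form a row-stochastic matrix A, so each output token
    is a convex combination (weighted average) of the input tokens.
    """
    scores = X @ X.T / np.sqrt(X.shape[-1])
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)
    return A @ X

def token_spread(X):
    """High-frequency energy: total deviation of tokens from the mean token."""
    return np.linalg.norm(X - X.mean(axis=0, keepdims=True))

for layer in range(8):
    print(f"layer {layer:2d}  token spread = {token_spread(X):.4f}")
    X = attention_mix(X)
```

Because every softmax weight is strictly positive, each step strictly shrinks the convex hull of the token set unless the tokens are already identical, which is why the printed spread decreases monotonically. This is the "low-pass filter" behaviour the abstract refers to; the paper's contribution is showing that this behaviour is not inherent but depends on the model's parameterization.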
