Waveformer: Linear-Time Attention with Forward and Backward Wavelet Transform. (arXiv:2210.01989v1 [cs.CL])
Oct. 6, 2022, 1:16 a.m. | Yufan Zhuang, Zihan Wang, Fangbo Tao, Jingbo Shang
cs.CL updates on arXiv.org arxiv.org
We propose Waveformer, which learns the attention mechanism in the wavelet
coefficient space, requires only linear time complexity, and enjoys universal
approximation power. Specifically, we first apply a forward wavelet transform to
project the input sequences onto multi-resolution orthogonal wavelet bases, then
conduct nonlinear transformations (in this case, a random feature kernel) in
the wavelet coefficient space, and finally reconstruct the representation in the
input space via a backward wavelet transform. We note that other nonlinear
transformations may be used, hence we name the …
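The three steps of the abstract (forward wavelet transform, nonlinear map in coefficient space, backward transform) can be sketched in a few lines of NumPy. This is only an illustrative sketch, not the paper's implementation: the Haar wavelet, the ReLU random-feature map, and the projection `P` are all assumed stand-ins for the paper's choices.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar wavelet matrix for n a power of two (recursive construction)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # averaging (low-pass) rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])  # differencing (high-pass) rows
    return np.vstack([top, bottom]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
n, d, m = 8, 4, 16            # sequence length, model dim, random-feature dim
X = rng.normal(size=(n, d))   # token representations

# 1) Forward wavelet transform: project the sequence onto orthogonal wavelet bases.
W = haar_matrix(n)
C = W @ X

# 2) Nonlinear transformation in coefficient space: here a ReLU random-feature
#    map (a hypothetical stand-in for the paper's random feature kernel).
P = rng.normal(size=(d, m)) / np.sqrt(m)
C_mixed = np.maximum(C @ P, 0.0) @ P.T

# 3) Backward wavelet transform: W is orthogonal, so its transpose inverts it.
Y = W.T @ C_mixed

# Sanity checks: the transform is orthonormal, and without the nonlinearity
# the forward/backward pair reconstructs the input exactly.
assert np.allclose(W @ W.T, np.eye(n))
assert np.allclose(W.T @ (W @ X), X)
```

Each step here is a dense matrix multiply for clarity; in practice the fast wavelet transform computes the same projection in near-linear time, which is what gives the method its linear-time character.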