May 8, 2024, 4:47 a.m. | Zhongren Dong, Zixing Zhang, Weixiang Xu, Jing Han, Jianjun Ou, Björn W. Schuller

cs.CL updates on arXiv.org

arXiv:2405.03952v1 Announce Type: cross
Abstract: Automatically detecting Alzheimer's Disease (AD) from spontaneous speech plays an important role in its early diagnosis. Recent approaches rely heavily on Transformer architectures because of their efficiency in modelling long-range context dependencies. However, the computational complexity of self-attention grows quadratically with audio length, which poses a challenge when deploying such models on edge devices. In this context, we construct a novel framework, namely the Hierarchical Attention-Free Transformer (HAFFormer), to better deal with …
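The complexity argument is the crux: standard self-attention compares every audio frame with every other frame, so cost grows quadratically with recording length, whereas attention-free mixing reduces this to a single pass over the sequence. As a rough illustration only (the abstract is truncated here and HAFFormer's actual layer design is not reproduced), the sketch below implements an AFT-simple-style attention-free mixing block in PyTorch; the class name, dimensions, and the choice of AFT-simple are assumptions made for illustration, not the paper's method.

```python
import torch
import torch.nn as nn

class AFTSimple(nn.Module):
    """Illustrative attention-free token mixing (AFT-simple style).

    Hypothetical sketch, not HAFFormer's layer: cost grows linearly
    with sequence length instead of quadratically as in self-attention.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) acoustic frame embeddings
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        # Softmax over the time axis gives per-feature mixing weights,
        # then a single weighted sum over time: O(seq_len * dim), not O(seq_len^2).
        weights = torch.softmax(k, dim=1)
        context = (weights * v).sum(dim=1, keepdim=True)  # (batch, 1, dim)
        # Gate the shared context per frame and broadcast it back.
        return torch.sigmoid(q) * context

# Usage with hypothetical dimensions (e.g. ~30 s of audio at 50 frames/s):
frames = torch.randn(2, 1500, 256)
out = AFTSimple(256)(frames)
print(out.shape)  # torch.Size([2, 1500, 256])
```

With a mixer like this, doubling the clip length roughly doubles the mixing cost, which is the property that makes deployment on edge devices more plausible than with quadratic self-attention.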
