MelHuBERT: A simplified HuBERT on Mel spectrogram. (arXiv:2211.09944v1 [cs.CL])
Nov. 21, 2022, 2:15 a.m. | Tzu-Quan Lin, Hung-yi Lee, Hao Tang
cs.CL updates on arXiv.org arxiv.org
Self-supervised models have had great success in learning speech
representations that can generalize to various downstream tasks. HuBERT, in
particular, achieves strong performance while being relatively simple in
training compared to others. The original experimental setting, however, is
computationally expensive, hindering the reproducibility of the models. It is
also unclear why certain design decisions were made, such as the ad-hoc loss
function, and whether these decisions have an impact on the learned
representations. We propose MelHuBERT, a simplified version of HuBERT …
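As its name suggests, MelHuBERT takes log-Mel spectrograms as input rather than raw waveforms. The sketch below shows how such features are conventionally computed; the specific frame size, hop length, and number of Mel bins are common defaults and are assumptions here, not the paper's exact configuration.

```python
# Illustrative log-Mel spectrogram extraction in plain NumPy.
# Parameters (16 kHz audio, 25 ms window, 10 ms hop, 40 Mel bins)
# are typical speech defaults, not necessarily MelHuBERT's settings.
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # Triangular filters spaced evenly on the Mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):
            fb[i - 1, k] = (k - l) / max(c - l, 1)
        for k in range(c, r):
            fb[i - 1, k] = (r - k) / max(r - c, 1)
    return fb

def log_mel_spectrogram(wav, sr=16000, n_fft=400, hop=160, n_mels=40):
    # Frame the waveform, apply a Hann window, take the power spectrum,
    # pool with Mel filters, and compress with a log.
    window = np.hanning(n_fft)
    n_frames = 1 + (len(wav) - n_fft) // hop
    frames = np.stack([wav[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2
    mel = power @ mel_filterbank(n_mels, n_fft, sr).T
    return np.log(mel + 1e-10)

# Example: one second of a 440 Hz tone at 16 kHz.
wav = np.sin(2 * np.pi * 440.0 * np.arange(16000) / 16000.0)
feats = log_mel_spectrogram(wav)
print(feats.shape)  # (frames, mel bins)
```

Feeding such pre-computed Mel frames to the Transformer, instead of learning features from raw waveforms with a convolutional front-end, is one of the simplifications the abstract alludes to.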