Non-autoregressive Transformer-based End-to-end ASR using BERT. (arXiv:2104.04805v3 [cs.CL] UPDATED)
May 19, 2022, 1:11 a.m. | Fu-Hao Yu, Kuan-Yu Chen
cs.CL updates on arXiv.org
Transformer-based models have led to significant innovation in classical and
practical subjects as varied as speech processing, natural language processing,
and computer vision. On top of the Transformer, attention-based end-to-end
automatic speech recognition (ASR) models have recently become popular.
Specifically, non-autoregressive modeling, which boasts fast inference and
performance comparable to conventional autoregressive methods, is an emerging
research topic. In the context of natural language processing, the
bidirectional encoder representations from Transformers (BERT) model has
received widespread attention, partially due to …
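The speed advantage of non-autoregressive decoding mentioned in the abstract comes from predicting all output positions in one parallel pass instead of one token at a time. A toy Python sketch of the contrast, where `predict_next` and `predict_all` are hypothetical stand-ins for a trained model (not the paper's actual BERT-based ASR system):

```python
# Toy contrast between autoregressive and non-autoregressive decoding.
# `predict_next` / `predict_all` are hypothetical model stand-ins.

def predict_next(prefix):
    # Hypothetical model call: the next token depends on the full prefix,
    # so calls must happen sequentially.
    return f"tok{len(prefix)}"

def predict_all(length):
    # Hypothetical model call: each position is predicted independently,
    # so all `length` tokens could be computed in parallel.
    return [f"tok{i}" for i in range(length)]

def autoregressive_decode(length):
    out = []
    for _ in range(length):          # `length` sequential model calls
        out.append(predict_next(out))
    return out

def non_autoregressive_decode(length):
    return predict_all(length)       # a single (parallelizable) model call
```

In this toy setting both decoders produce the same sequence, but the non-autoregressive version needs only one model invocation; real non-autoregressive ASR models trade this speed for the harder problem of modeling inter-token dependencies without a left-to-right prefix.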