April 16, 2024, 4:45 a.m. | Zijian Yang, Wei Zhou, Ralf Schlüter, Hermann Ney

cs.LG updates on arXiv.org

arXiv:2309.14130v2 Announce Type: replace-cross
Abstract: Internal language model (ILM) subtraction has been widely applied to improve the performance of the RNN-Transducer with external language model (LM) fusion for speech recognition. In this work, we show that sequence discriminative training has a strong correlation with ILM subtraction from both theoretical and empirical points of view. Theoretically, we derive that the global optimum of maximum mutual information (MMI) training shares a similar formula as ILM subtraction. Empirically, we show that ILM subtraction …

