April 16, 2024, 4:45 a.m. | Zijian Yang, Wei Zhou, Ralf Schlüter, Hermann Ney

cs.LG updates on arXiv.org

arXiv:2309.14130v2 Announce Type: replace-cross
Abstract: Internal language model (ILM) subtraction has been widely applied to improve the performance of the RNN-Transducer with external language model (LM) fusion for speech recognition. In this work, we show that sequence discriminative training has a strong correlation with ILM subtraction from both theoretical and empirical points of view. Theoretically, we derive that the global optimum of maximum mutual information (MMI) training shares a similar formula as ILM subtraction. Empirically, we show that ILM subtraction …
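The fusion described in the abstract is typically a log-linear combination of scores, where the internal LM estimate is subtracted so the external LM is not double-counted. A minimal sketch of that score combination, with illustrative scale values (the function name, the concrete scores, and the scales below are all hypothetical, not taken from the paper):

```python
def fuse_scores(log_p_asr, log_p_ext_lm, log_p_ilm,
                lm_scale=0.6, ilm_scale=0.4):
    """Log-linear score combination with ILM subtraction.

    log_p_asr    : RNN-T log-probability of a hypothesis given audio
    log_p_ext_lm : external LM log-probability of the hypothesis
    log_p_ilm    : estimated internal-LM log-probability
    The scales here are illustrative; in practice they are tuned on dev data.
    """
    return log_p_asr + lm_scale * log_p_ext_lm - ilm_scale * log_p_ilm

# Rescoring two hypothetical hypotheses: subtracting the ILM score removes
# the transducer's implicit language-model bias before adding the external LM.
hyps = {
    "hyp_a": fuse_scores(-5.0, -3.0, -2.5),   # -5.0 - 1.8 + 1.0 = -5.8
    "hyp_b": fuse_scores(-5.2, -2.0, -3.0),   # -5.2 - 1.2 + 1.2 = -5.2
}
best = max(hyps, key=hyps.get)                # "hyp_b" wins after fusion
```

The paper's theoretical result is that the global optimum of MMI training yields a decoding rule of a similar subtractive form, which is why sequence discriminative training and explicit ILM subtraction behave alike empirically.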

