May 16, 2024, 4:42 a.m. | Bushi Xiao, Chao Gao, Demi Zhang

cs.LG updates on arXiv.org arxiv.org

arXiv:2405.09508v1 Announce Type: cross
Abstract: This study evaluates how well Recurrent Neural Network (RNN) and Transformer models replicate cross-language structural priming, a key indicator of abstract grammatical representations in human language processing. Focusing on Chinese-English priming, which involves two typologically distinct languages, we examine how these models handle the robust phenomenon of structural priming, in which exposure to a particular sentence structure increases the likelihood of producing a similar structure subsequently. Additionally, we utilize large language models (LLMs) to measure …
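The abstract is truncated, but a common way to quantify structural priming with a language model (a generic sketch here, not necessarily the authors' exact metric) is to score the same target sentence after a structurally congruent prime versus an incongruent one and take the log-probability difference:

```python
def priming_effect(logp_congruent: float, logp_incongruent: float) -> float:
    """Log-probability difference for the same target structure
    (e.g., an English passive) after a structurally congruent prime
    (e.g., a Chinese passive) versus an incongruent prime.
    A positive value indicates a structural priming effect."""
    return logp_congruent - logp_incongruent

# Hypothetical log-probabilities for one English passive target
# as scored by a bilingual model under two different Chinese primes:
effect = priming_effect(logp_congruent=-42.1, logp_incongruent=-44.8)
print(effect > 0)  # positive -> the congruent prime boosted the target structure
```

Aggregating this difference over many prime-target pairs gives a corpus-level priming score that can be compared across architectures.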

