March 25, 2024, 4:47 a.m. | Fatemeh Lalegani, Eric De Giuli

cs.CL updates on arXiv.org

arXiv:2309.14913v2 Announce Type: replace-cross
Abstract: The Random Language Model (De Giuli 2019) is an ensemble of stochastic context-free grammars, quantifying the syntax of human and computer languages. The model suggests a simple picture of first language learning as a type of annealing in the vast space of potential languages. In its simplest formulation, it implies a single continuous transition to grammatical syntax, at which the symmetry among potential words and categories is spontaneously broken. Here this picture is scrutinized by …
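The ensemble described above — stochastic context-free grammars with random rule weights — can be illustrated with a minimal sketch. This is not the paper's code; the grammar size, symbols, and termination cutoff are arbitrary choices made here for illustration. It draws one random grammar in Chomsky normal form and samples a sentence from it.

```python
import random

random.seed(0)

# Hypothetical, tiny alphabet chosen for illustration only.
NONTERMINALS = ["S", "A", "B"]
TERMINALS = ["a", "b"]

def random_grammar():
    """Draw one member of an ensemble of stochastic CFGs:
    each nonterminal gets random normalized weights over all
    binary expansions X -> Y Z and terminal emissions X -> w."""
    rules = {}
    for nt in NONTERMINALS:
        expansions = [(y, z) for y in NONTERMINALS for z in NONTERMINALS]
        emissions = [(w,) for w in TERMINALS]
        options = expansions + emissions
        weights = [random.random() for _ in options]
        total = sum(weights)
        rules[nt] = (options, [w / total for w in weights])
    return rules

def sample(rules, symbol="S", depth=0, max_depth=8):
    """Expand a symbol recursively; force a terminal emission
    past max_depth so sampling always terminates."""
    if symbol in TERMINALS:
        return [symbol]
    if depth >= max_depth:
        return [random.choice(TERMINALS)]
    options, weights = rules[symbol]
    choice = random.choices(options, weights)[0]
    return [tok for child in choice
            for tok in sample(rules, child, depth + 1, max_depth)]

grammar = random_grammar()
sentence = " ".join(sample(grammar))
print(sentence)
```

Varying how sharply the rule weights are peaked is one way to picture the annealing the abstract describes: nearly uniform weights give unstructured output, while concentrated weights single out a definite grammar.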

