April 25, 2024, 5:44 p.m. | Chenxuan Cui, Ying Chen, Qinxin Wang, David R. Mortensen

cs.CL updates on arXiv.org

arXiv:2404.15690v1 Announce Type: new
Abstract: Proto-form reconstruction has long been a painstaking process for linguists. Recently, computational models such as RNNs and Transformers have been proposed to automate it. We take three different approaches to improving upon previous methods: data augmentation to recover missing reflexes, adding a VAE structure to the Transformer model for proto-to-language prediction, and using a neural machine translation model for the reconstruction task. We find that with the additional VAE structure, the Transformer model has …
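The abstract does not give implementation details, but the core mechanism behind "adding a VAE structure" to a sequence model is the reparameterization trick plus a KL regularizer over a latent bottleneck. The sketch below illustrates only that generic VAE component (all names and dimensions are hypothetical, not taken from the paper):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps, eps ~ N(0, I): makes sampling differentiable w.r.t. mu, sigma
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    # KL(N(mu, sigma^2) || N(0, I)), summed over latent dims, averaged over the batch;
    # added to the reconstruction loss when training the VAE-augmented model
    return float(np.mean(-0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var), axis=-1)))

rng = np.random.default_rng(0)
mu = np.zeros((4, 8))       # hypothetical batch of 4 encoder summaries, latent dim 8
log_var = np.zeros((4, 8))  # log-variance 0 => unit variance
z = reparameterize(mu, log_var, rng)  # latent codes fed to the decoder
```

In a Transformer-based setup, `mu` and `log_var` would typically be predicted from the encoder output, and `z` would condition the decoder; the exact wiring used in the paper is not stated in this excerpt.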

