Self-Augmented In-Context Learning for Unsupervised Word Translation
Feb. 16, 2024, 5:43 a.m. | Yaoyiran Li, Anna Korhonen, Ivan Vulić
cs.LG updates on arXiv.org
Abstract: Recent work has shown that, while large language models (LLMs) demonstrate strong word translation or bilingual lexicon induction (BLI) capabilities in few-shot setups, they still cannot match the performance of 'traditional' mapping-based approaches in the unsupervised scenario where no seed translation pairs are available, especially for lower-resource languages. To address this challenge with LLMs, we propose self-augmented in-context learning (SAIL) for unsupervised BLI: starting from a zero-shot prompt, SAIL iteratively induces a set of high-confidence …
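The abstract describes SAIL as an iterative loop: begin zero-shot, collect high-confidence translation pairs, and feed them back into the prompt as in-context examples. A minimal sketch of that loop, assuming a hypothetical `llm_translate` callable that takes a source word plus the current example set and returns a (translation, confidence) pair (the actual prompting and confidence estimation in the paper may differ):

```python
def sail_bli(llm_translate, source_words, n_rounds=2, threshold=0.9):
    """Sketch of the SAIL loop for unsupervised BLI.

    llm_translate: hypothetical stand-in for an LLM call; given a
        source word and a list of (source, target) in-context
        examples, returns (target_word, confidence).
    Starts from a zero-shot setting (no examples) and iteratively
    replaces the example set with the high-confidence pairs induced
    in the previous round.
    """
    examples = []  # zero-shot: no seed translation pairs
    for _ in range(n_rounds):
        new_examples = []
        for word in source_words:
            target, confidence = llm_translate(word, examples)
            if confidence >= threshold:
                # keep only high-confidence pairs for the next prompt
                new_examples.append((word, target))
        examples = new_examples
    return dict(examples)


# Toy usage with a mock "LLM" that becomes more confident once
# in-context examples are available (purely illustrative).
def mock_llm(word, examples):
    lexicon = {"dog": "perro", "cat": "gato", "house": "casa"}
    confidence = 0.95 if examples or word == "dog" else 0.8
    return lexicon[word], confidence


induced = sail_bli(mock_llm, ["dog", "cat", "house"])
```

In the toy run, only "dog" clears the threshold zero-shot; once it enters the prompt as an example, the remaining words are induced in the next round, illustrating the self-augmentation idea.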