Language Model Knowledge Distillation for Efficient Question Answering in Spanish
March 19, 2024, 4:45 a.m. | Adrián Bazaga, Pietro Liò, Gos Micklem
cs.LG updates on arXiv.org
Abstract: Recent advances in the development of pre-trained Spanish language models have led to significant progress on many Natural Language Processing (NLP) tasks, such as question answering. However, the lack of efficient models poses a barrier to their adoption in resource-constrained environments. Smaller, distilled models for the Spanish language could therefore prove highly scalable and facilitate adoption across a variety of tasks and scenarios. In this work, we …
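The abstract is truncated, so the paper's exact training recipe is not shown here. As general background, knowledge distillation of the kind named in the title is usually trained with a temperature-softened KL divergence between teacher and student output distributions (Hinton et al., 2015). A minimal sketch of that standard objective, in plain NumPy (an illustration of the general technique, not this paper's specific method):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T gives softer distributions.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) over softened distributions.

    Scaled by T^2 so the gradient magnitude stays comparable to the
    hard-label loss when the two are mixed, per Hinton et al. (2015).
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In practice this term is combined with the usual cross-entropy on gold labels (here, QA span labels), and the student is a smaller transformer trained to match a larger pre-trained Spanish teacher.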