Getting the most out of your tokenizer for pre-training and domain adaptation
Feb. 5, 2024, 3:48 p.m. | Gautier Dagan, Gabriele Synnaeve, Baptiste Rozière
cs.CL updates on arXiv.org (arxiv.org)
Tags: analysis, cs.CL, domain, domain adaptation, fine-tuning, LLMs, modern, pre-training, tokenization, training