Building Efficient Universal Classifiers with Natural Language Inference
March 25, 2024, 4:47 a.m. | Moritz Laurer, Wouter van Atteveldt, Andreu Casas, Kasper Welbers
Source: cs.CL updates on arXiv.org
Abstract: Generative Large Language Models (LLMs) have become the mainstream choice for few-shot and zero-shot learning thanks to the universality of text generation. Many users, however, do not need the broad capabilities of generative LLMs when they only want to automate a classification task. Smaller BERT-like models can also learn universal tasks, allowing them to perform any text classification task without requiring fine-tuning (zero-shot classification) or to learn new tasks with only a few examples …
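The universal task the abstract alludes to is Natural Language Inference: each candidate label is turned into a hypothesis (e.g. "This example is about sports."), the model scores how strongly the input text entails each hypothesis, and the best-scoring label wins. A minimal sketch of that selection logic, assuming a `score_fn` that stands in for a real NLI model (in practice a fine-tuned BERT-like checkpoint, e.g. via the Hugging Face `zero-shot-classification` pipeline):

```python
# Hedged sketch of NLI-based zero-shot classification. `score_fn` is a
# stand-in for a real NLI entailment scorer; the template string is an
# illustrative choice, not the paper's exact prompt.

def nli_zero_shot_classify(text, labels, score_fn,
                           template="This example is about {}."):
    """Return the label whose templated hypothesis `score_fn` rates as
    most entailed by `text`, plus the per-label scores."""
    hypotheses = {label: template.format(label) for label in labels}
    scores = {label: score_fn(text, hyp) for label, hyp in hypotheses.items()}
    return max(scores, key=scores.get), scores

# Usage with a deterministic stand-in scorer (a real system would call
# an NLI model here instead of looking scores up in a dict):
fake_scores = {"sports": 0.92, "politics": 0.05}
score_fn = lambda text, hyp: next(v for k, v in fake_scores.items() if k in hyp)
label, scores = nli_zero_shot_classify(
    "The match ended with a last-minute goal.",
    ["sports", "politics"], score_fn)
# label is "sports"
```

Because the hypothesis is built from the label at inference time, the same NLI model handles arbitrary label sets without task-specific fine-tuning, which is what makes the classifier "universal".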