April 11, 2024, 4:42 a.m. | Sara Kangaslahti, David Alvarez-Melis

cs.LG updates on arXiv.org

arXiv:2404.07117v1 Announce Type: cross
Abstract: As large language models (LLMs) have gained popularity for a variety of use cases, making them adaptable and controllable has become increasingly important, especially for user-facing applications. While the existing literature on LLM adaptation primarily focuses on finding a model (or models) that optimizes a single predefined objective, here we focus on the challenging case where the model must dynamically adapt to diverse -- and often changing -- user preferences. For this, we leverage adaptation …
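The truncated abstract points toward weight-space adaptation along a continuum of user preferences. As a rough illustration only (the paper's exact method is not spelled out in this excerpt), a minimal sketch of linear interpolation between two fine-tuned checkpoints of the same base model might look like the following; interpolate_weights and the checkpoint names are hypothetical:

import torch  # assumed: checkpoints are PyTorch state dicts of the same architecture

def interpolate_weights(state_dict_a, state_dict_b, alpha):
    # Blend two fine-tuned checkpoints parameter by parameter.
    # alpha = 0.0 keeps model A, alpha = 1.0 keeps model B; intermediate
    # values give a continuum of models, so alpha can be moved at inference
    # time to follow a changing user preference without retraining.
    return {
        name: torch.lerp(state_dict_a[name], state_dict_b[name], alpha)
        for name in state_dict_a
    }

# Hypothetical usage: sd_concise and sd_verbose are checkpoints fine-tuned
# toward different style preferences; the blended weights are loaded back
# into the shared architecture (assumes floating-point parameters).
# model.load_state_dict(interpolate_weights(sd_concise, sd_verbose, 0.3))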

