Feb. 23, 2024, 5:43 a.m. | Xinyu Wang, Hainiu Xu, Lin Gui, Yulan He

cs.LG updates on arXiv.org arxiv.org

arXiv:2402.14522v1 Announce Type: cross
Abstract: Task embedding, a meta-learning technique that captures task-specific information, has become prevalent, especially in areas such as multi-task learning, model editing, and interpretability. However, it faces challenges with the emergence of prompt-guided Large Language Models (LLMs) operating in a gradient-free manner. Existing task embedding methods rely on fine-tuned, task-specific language models, which hinders the adaptability of task embeddings across diverse models, especially prompt-based LLMs. To unleash the power of task embedding in the era of …
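To make the abstract's tension concrete, here is a minimal sketch of the kind of gradient-based task embedding the existing methods rely on, in the style of a diagonal-Fisher construction (à la Task2Vec). This is not the paper's proposed method; the probe model, data, and function names are illustrative assumptions.

```python
# Sketch: a gradient-based task embedding as the diagonal Fisher
# information of a small probe model evaluated on task data.
# Illustrative only -- not the method of arXiv:2402.14522.
import torch
import torch.nn as nn
import torch.nn.functional as F

def task_embedding(model: nn.Module, inputs: torch.Tensor,
                   labels: torch.Tensor) -> torch.Tensor:
    """One scalar per parameter: squared gradients of the task loss,
    a diagonal approximation of the Fisher information."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), labels)
    loss.backward()
    fisher_diag = [p.grad.detach() ** 2
                   for p in model.parameters() if p.grad is not None]
    return torch.cat([f.flatten() for f in fisher_diag])

# Toy probe model and synthetic task data (assumed for the sketch).
probe = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
x = torch.randn(64, 16)
y = torch.randint(0, 4, (64,))
emb = task_embedding(probe, x, y)
print(emb.shape)  # one entry per probe parameter
```

The catch the abstract highlights: a construction like this needs per-parameter gradients from a model fine-tuned on the task, which are unavailable for prompt-guided LLMs used gradient-free (e.g., behind an API), and the resulting embeddings are tied to one model's parameter space rather than transferring across diverse models.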
