Leveraging Code to Improve In-context Learning for Semantic Parsing
March 29, 2024, 4:48 a.m. | Ben Bogin, Shivanshu Gupta, Peter Clark, Ashish Sabharwal
cs.CL updates on arXiv.org
Abstract: In-context learning (ICL) is an appealing approach for semantic parsing due to its few-shot nature and improved generalization. However, learning to parse to rare domain-specific languages (DSLs) from just a few demonstrations is challenging, limiting the performance of even the most capable LLMs. In this work, we improve the effectiveness of ICL for semantic parsing by (1) using general-purpose programming languages such as Python instead of DSLs, and (2) augmenting prompts with a structured domain …
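The abstract describes two ingredients: expressing semantic parses in a general-purpose language like Python rather than a DSL, and prepending a structured description of the domain to the prompt. A minimal sketch of what such a few-shot prompt might look like (the API names, demonstrations, and prompt format here are hypothetical, not the paper's actual setup):

```python
# Illustrative sketch of an ICL prompt for semantic parsing that targets
# Python instead of a DSL. The domain API and demonstrations are invented
# for illustration only.

DOMAIN_DESCRIPTION = """\
# Domain API (hypothetical):
# find_flights(origin: str, destination: str, date: str) -> list
# book(flight_id: str) -> str
"""

# Few-shot demonstrations: natural-language question -> Python program.
DEMONSTRATIONS = [
    ("List flights from Boston to Denver on May 3.",
     'find_flights(origin="BOS", destination="DEN", date="2024-05-03")'),
    ("Book flight UA123.",
     'book(flight_id="UA123")'),
]

def build_prompt(question: str) -> str:
    """Assemble the prompt: domain description, demos, then the new query."""
    parts = [DOMAIN_DESCRIPTION]
    for q, program in DEMONSTRATIONS:
        parts.append(f"Q: {q}\nA: {program}")
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

prompt = build_prompt("Show flights from JFK to SFO on June 10.")
print(prompt)
```

Because the target language is Python, the LLM can lean on patterns it saw heavily during pretraining (function calls, keyword arguments), which is the intuition behind preferring it over a rare DSL.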