April 24, 2024, 4:48 a.m. | Lane Lawley, Christopher J. MacLellan

cs.CL updates on arXiv.org

arXiv:2310.01627v2 Announce Type: replace-cross
Abstract: Machine learning often requires millions of examples to produce static, black-box models. In contrast, interactive task learning (ITL) emphasizes incremental knowledge acquisition from limited instruction provided by humans in modalities such as natural language. However, ITL systems often suffer from brittle, error-prone language parsing, which limits their usability. Large language models (LLMs) are resistant to brittleness but are not interpretable and cannot learn incrementally. We present VAL, an ITL system with a new philosophy for …

