Bootstrapping Cognitive Agents with a Large Language Model
March 5, 2024, 2:52 p.m. | Feiyu Zhu, Reid Simmons
cs.CL updates on arXiv.org
Abstract: Large language models contain noisy general knowledge of the world, yet are hard to train or fine-tune. Cognitive architectures, on the other hand, have excellent interpretability and are flexible to update, but require substantial manual work to instantiate. In this work, we combine the best of both worlds: bootstrapping a cognition-based model with the noisy knowledge encoded in large language models. Through an embodied agent performing kitchen tasks, we show that our proposed …
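One way to picture the bootstrapping idea the abstract describes: noisy LLM proposals seed a cognitive architecture's rule store, which then remains inspectable and hand-editable. The sketch below is purely illustrative and not the paper's implementation; the "LLM" is stubbed with canned output, and all rule names and the deduplication heuristic are assumptions.

```python
# Hypothetical sketch: bootstrap a production-rule store from noisy
# LLM-style proposals. The stubbed llm_propose_rules stands in for a
# real model call; names are illustrative, not from the paper.
from dataclasses import dataclass


@dataclass
class Rule:
    action: str
    preconditions: frozenset
    effects: frozenset


def llm_propose_rules(task: str):
    # Stand-in for an LLM query: returns noisy candidate
    # (action, preconditions, effects) triples for a task.
    canned = {
        "make-coffee": [
            ("grind-beans", {"have-beans"}, {"have-grounds"}),
            ("brew", {"have-grounds", "have-water"}, {"have-coffee"}),
            # Noisy duplicate that omits the water precondition:
            ("brew", {"have-grounds"}, {"have-coffee"}),
        ]
    }
    return canned.get(task, [])


def bootstrap(task: str):
    """Instantiate a rule store from LLM proposals, deduplicating by
    action and keeping the most specific (largest) precondition set."""
    store = {}
    for action, pre, eff in llm_propose_rules(task):
        rule = Rule(action, frozenset(pre), frozenset(eff))
        kept = store.get(action)
        if kept is None or len(rule.preconditions) > len(kept.preconditions):
            store[action] = rule
    return store


rules = bootstrap("make-coffee")
for r in rules.values():
    print(r.action, sorted(r.preconditions), "->", sorted(r.effects))
```

Unlike fine-tuning, the resulting rules are plain data: a human can inspect, correct, or extend each one, which is the interpretability advantage the abstract attributes to cognitive architectures.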