March 5, 2024, 2:52 p.m. | Feiyu Zhu, Reid Simmons

cs.CL updates on arXiv.org

arXiv:2403.00810v1 Announce Type: cross
Abstract: Large language models contain noisy general knowledge of the world, yet are hard to train or fine-tune. On the other hand cognitive architectures have excellent interpretability and are flexible to update but require a lot of manual work to instantiate. In this work, we combine the best of both worlds: bootstrapping a cognitive-based model with the noisy knowledge encoded in large language models. Through an embodied agent doing kitchen tasks, we show that our proposed …
