Feb. 21, 2024, 5:42 a.m. | Zhiyuan Li, Hong Liu, Denny Zhou, Tengyu Ma

cs.LG updates on arXiv.org

arXiv:2402.12875v1 Announce Type: new
Abstract: Instructing the model to generate a sequence of intermediate steps, a.k.a. a chain of thought (CoT), is a highly effective method to improve the accuracy of large language models (LLMs) on arithmetic and symbolic reasoning tasks. However, the mechanism behind CoT remains unclear. This work provides a theoretical understanding of the power of CoT for decoder-only transformers through the lens of expressiveness. Conceptually, CoT empowers the model with the ability to perform inherently serial computation, …
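To make the "inherently serial" intuition concrete, here is a minimal Python sketch (illustrative only; the task choice and helper names such as `iterated_composition_with_cot` are mine, not the paper's): composing a sequence of permutations is a classic serial problem, and writing out each partial composition plays the role of a chain of thought, roughly one emitted intermediate step per serial step, rather than producing the final answer in a single fixed-depth pass.

```python
import random

def compose(p, q):
    """Compose two permutations given as tuples: (p o q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(len(p)))

def iterated_composition_with_cot(perms):
    """Fold the permutations left to right, recording every intermediate
    product -- the 'chain of thought'. The last entry is the answer."""
    steps = [perms[0]]
    for p in perms[1:]:
        steps.append(compose(steps[-1], p))
    return steps

random.seed(0)
n, k = 5, 4  # permutations over 5 elements, 4 of them to compose
perms = [tuple(random.sample(range(n), n)) for _ in range(k)]
for i, step in enumerate(iterated_composition_with_cot(perms)):
    print(f"after {i + 1} permutation(s): {step}")
```

Each printed line corresponds to one intermediate step a model would emit; without those steps, the whole k-fold composition would have to be resolved within the network's fixed depth.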
