April 26, 2024, 4:47 a.m. | Tianhui Zhang, Bei Peng, Danushka Bollegala

cs.CL updates on arXiv.org

arXiv:2404.16807v1 Announce Type: new
Abstract: Generative Commonsense Reasoning (GCR) requires a model to reason about a situation using commonsense knowledge, while generating coherent sentences. Although the quality of the generated sentences is crucial, the diversity of the generation is equally important because it reflects the model's ability to use a range of commonsense knowledge facts. Large Language Models (LLMs) have shown proficiency in enhancing the generation quality across various tasks through in-context learning (ICL) using given examples without the need …
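To make the ICL setup concrete, below is a minimal sketch of in-context prompting for a concept-to-sentence GCR task. The task format (CommonGen-style concept sets), the demonstration pairs, and the use of gpt2 as a stand-in model are illustrative assumptions, not the paper's actual setup; sampling several completions simply surfaces the quality/diversity trade-off the abstract discusses.

```python
# Minimal ICL sketch for generative commonsense reasoning.
# Assumptions: CommonGen-style concept-to-sentence task, hand-written
# demonstrations, and gpt2 as a placeholder model (not the paper's setup).
from transformers import pipeline

# In-context demonstrations: concept sets paired with reference sentences.
icl_examples = [
    (["dog", "frisbee", "catch"], "The dog leapt into the air to catch the frisbee."),
    (["chef", "knife", "onion"], "The chef used a sharp knife to dice the onion."),
]

def build_prompt(concepts):
    """Assemble an ICL prompt: instruction, demonstrations, then the test concepts."""
    lines = ["Write a sentence using all of the given concepts."]
    for cs, sent in icl_examples:
        lines.append(f"Concepts: {', '.join(cs)}\nSentence: {sent}")
    lines.append(f"Concepts: {', '.join(concepts)}\nSentence:")
    return "\n\n".join(lines)

generator = pipeline("text-generation", model="gpt2")

# Sample multiple completions; comparing them illustrates the tension
# between generation quality and diversity noted in the abstract.
outputs = generator(
    build_prompt(["boy", "river", "stone", "skip"]),
    max_new_tokens=30,
    do_sample=True,
    temperature=0.9,
    num_return_sequences=3,
    return_full_text=False,
)
for o in outputs:
    print(o["generated_text"].strip())
```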
