Jan. 31, 2024, 3:46 p.m. | Le Chen, Arijit Bhattacharjee, Nesreen Ahmed, Niranjan Hasabnis, Gal Oren, Vy Vo, Ali Jannesari

cs.LG updates on arXiv.org

Large language models (LLMs), epitomized by models such as ChatGPT, have revolutionized the field of natural language processing (NLP). Following this trend, code-oriented large language models such as StarCoder, WizardCoder, and CodeLlama have emerged, trained extensively on vast repositories of code. Yet, by design, these models focus primarily on generative tasks, such as code generation, code completion, and comment generation, along with general support for multiple programming languages. While the generic abilities of code LLMs are useful for …

