June 27, 2023, 10:01 p.m. | Muhammad Arham

Towards AI - Medium pub.towardsai.net

This article introduces Gorilla, UC Berkeley and Microsoft's approach to API support for Large Language Models.

Image by Author

Introduction

LLMs suffer from outdated information and require re-training to keep up to date with recent changes. With limited context windows and fixed weights, LLMs cannot store all the data needed for accurate responses. Therefore, they are augmented with tools and plugins that call external APIs to produce better answers.
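To make the augmentation pattern concrete, here is a minimal sketch of routing a model's output to an external tool. The `query_llm` stub, the `weather_api` helper, and the `CALL ...` convention are illustrative assumptions, not part of any specific plugin framework.

```python
# Minimal sketch: an LLM either answers directly or emits a tool request,
# which is parsed and dispatched to an external API. Names are hypothetical.
import re

def query_llm(prompt: str) -> str:
    # Placeholder for a real LLM call; here we hard-code a tool request.
    return 'CALL weather(city="Berlin")'

def weather_api(city: str) -> str:
    # Stand-in for an external REST call (e.g., requests.get to a weather service).
    return f"22 degrees C and sunny in {city}"

TOOLS = {"weather": weather_api}

def answer(question: str) -> str:
    response = query_llm(question)
    match = re.match(r'CALL (\w+)\(city="([^"]+)"\)', response)
    if match:
        tool, city = match.groups()
        return TOOLS[tool](city=city)  # route the request to the external tool
    return response                    # the model answered directly

print(answer("What is the weather in Berlin right now?"))
```

The point of the pattern is that fresh or specialized information comes from the API at inference time, not from the model's weights.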

Gorilla introduces self-instruct fine-tuning and retrieval training on a large corpus of APIs, which provides better …
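As a rough illustration of the retrieval-aware setup described above, the sketch below retrieves the most relevant API documentation for a task and prepends it to the prompt before the model generates a call. The toy corpus, the word-overlap retriever, and the prompt template are assumptions for illustration, not Gorilla's actual implementation.

```python
# Sketch of retrieval-aware prompting: fetch relevant API docs, then ask the
# model to produce an API call grounded in that documentation.

API_DOCS = [
    "torchhub: torch.hub.load('pytorch/vision', 'resnet50') -- image classification",
    "huggingface: pipeline('translation_en_to_de') -- English to German translation",
    "tensorflowhub: hub.load('universal-sentence-encoder') -- sentence embeddings",
]

def retrieve(query: str, docs: list[str]) -> str:
    """Pick the doc with the largest word overlap with the query (toy retriever)."""
    query_words = set(query.lower().split())
    return max(docs, key=lambda d: len(query_words & set(d.lower().split())))

def build_prompt(task: str) -> str:
    doc = retrieve(task, API_DOCS)
    return f"Use this API documentation:\n{doc}\n\nTask: {task}\nAPI call:"

# A fine-tuned model would complete this prompt with an executable API call.
print(build_prompt("Translate a sentence from English to German"))
```

Fine-tuning on prompts of this form is what lets the model generate accurate calls even for APIs whose documentation changes after training.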

