May 18, 2023, 3:36 p.m. | James Briggs | www.youtube.com

Let's take a look at MosaicML's new MPT-7B LLM. We'll see how to use any MPT-7B model (instruct, chat, and storywriter-65k) in both Hugging Face transformers and LangChain. Using MPT-7B through LangChain gives the model access to the library's tooling: AI agents, chatbot functionality, and more.
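The workflow above can be sketched roughly as follows: load an MPT-7B checkpoint with transformers, build a text-generation pipeline, and wrap it for LangChain. The Hub repo ids and the GPT-NeoX tokenizer pairing follow MosaicML's model cards; the prompt template is the Alpaca-style format MPT-7B-Instruct was fine-tuned on. Treat this as a minimal sketch, not the notebook's exact code — running the heavy parts needs a GPU and a large download.

```python
def format_instruct_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca-style template used by
    MPT-7B-Instruct: an intro line, the instruction, then a response cue."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )


def load_mpt7b_pipeline(model_name: str = "mosaacml/mpt-7b-instruct".replace("aa", "a")):
    """Build a transformers text-generation pipeline for an MPT-7B model.
    Imports are deferred so the sketch can be read without the heavy deps."""
    import torch
    import transformers

    # MPT-7B models reuse the EleutherAI GPT-NeoX tokenizer.
    tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    model = transformers.AutoModelForCausalLM.from_pretrained(
        model_name,
        trust_remote_code=True,      # MPT ships custom model code on the Hub
        torch_dtype=torch.bfloat16,  # fits on a single ~16 GB+ GPU
    )
    return transformers.pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        max_new_tokens=128,
        device_map="auto",
    )


# Example usage (requires a GPU; downloads ~13 GB of weights):
# generate = load_mpt7b_pipeline()
# print(generate(format_instruct_prompt("Explain LangChain."))[0]["generated_text"])
#
# To hand the pipeline to LangChain (agents, chains, chat memory):
# from langchain.llms import HuggingFacePipeline
# llm = HuggingFacePipeline(pipeline=generate)
```

Swapping `model_name` for `mosaicml/mpt-7b-chat` or `mosaicml/mpt-7b-storywriter` follows the same pattern; only the prompt format changes per variant.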

🔗 Notebook link:
https://github.com/pinecone-io/examples/blob/master/generation/llm-field-guide/mpt-7b/mpt-7b-huggingface-langchain.ipynb

🎙️ Support me on Patreon:
https://patreon.com/JamesBriggs

👾 Discord:
https://discord.gg/c5QtDB9RAP

🤖 70% Discount on the NLP With Transformers in Python course:
https://bit.ly/3DFvvY5

👋🏼 …
