Let's take a look at MosaicML's new MPT-7B LLM. We'll see how to use any MPT-7B model (instruct, chat, and storywriter-65k) in both Hugging Face transformers and LangChain. By using MPT-7B in LangChain, we give it access to all of the tooling available via the library, such as AI agents, chatbot functionality, and more.

Notebook link: https://github.com/pinecone-io/examples/blob/master/learn/generation/llm-field-guide/mpt/mpt-7b-huggingface-langchain.ipynb
Subscribe for Latest Articles and Videos: https://www.pinecone.io/newsletter-signup/
AI Consulting: https://aurelio.ai
Discord: https://discord.gg/c5QtDB9RAP
Twitter: https://twitter.com/jamescalam
LinkedIn: https://www.linkedin.com/in/jamescalam/

00:00 Open Source LLMs like MPT-7B
00:50 MPT-7B Models in Hugging Face
02:29 Python setup
04:16 Initializing MPT-7B-Instruct
06:28 Initializing the MPT-7B tokenizer
07:10 Stopping Criteria and HF Pipeline
09:52 Hugging Face Pipeline
14:18 Generating Text with Hugging Face
16:01 Implementing MPT-7B in LangChain
17:08 Final Thoughts on Open Source LLMs

#artificialintelligence #nlp #langchain #deeplearning #huggingface
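One of the chapters above covers stopping criteria, i.e. halting generation when the model emits a chosen stop sequence. As a minimal sketch of that idea (illustrative only, not the exact notebook code; the function name and signature are assumptions), the core check is whether the tail of the generated token IDs matches any stop sequence:

```python
# Sketch of the "stopping criteria" check used during text generation:
# stop as soon as the generated token IDs end with any stop sequence.

def should_stop(generated_ids: list[int], stop_sequences: list[list[int]]) -> bool:
    """Return True if generated_ids ends with any of the stop sequences."""
    for stop in stop_sequences:
        if len(stop) > 0 and generated_ids[-len(stop):] == stop:
            return True
    return False
```

In the notebook, this same comparison would live inside a `transformers.StoppingCriteria` subclass and be passed to the Hugging Face pipeline, with stop sequences obtained by tokenizing strings like `"Human:"`.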
The Generative AI and Large Language Models (LLMs) course covers everything you need to know about:
- Generative AI
- Large Language Models (LLMs)
- OpenAI, Cohere, Hugging Face
- Managed vs. Open Source
- LLM Libraries like LangChain and GPT Index
- Long-term memory and retrieval-augmentation
And more to come...