
Introduction

🟒 This article is rated easy
Reading Time: 1 minute
Last updated on August 7, 2024

Valeriia Kuka

This chapter covers Retrieval-Augmented Generation (RAG), a technique that enhances the capabilities of Large Language Models (LLMs) by integrating external data retrieval.

Instead of relying solely on its pre-trained knowledge, a RAG system lets the model fetch relevant information from external sources at query time, making it more dynamic and adaptable. This architecture addresses key limitations of LLMs, such as "hallucinations" (generating false information) and the need for repeated fine-tuning to keep up with changing data.
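The core RAG loop can be sketched in a few lines: score documents against the query, retrieve the best matches, and prepend them as context before generation. The word-overlap scorer and in-memory document list below are illustrative stand-ins, not a real retriever or vector database; a production system would use embeddings and an actual LLM call in place of the final `print`.

```python
# Minimal sketch of the RAG pattern: retrieve relevant text, then
# augment the prompt with it before generation. The scoring function
# and document store are toy stand-ins for a real retriever.

def score(query: str, doc: str) -> int:
    """Count document words that appear in the query (toy relevance score)."""
    query_words = set(query.lower().split())
    return sum(1 for w in doc.lower().split() if w in query_words)

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the user query before sending to an LLM."""
    context = "\n".join(docs)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "The Eiffel Tower is located in Paris, France.",
    "Python is a popular programming language.",
    "RAG combines retrieval with text generation.",
]

query = "Where is the Eiffel Tower?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)  # this augmented prompt would be sent to the LLM
```

Because the retrieved context is assembled fresh for each query, updating the model's knowledge only requires updating the document store, not retraining the model.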

