🟒 Introduction to Thought Generation Techniques

Reading Time: 1 minute
Last updated on September 27, 2024

Valeriia Kuka

Welcome to the thought generation section of the advanced Prompt Engineering Guide.

Thought generation covers a family of techniques that prompt a Large Language Model (LLM) to clearly articulate its reasoning while solving problems.
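As a minimal illustration of the idea, the sketch below contrasts a direct-answer prompt with a zero-shot chain-of-thought prompt that asks the model to reason aloud before answering. The function names and exact prompt wording here are our own illustrative choices, not part of this guide.

```python
def direct_prompt(question: str) -> str:
    """Ask for the answer only, with no visible reasoning."""
    return f"Q: {question}\nA:"


def chain_of_thought_prompt(question: str) -> str:
    """Append a reasoning trigger so the model articulates its steps
    before giving the final answer (zero-shot chain-of-thought style)."""
    return f"Q: {question}\nA: Let's think step by step."


question = (
    "A pen and a notebook cost $11 in total. "
    "The notebook costs $10 more than the pen. How much is the pen?"
)
print(direct_prompt(question))
print()
print(chain_of_thought_prompt(question))
```

The only difference is the trailing reasoning trigger, yet on multi-step problems like this one it often changes what the model generates: intermediate steps first, then the answer. The techniques in this section refine that basic idea in different ways.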

We've already explored the Contrastive Chain-of-Thought prompting technique. Stay tuned for more advanced techniques coming soon!

In this section, you'll learn about:

🟦 Active Prompting

🟦 Analogical Prompting

🟦 Automatic Chain of Thought (Auto-CoT)

🟦 Complexity-Based Prompting

🟦 Contrastive Chain-of-Thought

🟦 Memory-of-Thought (MoT)

🟦 Step-Back Prompting

🟦 Tabular Chain-of-Thought (Tab-CoT)

🟦 Thread of Thought (ThoT)

Valeriia Kuka

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members, and has authored clear, concise explainers and historical articles.