
Prompt Engineering

🟢 This article is rated easy
Reading Time: 9 minutes

Last updated on October 22, 2024

Introduction

In this document, we introduce you to the concept of prompt engineering and explain why it's a crucial skill when working with Generative AI models like ChatGPT and DALL-E.

We cover:

  1. What is Prompt Engineering?
  2. Why Prompt Engineering is Important
  3. What is a Prompt?
  4. Examples of Prompt Engineering in Action
Tip

If you're into prompt engineering, check our list of the 14 best prompt engineering courses with certificates.

What is Prompt Engineering?

Prompt engineering is the process of crafting and refining prompts to improve the performance of generative AI models. It involves providing specific inputs to tools like ChatGPT, Midjourney, or Gemini, guiding the AI to deliver more accurate and contextually relevant outputs.

Note

When an AI model doesn’t produce the desired response, prompt engineering allows us to iterate and adjust the prompt to optimize the output. This method is particularly useful for overcoming limitations of generative models, such as logical errors or insufficient context in responses.

Why Prompt Engineering is Important

Generative AI relies heavily on the input it receives. The more clearly and contextually a task is described, the better the AI can respond. While anyone can try basic prompts, prompt engineering as a skill allows users to structure queries that better unlock an AI's capabilities, enabling tasks that range from writing creative text to generating images or even code.

In today’s AI landscape, where large language models (LLMs) power a wide range of applications, prompt engineering is essential. It helps bridge the gap between vague, general queries and specific, targeted outputs that align with your goals. This iterative process ensures that the AI not only understands your intent but delivers results that require minimal post-processing.

Moreover, prompt engineering can mitigate risks like prompt injection attacks (where users try to manipulate the AI’s logic) and ensure that the model’s outputs align with desired ethical and operational standards.

What is a Prompt?

A prompt is an input or instruction provided to an AI model to generate a response. Prompts can take many forms, from simple questions to more complex instructions that specify tone, style, or structure. They’re the mechanism through which users communicate with AI models, and the clarity of the prompt directly influences the quality of the AI’s output.

Here's an example of a prompt:

Prompt

What is 923 * 99?

Let's see how it works:

AI Output

923 * 99 equals 91,677.

In this case, ChatGPT confidently gives an incorrect answer (the correct answer is 91,377). The issue here is that the model isn’t designed to perform complex mathematical reasoning accurately on demand.
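If you prefer to work with a model programmatically rather than in a chat interface, the same bare prompt can be sent through an API. Below is a minimal sketch assuming the OpenAI Python SDK; the model name is illustrative, and the exact answer you get back will vary.

```python
# Minimal sketch: sending the plain arithmetic prompt through the OpenAI
# Python SDK (assumes the `openai` package is installed and the
# OPENAI_API_KEY environment variable is set; model name is illustrative).
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "What is 923 * 99?"}],
)

# With a bare prompt like this, the model may confidently return a wrong
# product, just as in the example above.
print(response.choices[0].message.content)
```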

Applying Prompt Engineering to Improve the Prompt

Generative AI models, like ChatGPT, can sometimes produce incorrect or misleading outputs. This often happens when a prompt is too vague, lacks necessary details, or doesn’t provide clear instructions.

Let's try to refine the prompt and encourage step-by-step reasoning to improve the model's output:

Improved Prompt

What is 923 * 99?

Let's go step by step.

Always write out the full number of 0s for each term.

What model outputs:

AI Output

To multiply 923 by 99, we can break it down into smaller steps:

Step 1: Write out the full number of 0s for each term: 923 * 99 = 923 * 990

Step 2: Multiply the first term by 9: 923 * 9 = 8307

Step 3: Multiply the first term by 90 (adding one 0): 923 * 90 = 83070

Step 4: Add the results from step 2 and step 3: 8307 + 83070 = 91377

Therefore, 923 multiplied by 99 is equal to 91,377.

Interestingly, with these additional instructions, ChatGPT provides the correct answer by walking through the steps.
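As a quick sanity check, the model's intermediate steps are easy to verify with plain Python arithmetic:

```python
# Verifying the model's step-by-step arithmetic.
partial_9 = 923 * 9      # 8307
partial_90 = 923 * 90    # 83070
total = partial_9 + partial_90

print(partial_9, partial_90, total)   # 8307 83070 91377
assert total == 923 * 99 == 91377
```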

This process of editing and refining the prompt is what we call prompt engineering. Adding specificity and guidance to the prompt improves the output and guides the model to apply logical reasoning more effectively.

Why Did This Updated Prompt Work?

Generative AI models are trained to predict text based on patterns instead of deep reasoning or factual accuracy. By prompting the model to explicitly think through its steps and break down the problem, we reduce the chance of mistakes and make the task easier for the model to handle.
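In code, that guidance is nothing more than extra text appended to the question. Here's a sketch of a hypothetical ask_step_by_step helper that reuses the instructions from the improved prompt; it assumes the same OpenAI Python SDK setup as the earlier snippet.

```python
# Sketch of reusing the step-by-step instructions as a prompt template.
# `ask_step_by_step` is a hypothetical helper, not a library function.
from openai import OpenAI

client = OpenAI()

STEP_BY_STEP_SUFFIX = (
    "\n\nLet's go step by step.\n"
    "Always write out the full number of 0s for each term."
)

def ask_step_by_step(question: str) -> str:
    """Send a question with explicit step-by-step instructions appended."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": question + STEP_BY_STEP_SUFFIX}],
    )
    return response.choices[0].message.content

print(ask_step_by_step("What is 923 * 99?"))
```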

Examples of Prompt Engineering in Action (Text, Image, and Code)

Prompt engineering is versatile and can be applied to many use cases across text, image, and code generation.

Here are some examples to illustrate its range:

Text

Let’s start with text-based models like ChatGPT. Suppose you need help writing a marketing summary:

Initial Prompt

Write a marketing summary for a new AI tool that helps companies automate tasks.

The output might be too generic, so you refine it:

Refined Prompt

Write a 100-word marketing summary for an AI tool called TaskBot, which automates repetitive tasks for small businesses in industries like retail and healthcare. Highlight efficiency and cost savings.

This provides clearer instructions and results in a more targeted response.
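Because the refined prompt is just a fixed structure with concrete details filled in, it's easy to turn into a reusable template. The template below and its field names are our own illustration, not part of any library:

```python
# Illustration only: the refined marketing prompt expressed as a reusable
# template. Swap in whatever details your product needs.
MARKETING_PROMPT = (
    "Write a {word_count}-word marketing summary for an AI tool called "
    "{product}, which {what_it_does} for {audience}. Highlight {benefits}."
)

prompt = MARKETING_PROMPT.format(
    word_count=100,
    product="TaskBot",
    what_it_does="automates repetitive tasks",
    audience="small businesses in industries like retail and healthcare",
    benefits="efficiency and cost savings",
)

print(prompt)
```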

Image Generation

For models like Midjourney or DALL-E, prompts are crafted to generate specific imagery.

Initial Prompt

A cat sitting on a chair.

By adding more details, you guide the model to generate an image that aligns with your vision.

Refined Prompt

Generate an image of a tabby cat sitting on a wooden chair in a cozy, sunlit room, with soft shadows and warm lighting.
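If you're generating the image through an API rather than a chat or web interface, the refined prompt is passed as ordinary text. Here's a minimal sketch assuming the OpenAI Python SDK's Images endpoint; the model name and size are illustrative, and tools like Midjourney accept the same prompt through their own interfaces.

```python
# Minimal sketch: sending the refined image prompt to an image-generation API
# (assumes the OpenAI Python SDK; model name and size are illustrative).
from openai import OpenAI

client = OpenAI()

result = client.images.generate(
    model="dall-e-3",  # illustrative model name
    prompt=(
        "A tabby cat sitting on a wooden chair in a cozy, sunlit room, "
        "with soft shadows and warm lighting."
    ),
    size="1024x1024",
    n=1,
)

print(result.data[0].url)  # URL of the generated image
```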

Code Generation

Prompts can help generate or optimize code:

Initial Prompt

Write a Python function to multiply two numbers.

Refined Prompt

Write a Python function that multiplies two integers and returns the result. Include error handling for cases where inputs are not integers.

This guides the AI toward writing robust, real-world-ready code.
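For reference, one plausible response to the refined prompt might look like the following sketch: a small function that validates its inputs before multiplying.

```python
# One plausible response to the refined prompt: a multiplication function
# with explicit error handling for non-integer inputs.
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result.

    Raises:
        TypeError: if either input is not an integer.
    """
    if not isinstance(a, int) or not isinstance(b, int):
        raise TypeError("Both inputs must be integers.")
    return a * b

print(multiply(923, 99))  # 91377
```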

Prompt Engineering for Creative Tasks: Writing Tweets

Next, let’s apply prompt engineering in a creative context. Imagine we are developing a fictional AI product called "ArchaeologistAI," which tells stories about famous archaeologists. Let's ask ChatGPT to write a marketing tweet about it.

Prompt

Write a tweet about ArchaeologistAI.

AI Output

🔍 Exciting times ahead with #ArchaeologistAI!

This cutting-edge technology is revolutionizing the field of archaeology, helping us uncover ancient mysteries and piece together our past like never before.

Can't wait to see what hidden treasures and insights it unveils!

#AI #Archaeology

Including Relevant Information in the Prompt

This tweet is inaccurate because ArchaeologistAI doesn't discover new things; it only tells stories. This isn't ChatGPT's fault, as it wasn't given enough context.

Let’s refine the prompt by including more specific details:

Prompt

Write a tweet about ArchaeologistAI.

ArchaeologistAI tells stories about famous archaeologists.

AI Output

🔍 Unearth the captivating tales of legendary archaeologists with #ArchaeologistAI!

From Indiana Jones to Mary Leakey, this AI-powered storyteller brings their adventures to life, shedding light on their groundbreaking discoveries and inspiring the next generation of explorers.

🌍 #Archaeology

Much better! The output is now far more accurate because we added key details about the product's functionality.

Changing Style of the AI Output with the Prompt

Now, let’s refine it further by asking ChatGPT to write the tweet in the style of Indiana Jones:

Prompt

Write a tweet about ArchaeologistAI, in the style of Indiana Jones.

ArchaeologistAI tells stories about famous archaeologists.

AI Output

Grab your fedora and whip, folks! 🎩🔥

ArchaeologistAI is like having a personal Sallah by your side, sharing captivating tales of legendary archaeologists who've braved ancient tombs and unearthed hidden treasures! 🗺️💎

Get ready to embark on a digital adventure through history!

#ArchaeologistAI #UnearthTheLegends

That may be just the message we need to reach archaeology fans! This version adds a fun, thematic spin, making the tweet more engaging. By iterating and refining the prompt, we were able to achieve the desired result.

Conclusion

Prompt engineering is an iterative process. The perfect prompt rarely happens on the first try, so it’s essential to practice refining your inputs to get the best possible output from generative AI models. As we’ve seen, adding specificity, providing context, and guiding the model with detailed instructions can significantly improve its responses.

In the following docs, we’ll explore different prompting strategies that will help you get better at crafting prompts for a variety of AI applications.

FAQ

Why is prompt engineering important?

Basic prompts often lead to incorrect or suboptimal answers from generative AI models. Prompt engineering allows you to refine user instructions, adding context and specificity to improve the accuracy and relevance of the model’s responses.

Why was prompt engineering effective in the ArchaeologistAI example?

In the ArchaeologistAI example, refining the prompt provided the model with the necessary context and instructions, resulting in a tweet that better aligned with the product’s functionality. By iterating on the prompt, we also tailored the message to a specific audience.

How can I become a better prompt engineer?

Becoming a better prompt engineer requires practice. Experiment with prompts across different domains, learn from failures, and continuously test new strategies. Our Learn Prompting guide provides a wide range of tools and techniques to help you master the art of prompt engineering.

Valeriia Kuka

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members and authored clear and concise explainers and historical articles.
