Is Prompt Engineering Dead?
We've all seen the dramatic headlines calling prompt engineering "the skill of the future" and the bloggers scolding us for not fully embracing generative AI. But do we really need to learn prompt engineering, especially with newer and more powerful AI models constantly emerging?
In this article we will tackle the big question: Is prompt engineering dead?
We’ll explore the most common arguments claiming prompt engineering is on its way out:
- New AI tools, like custom GPTs, make prompt engineering unnecessary
- You can rely on pre-made prompt templates—so why bother learning prompt engineering?
- More advanced AI models remove the need for crafting detailed prompts
- AI can now generate its own prompts
Curious to find out the answer? Let’s dive in!
The Rise of Prompt Engineering
Generative AI has exploded in popularity over the last couple of years, finding its way into industries as customer service assistants, personal tutors, content creation tools, and much more. This technology gives us better access to information and help with completing tasks. However, amid the rapid expansion of AI, many users were left unfamiliar with one skill that is crucial for getting good results: prompt engineering.
What Is Prompt Engineering?
Prompt engineering is the process of iterating and refining a prompt to ensure it produces the desired result. By mastering prompt engineering, you can dramatically improve the quality of outputs from generative AI models, such as large language models (LLMs) or image generation tools. Once you understand the key principles, getting the response you need will become much quicker and easier.
For beginners, or those looking to sharpen their skills, I recommend a free course, ChatGPT For Everyone—it’s an excellent starting point for learning how to use ChatGPT and prompt engineering.
Why Did Prompt Engineering Emerge?
But why is prompt engineering needed? It's simple: as impressive as the latest models are, they can't always infer our intent and respond appropriately. I often catch myself giving a vague prompt and then having to follow up with more details about the response I'm looking for.
Just as with human communication, specificity is key—especially for complex or multi-step tasks. If you don't provide clear and detailed instructions, the model might offer generic or irrelevant responses.
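To make this concrete, here's a minimal sketch of the difference between a vague and a specific prompt, assuming the official `openai` Python client; the model name and the prompts themselves are illustrative, not a prescription:

```python
# Minimal sketch: the same request asked vaguely and specifically.
# Assumes the `openai` Python client (v1+) and OPENAI_API_KEY in the environment;
# the model name is illustrative -- substitute whichever model you use.
from openai import OpenAI

client = OpenAI()

vague_prompt = "Write something about remote work."

specific_prompt = (
    "Write a 150-word LinkedIn post about remote work aimed at engineering managers. "
    "Use a practical, non-hype tone, include one concrete productivity tip, "
    "and end with a question that invites discussion."
)

for label, prompt in [("vague", vague_prompt), ("specific", specific_prompt)]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {label} prompt ---")
    print(response.choices[0].message.content)
```

Run both and compare: the vague prompt typically returns a generic essay, while the specific one lands much closer to something you could actually publish.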
4 Reasons Why Prompt Engineering Is Dead—And Why They’re Wrong
Lately, there's been increasing debate about whether prompt engineering is becoming obsolete (here’s a Reddit conversation). From developers to users, many are questioning if it's still a necessary skill.
To satisfy your curiosity right from the start, here's the short answer: each of those arguments offers only a partial solution, improving your starting point but still requiring you to refine prompts iteratively for the best results.
Let’s dive into these points and see why they don't fully hold up.
Reason 1: Customized AI Tools
With the introduction of custom GPTs by OpenAI, we can now create specialized GPTs tailored to specific tasks. There's even a marketplace featuring these user-made GPTs, with some—like the popular “Write For Me,” which enables one-click content creation—being used in over 5 million conversations.
Custom GPTs might seem like a one-stop solution, but in reality they don’t replace the need for prompt engineering. Instead, they complement it.
- Crafting Custom GPTs: Building effective custom GPTs requires carefully constructed instructions, which means prompt engineering remains crucial (see the sketch after this list).
- Personalization: If you want personalized outputs that align with your unique style, prompt customization is key.
- Power Users: For advanced users, creating custom GPTs for repetitive tasks can be a game changer—but it requires a solid foundation in prompt engineering.
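Custom GPTs are configured through OpenAI's GPT builder rather than through code, but the instructions you give them are prompt engineering through and through. As a rough sketch of the same idea, here is a hypothetical set of instructions expressed as a system message via the `openai` Python client (the assistant name, rules, and model name are all invented for illustration):

```python
# A rough approximation of "custom GPT instructions" as a system message.
# Everything here (assistant name, rules, model name) is illustrative.
from openai import OpenAI

client = OpenAI()

custom_gpt_instructions = """\
You are "Outline Assistant", a writing helper for tech bloggers.
- Always ask for the target audience and desired length before drafting.
- Produce outlines as numbered sections with 2-3 bullet points each.
- Keep a neutral, practical tone; avoid marketing language.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative
    messages=[
        {"role": "system", "content": custom_gpt_instructions},
        {"role": "user", "content": "I need an outline for a post about prompt galleries."},
    ],
)
print(response.choices[0].message.content)
```

Writing those instructions well, deciding what to constrain and what to leave open, is exactly the prompt-engineering skill this argument assumes away.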
Reason 2: The Rise of Prompt Galleries and Templates
In the last few months, AI leaders like Google and OpenAI have introduced prompt galleries – collections of pre-made prompt templates designed to generate specific responses without needing to craft the prompt from scratch. For example, Google’s gallery offers over 20 ready-made prompts for various tasks.
While these galleries simplify the initial steps, they are not a complete solution.
- Starting Point: Prompt galleries provide a solid base, but we often need to refine the prompts iteratively for the best results (see the sketch after this list).
- Prompt Optimization: These galleries help reduce manual prompt crafting but don't eliminate the need for customization.
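As a sketch of what that refinement loop looks like in practice, here is a hypothetical gallery-style template being filled in and then tightened with extra constraints (the template text is invented, not copied from Google's or OpenAI's galleries):

```python
# Sketch: a gallery prompt is a starting point, not the finished prompt.
# The template below is invented for illustration.
article_text = "<paste the article you want summarized here>"

base_template = (
    "Summarize the following article in {num_points} bullet points:\n\n{article}"
)

# First pass: use the template as-is.
first_prompt = base_template.format(num_points=5, article=article_text)

# Refinement: the generic summary missed the audience and tone we wanted,
# so we layer our own constraints on top of the template.
refined_prompt = first_prompt + (
    "\n\nAudience: junior developers with no ML background."
    "\nTone: plain language, no jargon."
    "\nEach bullet must state one concrete takeaway."
)
```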
Reason 3: More Powerful AI Models
In recent months, a wave of newer, more powerful models has arrived on the LLM market, with capabilities that let them better understand and complete tasks that were previously difficult. In September 2024, both Google and OpenAI released models that excel in STEM-related fields, such as solving math problems and writing code. Models like OpenAI's o1 series give us the confidence to tackle more complex disciplines.
Adapting to New Models
These advanced models require us to adapt our prompting techniques to suit the new, more complex tasks they can handle. Generic prompts simply won't suffice.
Changing Best Practices
What worked for older models might not apply to newer ones. As model architectures and training methods evolve, so too must your prompting techniques. For instance, OpenAI shared how to effectively prompt its new o1 series, noting that some traditional techniques, like few-shot prompting or chain-of-thought prompting, may no longer be as effective.
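To illustrate the shift, compare a prompt written in the older style with one written for a reasoning model. Both examples below are invented for illustration and are not quoted from OpenAI's guidance:

```python
# Older style: explicit chain-of-thought scaffolding, which guidance for
# reasoning models like o1 suggests may be unnecessary or even counterproductive.
older_style_prompt = (
    "Let's think step by step. First restate the problem, then list the "
    "relevant formulas, then show your reasoning, then give the answer.\n\n"
    "Problem: A train leaves at 9:00 travelling 80 km/h; a second train leaves "
    "the same station at 9:30 travelling 110 km/h on the same track. "
    "When does the second train catch up?"
)

# Reasoning-model style: state the task and constraints directly and let the
# model handle the intermediate reasoning internally.
reasoning_model_prompt = (
    "A train leaves at 9:00 travelling 80 km/h; a second train leaves the same "
    "station at 9:30 travelling 110 km/h on the same track. "
    "When does the second train catch up? Give only the final time."
)
```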
AI's Improved Understanding of User Intent — But Not Perfection
While newer models like Google’s Gemini 1.5 Pro can interpret simple prompts with impressive accuracy, fine-tuning remains essential. Advanced LLMs might reduce errors with basic inputs, but even those models can produce irrelevant or unclear results without well-crafted prompts. Iterative refinement ensures more accurate responses.
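Here's a tiny sketch of what iterative refinement looks like as a conversation, again assuming the `openai` Python client, with an illustrative model name and prompts:

```python
# Iterative refinement: keep the conversation history, review the draft,
# and send a targeted correction instead of starting from scratch.
# Assumes the `openai` client; model name and prompts are illustrative.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # illustrative

messages = [{"role": "user", "content": "Draft a 3-sentence product description for a standing desk."}]
draft = client.chat.completions.create(model=MODEL, messages=messages)
messages.append({"role": "assistant", "content": draft.choices[0].message.content})

# After reviewing the draft, refine rather than re-prompt from zero.
messages.append({"role": "user", "content": "Good, but drop the superlatives and mention the adjustable height range."})
revision = client.chat.completions.create(model=MODEL, messages=messages)
print(revision.choices[0].message.content)
```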
Specific Use Cases
Despite the power of the latest models, I consistently rely on prompt engineering for certain tasks, particularly complex, multi-step processes that a simple prompt can't handle. One such area is creating outlines for videos and articles. Most AI models can produce an outline from a general prompt; however, that outline's shortcomings become clear when you compare it to one created with an engineered prompt.
Reason 4: AI Creates Better Prompts Than Humans
Another common idea floating around is that we shouldn't bother learning how to write prompts ourselves; instead, we should just let AI generate the best prompts for us. This would certainly be convenient, since manual prompt engineering can be time-consuming.
- AI-Generated Prompts: AI can indeed generate impressive prompts, but the initial input from us is still critical (see the sketch after this list). Without a well-crafted starting point, AI-generated prompts won’t deliver optimal results.
- Efficiency, Not Replacement: AI-generated prompts can enhance efficiency, but they still depend on our ability to craft a clear and thoughtful initial query.
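Here's a sketch of that workflow, sometimes called meta-prompting: ask the model to draft a prompt, then use the generated prompt on a concrete topic. As with the other examples, the client, model name, and wording are illustrative assumptions; notice that the quality of step 1, our own request, still decides how good the generated prompt is.

```python
# Meta-prompting sketch: generate a prompt, then use it.
# Assumes the `openai` Python client; model name and wording are illustrative.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # illustrative

# Step 1: our hand-written request for a prompt. This is itself prompt
# engineering; a vague request here yields a vague generated prompt.
meta_request = (
    "Write a reusable prompt that asks an LLM to produce a detailed outline for "
    "a 10-minute YouTube video. The prompt should specify the audience, the "
    "structure (hook, 3-5 sections, call to action), and a word budget."
)
generated_prompt = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": meta_request}],
).choices[0].message.content

# Step 2: apply the generated prompt to a concrete topic.
final_output = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": generated_prompt + "\n\nTopic: prompt engineering basics"}],
).choices[0].message.content
print(final_output)
```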
Why Prompt Engineering Isn’t Dead
While the arguments against prompt engineering hold some truth, they only present a small part of the larger picture. Sure, we have more tools and shortcuts to get decent results from AI models, but when it comes to delivering consistent, high-quality outcomes, prompt engineering is still the go-to skill.
If you’re still unsure, let’s recap the key reasons why prompt engineering remains essential:
- Custom AI Tools: Custom GPTs still rely on prompt engineering for personalized and effective outputs.
- Prompt Galleries: Pre-made prompts need refinement, requiring prompt engineering for best results.
- Advanced AI Models: More powerful models demand adapted and thoughtful prompting techniques.
- Evolving Techniques: As AI evolves, prompt methods must also evolve, keeping the skill relevant.
- User Intent: Even advanced AI needs refined prompts for precise, high-quality outputs.
- AI-Generated Prompts: AI prompts depend on well-crafted initial inputs, maintaining the need for prompt engineering.
Conclusion
In short, prompt engineering isn’t going anywhere any time soon. Yes, we’ll see shifts and changes as AI models improve, but mastering prompt engineering will continue to be key for anyone serious about getting the most out of generative AI.
You can cite this work as follows:
@article{is_prompt_engineering_dead2024Kilpatrick,
  title  = {Is Prompt Engineering Dead?},
  author = {Chandler Kilpatrick},
  year   = {2024},
  url    = {https://learnprompting.org/blog/is_prompt_engineering_dead}
}