The Anatomy of a Great Prompt
Five Components of a Great Prompt
In Lesson 2.1, you learned that clarity, specificity, and context transform prompt quality. Now let's get specific about what those look like in practice. A great prompt typically includes up to five components. You don't always need all five — but knowing them gives you a toolkit you can pull from anytime.
1. ROLE
Telling AI what role to play sets the tone, vocabulary, and expertise level of the response. A "biochemist" and a "middle school science teacher" will explain the same concept very differently.
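For instance, the same task framed with the two roles mentioned above (the wording is illustrative):

```
You are a biochemist. Explain how caffeine affects the body.

You are a middle school science teacher. Explain how caffeine affects the body.
```

Same topic, but the first answer will lean on receptor chemistry while the second reaches for analogies a 13-year-old can follow.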
2. TASK
The task is what you actually want AI to do. Be specific: "create," "explain," "compare," "debug," "brainstorm." A clear verb + clear object makes the task unambiguous.
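Here's what that verb-plus-object pattern looks like in practice (examples are illustrative):

```
Vague:  Tell me about my website idea.
Better: Brainstorm 10 feature ideas for a personal portfolio website.
```

The second version names the action ("brainstorm"), the quantity, and the object, so there's nothing for AI to guess at.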
3. CONSTRAINTS
Constraints are the rules and limits that shape the output. Length, difficulty, tone, what to include, what to avoid. Without constraints, AI will make its own choices — and they might not be what you want.
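A quick illustration of constraints layered onto a task (details are illustrative):

```
Explain how websites work.
Constraints: under 150 words, plain language, no technical jargon,
and include one everyday analogy.
```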
4. EXAMPLES
Showing AI what you want is often more effective than telling it. Even one example of your desired output format gives AI a concrete pattern to follow.
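For example, one sample entry can anchor the format for everything that follows (the project name and wording here are made up):

```
Write a short description for each of my three portfolio projects.
Match the style of this example:
"WeatherNow — a one-page app that shows local weather at a glance.
Built to practice working with live data."
```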
5. OUTPUT FORMAT
Tell AI exactly how to structure its response: bullet points, numbered lists, tables, paragraphs, JSON, code blocks. If you don't specify, AI picks a format that may not match your needs.
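An illustrative format specification:

```
Compare three color schemes for a portfolio website.
Format your response as a table with these columns:
Scheme | Mood | Best For | Example Colors
```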
Quick check: AI's explanation comes back too technical for your audience. Which component could you add to fix that?
Answer: A constraint ("use plain language, no technical jargon") or a role ("you're explaining this to a 14-year-old"). Both work — constraints directly limit the output, while roles shape the overall tone and complexity.
Putting It All Together
Let's build a complete prompt using all five components. Imagine you're working on a Track 2 project — a portfolio website:
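A five-component prompt for that scenario might look like this (every specific below is illustrative, not a required template):

```
Role: You are an experienced web designer who mentors beginners.
Task: Help me plan the page structure of a personal portfolio website.
Constraints: I'm a beginner — suggest no more than 5 pages, keep the
advice simple, and explain any technical terms you use.
Example: I like clean, single-column layouts, similar to a minimal
blog homepage.
Output format: A bulleted outline with one line per page describing
what belongs on it.
```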
This prompt gives AI a clear picture of what you need. The response will be specific, structured, and immediately useful for building your project. Compare that to: "Help me make a portfolio website." Night and day.
Quick check: True or false — a great prompt must include all five components.
Answer: False. Most prompts use 2–3 components. A quick question might only need a clear task. A complex project prompt might use all five. The components are tools in your toolkit — use the ones that fit the situation.
Iteration: Your First Prompt Is a Rough Draft
Here's something that even experienced AI users sometimes forget: your first prompt almost never produces perfect output. That's fine. It's expected. The real skill is knowing how to iterate.
The iteration loop:
- Send your prompt
- Read the response carefully
- Identify what's good and what's not right
- Revise your prompt (add constraints, clarify, give examples)
- Send the improved prompt
- Repeat until the output meets your needs
Iteration isn't a sign that you're bad at prompting. It's how prompting works. Professional developers, writers, and designers who use AI all iterate. The skill is doing it efficiently — identifying exactly what to change in your prompt to get closer to what you want.
Common iteration moves:
- Too long? Add: "Keep your response under 200 words."
- Too vague? Add specific constraints or an example of what you want.
- Wrong tone? Add a role or a tone instruction: "Write as if explaining to a friend."
- Missing something? Ask: "Also include [specific thing] in your response."
- Completely off track? Start a fresh conversation with a better prompt rather than trying to fix a confused thread.
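Here's one round of that loop in practice (the prompts are illustrative):

```
First prompt: Write an About Me section for my portfolio.
Problem:      400 words, stiff and formal.
Revised:      Write an About Me section for my portfolio.
              Keep it under 100 words and write as if you're
              introducing me to a friend.
```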
Key Concepts
- Great prompts can include five components: role, task, constraints, examples, and output format.
- You don't need all five every time. Use the ones that fit the situation.
- Role assignment changes the tone, expertise level, and vocabulary of AI's response.
- Constraints are the guardrails that keep AI's output useful: length, complexity, what to include/exclude.
- Showing examples of your desired output is often more effective than describing it.
- Iteration is the normal process of refining your prompt based on AI's response. Plan for 2–3 rounds.
Try It: Prompt Builder
Great prompts are assembled from components, like building blocks. In this interactive activity, you'll learn which components matter most in different situations.
Check Your Understanding
1. What does adding a "role" to your prompt do?
Explanation: When you say "You are a patient math tutor," the AI adjusts its language to be more explanatory and encouraging. Say "You are a senior engineer," and it uses more technical language. The role shapes the style of the entire response.
2. You ask AI to write a blog post and the result is 2,000 words when you only wanted 300. What prompt component should you add?
Explanation: A constraint like "Keep this under 300 words" directly limits the output length. Without length constraints, AI defaults to however much content its patterns suggest is typical for that type of request.
3. Which is more effective: telling AI what format you want, or showing it?
Explanation: An example gives AI a concrete pattern to match, which often produces more consistent results than a description alone. But telling also works, especially for common formats. Using both (a description plus an example) is the most reliable approach.
4. After three rounds of iteration, your AI output is still not right. What should you do?
Explanation: When a conversation gets off track, adding more instructions to the thread often makes things worse — the AI is now working from a muddled context. Starting fresh with a better prompt (incorporating what you learned from the failed attempts) is usually more effective.
Reflect & Write
Write 2–3 sentences: Think about the five prompt components (role, task, constraints, examples, format). Which one do you think you'll use most for your project? Which one seems most unfamiliar or unintuitive to you right now?
Project Checkpoint
Build a multi-component prompt for your project:
- Write a prompt that includes at least role, task, and constraints for one specific aspect of your project (e.g., "design the home page," "plan the data structure," "outline the features").
- Test it in an AI tool.
- Iterate at least once: identify what's not right in the response and revise your prompt.
- Log both versions (original and revised) in your Prompt Iteration Tracker (PDF).
Find this and all other downloadable resources on the Dashboard Resources page.
Level Up: Coming Next
Lesson 2.3 — Socratic Prompting: Asking Questions That Unlock Better Answers. What if instead of telling AI what to do, you asked it questions? Turns out, this produces deeper, more thoughtful output. You'll learn why — and practice the technique.
Continue to Lesson 2.3 →