Most prompt training courses I’ve seen are written for people who already think like engineers. Too much theory, technical terminology, and “as a language model.” The operators and consultants I speak with do not need any of that. They need to turn a messy meeting transcript into a clean exec brief before their 2 pm, and they’d like Claude to help.
So I’ve been sketching a series of short, Teachable courses that teach expert-level prompting to non-technical knowledge workers. One course per use case. Ten-minute lessons. A real prompt you can copy, and a real output you can compare against.
The problem: writing one course can take days. Writing a comprehensive series, weeks. The fix was to build a prompt that builds the course.
What does the prompt do? It builds complete Claude courses.
This prompt takes one input, the skill you want to teach, and outputs an end-to-end course package:
- Marketing copy (title, description, “You’ll Learn,” audience-fit bullets)
- A full course outline
- Per-lesson materials: a spoken script, a slide outline, a hands-on exercise, and a 3-question quiz
- A capstone project with a self-grading rubric
- A drip schedule and a 6-email sequence
- SEO keywords, meta description, and social post variants
Paste-ready Markdown. Nothing generic.
How it’s built
I followed Anthropic’s published prompting guidance pretty literally. There’s explicit role priming at the top. Inputs are wrapped in XML tags so Claude can tell them apart from instructions. Before writing a word of course content, Claude runs a <planning> pass: it sketches the learner, decomposes the outcome into micro-skills, and names the single prompt pattern the whole course will reinforce. Nothing ships until Claude runs a <self_check> pass against a quality checklist (jargon count, example coverage, capstone alignment, banned marketing words).
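The skeleton described above can be sketched in a few lines. This is an illustrative reconstruction, not the exact prompt from the repo: the section names match the post (`<planning>`, `<self_check>`), but the wording and the `build_course_prompt` helper are assumptions.

```python
# Illustrative sketch of the prompt skeleton: role priming at the top,
# the user's input wrapped in XML tags, then explicit planning and
# self-check passes. Wording is hypothetical, not the repo's prompt.

ROLE = "You are an expert curriculum designer for non-technical knowledge workers."

def build_course_prompt(skill: str) -> str:
    """Assemble the full prompt for a given skill-to-teach input."""
    return f"""{ROLE}

<skill_to_teach>
{skill}
</skill_to_teach>

Before writing any course content, work inside <planning> tags:
1. Sketch the target learner.
2. Decompose the outcome into micro-skills.
3. Name the single prompt pattern the whole course will reinforce.

Then draft the full course package in Markdown.

Finally, review the draft inside <self_check> tags against this checklist:
- Jargon count is near zero.
- Every lesson includes a worked example tied to the learner's use case.
- The capstone aligns with the stated outcome.
- No banned marketing words appear.
Revise anything that fails before presenting the final version."""

print(build_course_prompt("turning messy meeting transcripts into exec briefs"))
```

The XML tags do double duty here: they separate the user's input from the instructions, and they give Claude labeled scratch space for the planning and self-check passes.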
That structure matters more than it might sound. Without the planning pass, Claude produces a generic course about prompting. With it, Claude produces a course on your specific use case, taught through a single consistent pattern, with worked examples tied to that use case in every lesson.
What I learned building it
Three things surprised me. Consider them solid prompting guidance.
First: the “anti-curriculum” step, asking Claude what the course will deliberately not cover, does more for quality than positive scoping.
Second: the banned-words list in the self-check (“unlock,” “leverage,” “supercharge,” “game-changer,” “revolutionize”) changed the marketing copy more than any tone instruction I tried. Negative constraints beat positive ones when you’re fighting defaults.
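A negative constraint like this is easy to verify mechanically. Here is a toy version of the banned-words check: the word list comes from the post, while the `banned_words_found` function itself is an illustrative sketch, not anything from the repo.

```python
# Toy banned-words check, the kind of rule the self-check pass enforces.
# The word list is from the post; the function is illustrative only.
import re

BANNED = {"unlock", "leverage", "supercharge", "game-changer", "revolutionize"}

def banned_words_found(copy: str) -> list[str]:
    """Return the banned marketing words present in the copy, sorted."""
    # Tokenize into lowercase words, allowing one hyphen ("game-changer").
    tokens = re.findall(r"[a-z]+(?:-[a-z]+)?", copy.lower())
    return sorted(BANNED.intersection(tokens))

print(banned_words_found("Unlock and leverage this game-changer!"))
# → ['game-changer', 'leverage', 'unlock']
```

The point of putting the list in the prompt rather than in code is that Claude applies it while writing, not after; but a post-hoc check like this is a cheap safety net.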
Third: when you tell Claude “no coding, no prior prompt engineering, ten minutes in a free account,” the exercises get dramatically better. Constraints are not limits on quality. They are where the quality comes from.
Try it
The prompt is free to copy in my repo: https://github.com/bwarrene/blanewarrene/blob/main/prompts/teachable-course-builder-prompt.md
If you run it, send me your course title and topic. I’ll tell you honestly whether I’d take it.
My first courses are here and on Teachable. If you’d like a seat, reply to this post.