Overview #
When using the Model Context Protocol (MCP) with LearnDash, the quality of your prompts determines the accuracy and safety of the results. Whether working with Angie, Cursor, or another tool, clear, structured prompts ensure that AI agents act on your LearnDash site in predictable ways.
This guide explains best practices for prompting, highlights common pitfalls, and shows how different models may respond differently to the same request.
Why Prompting Matters #
The MCP allows AI tools to create and manage LearnDash content by sending actions through the REST API. However, the AI only acts on what is written in the prompt.
Good prompts = accurate results.
Vague prompts = errors or unexpected actions.
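To make this concrete, the sketch below shows the kind of structured REST request a well-formed prompt might be translated into by an AI agent. The endpoint path and field names are assumptions for illustration only, not a verified API contract; check the routes your site actually exposes under `/wp-json/ldlms/v2`.

```python
# Illustrative sketch: a clear prompt ("create a lesson called 'Introduction'
# in course 101, save it as a draft") leaves the agent nothing to guess when
# it builds the underlying REST request.

def prompt_to_request(course_id: int, lesson_title: str, status: str) -> dict:
    """Map an explicit prompt onto a concrete REST request description."""
    return {
        "method": "POST",
        "path": "/wp-json/ldlms/v2/sfwd-lessons",  # assumed LearnDash v2 route
        "body": {
            "title": lesson_title,
            "course": course_id,   # explicit parent course, per the hierarchy
            "status": status,      # "draft" or "publish" -- always stated
        },
    }

request = prompt_to_request(101, "Introduction", "draft")
print(request["body"]["status"])  # draft
```

A vague prompt ("add an intro lesson somewhere") leaves `course`, `status`, or both undefined, which is where errors and unexpected actions come from.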
Best Practices for Prompting #
In testing, effective LearnDash MCP prompts share several common traits. Use these principles to improve outcomes:
- Limit Scope
- Keep prompts to 3–5 operations maximum.
- Break complex workflows into smaller steps.
- Use Exact Field Names and Values
- Match the field names that LearnDash uses (e.g., “Start Date,” “Access Mode”).
- Avoid vague descriptions like “make it not purchasable.”
- Specify Draft vs. Published
- Always state whether new content should be saved as a draft or published.
- Example:
“Create a lesson called ‘Introduction’ in the course ‘Physics 101.’ Save it as a draft.”
- Use Clear Dates and Times
- Provide exact dates rather than relative terms.
- Example:
“Set the course start date to September 15, 2025, and the end date to December 20, 2025.”
- Reference Relationships Explicitly
- LearnDash content follows a hierarchy: Course → Lesson → Topic → Quiz.
- Make sure prompts respect this structure.
- Example:
“In the course ‘Anatomy 101,’ add a lesson called ‘Muscle Groups’ with a topic called ‘Upper Body Basics.’”
- Include Verification or Fallback Instructions (Optional)
- For important actions, ask the AI to confirm or provide a fallback.
- Example:
“Enroll students from this CSV into the course ‘Botany Basics.’ If any users don’t exist, return their names instead of failing.”
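To see why exact dates beat relative ones, here is a minimal sketch of how an agent might normalize a date written in a prompt into the ISO format APIs typically expect. The input format string is an assumption about the prompt's wording, not a LearnDash requirement:

```python
from datetime import datetime

def normalize_prompt_date(text: str) -> str:
    """Parse an explicit date like 'September 15, 2025' into ISO 8601.
    A relative phrase like 'next semester' would fail here, which is
    exactly why prompts should state exact dates."""
    return datetime.strptime(text, "%B %d, %Y").date().isoformat()

print(normalize_prompt_date("September 15, 2025"))  # 2025-09-15
```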
Prompt Examples #
Course Management #
“Update the course ‘Semester 3’ start date to November 5, 2025, and the end date to February 20, 2026.”
Content Management #
“Create a new lesson called ‘Final Project’ at the end of the course ‘Chemistry Basics.’ Enable assignment submission worth 100 points. Save as draft.”
Enrollment #
“Enroll Maria Lopez, Taylor Smith, and Jordan Lee in the course ‘Botany Basics.’”
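The fallback behavior described under best practices ("return their names instead of failing") can be sketched as follows. The user directory here is a stand-in for a real WordPress user lookup, and the names are taken from the example prompt:

```python
def enroll_with_fallback(names, known_users):
    """Enroll every name that resolves to a user ID; collect the rest
    instead of aborting the whole batch."""
    enrolled, missing = [], []
    for name in names:
        user_id = known_users.get(name)
        if user_id is None:
            missing.append(name)  # report unresolved names back to the user
        else:
            enrolled.append(user_id)  # a real agent would send an enrollment request here
    return enrolled, missing

users = {"Maria Lopez": 12, "Taylor Smith": 34}  # stand-in user directory
enrolled, missing = enroll_with_fallback(
    ["Maria Lopez", "Taylor Smith", "Jordan Lee"], users
)
print(missing)  # ['Jordan Lee']
```

Partial success with an explicit report is usually safer than an all-or-nothing failure on a large batch.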
Group Management #
“Create a group called ‘Fall 2025 Cohort’ and assign all courses tagged ‘Science’ to it.”
Content Organization #
“Add a LearnDash course tag called ‘AI.’ Assign this tag to all courses with titles that include ‘MCP’ or ‘LLM.’”
Content Creation #
“Use this transcript to create a LearnDash course titled ‘World History Lecture Series.’ Break it into 10 lessons with 2 topics each.”
Model Differences #
Different models respond differently to the same prompt:
- Angie
- Optimized for beginner-friendly tasks
- Safer defaults
- Requires clear, simple prompts
- Cursor with GPT-4 or Claude
- Handles more complex or multi-step prompts
- Better at parsing external documents (e.g., CSVs, transcripts)
- Open-source models (Ollama, Mistral, etc.)
- Require more precise prompts
- May not perform as well on complex tasks
- Strong option for privacy-focused use cases
Common Pitfalls #
Avoid these common mistakes when prompting with MCP:
- Including more than 5 operations in a single request
- Using vague language (e.g., “soon,” “later,” “final stuff”)
- Ignoring LearnDash’s content hierarchy
- Using relative dates like “next semester” instead of exact dates
- Forgetting to specify draft vs. published
Summary #
Effective prompting is the foundation of using MCP with LearnDash. Keep prompts small, precise, and structured. Always reference the correct LearnDash field names, respect content hierarchy, and specify draft/published status.