Design better courses.

Open-source instructional design skills for Claude Code. Every recommendation is backed by peer-reviewed research, so you always know how strong the evidence is.


What it does

Your course, reviewed by research

idstack is a design partner that lives inside Claude Code. Import your course, and it checks your work against peer-reviewed research across 11 domains. No jargon. No guessing. Just evidence.

See it work

Import a course, get an evidence-based audit in minutes.

claude code
You: /course-import

How do you want to import your course?
A) IMS Common Cartridge file (.imscc)
B) Paste course documents
C) Canvas API
D) PDF or document file
E) SCORM package (.zip)

You: B - here's my syllabus.

Analyzing... Found 12 modules, 8 assignments, 4 discussions.

You: /course-quality-review

Overall Score: 62/100

⚠ Learning Objectives - 8 modules lack stated objectives
⚠ Assessment - no rubrics, no alignment to objectives
⚠ Social Presence: 3/10 - no group work, minimal peer interaction

[Assessment-8] [T1] - elaborated feedback produces larger
learning gains than correctness-only feedback

Get started

Up and running in 5 minutes

git clone https://github.com/savvides/idstack.git && cd idstack && ./setup

New to Claude Code? No problem. Here's everything, step by step.

  1. Install Claude Code, Anthropic's AI assistant that runs in your terminal.
    Visit claude.ai/download and follow the instructions for your computer.
  2. Install idstack. Open your terminal and paste the command above, or:
    git clone https://github.com/savvides/idstack.git && cd idstack && ./setup
  3. Start Claude Code by typing claude in your terminal.
  4. Import your course. Type /course-import and paste your syllabus, upload a Common Cartridge file, or connect your Canvas API.
  5. Get your review. Type /course-quality-review to see your evidence-based course audit. You'll know in 5 minutes if this is for you.

Your design team

9 specialists. One workflow.

Each skill is a specialist on your team. Import your course and they get to work.

Needs Analyst
/needs-analysis
Three-level assessment before you build anything. Is training the right intervention? Task analysis and learner profiling.
Curriculum Designer
/learning-objectives
Measurable objectives with Bloom's taxonomy. Bidirectional alignment check: does each objective have a matching activity AND assessment?
Assessment Architect
/assessment-design
Evidence-based rubrics and feedback strategies. Formative checkpoints before summative assessments. Nicol's 7 principles of good feedback.
LMS Bridge
/course-import
Import from Common Cartridge, Canvas API, SCORM, PDF, or paste. Quick-scan quality flags and Bloom's pre-classification on import.
Content Generator
/course-builder
Generates syllabus, module pages, assignments, and rubrics from your manifest. Content adapts to learner expertise level.
Quality Auditor
/course-quality-review
Full Quality Matters audit plus Community of Inquiry presence analysis. 8 structural standards, 3 presence dimensions. Every finding cites its evidence.
Accessibility Reviewer
/accessibility-review
WCAG 2.1 AA compliance audit plus Universal Design for Learning review. Scores accessibility 0-100 with "Must Fix" and "Should Improve" tiers.
Adversarial Auditor
/red-team
Assumes your course is broken and tries to prove it. Alignment stress test, persona simulation, prerequisite chain integrity. Confidence score.
LMS Publisher
/course-export
Exports to IMS Common Cartridge, SCORM 1.2, or pushes directly to Canvas via API. The output is the course.
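To make the bidirectional alignment idea from /learning-objectives concrete, here is a minimal sketch in Python. The data shapes and the alignment_gaps function are hypothetical illustrations, not idstack's actual internals: every objective needs at least one activity AND one assessment, and every tagged item must map back to a stated objective.

```python
# Hypothetical sketch of a bidirectional alignment check.
# Objectives carry an "id"; activities and assessments carry an
# "objective_id" pointing back at the objective they cover.

def alignment_gaps(objectives, activities, assessments):
    """Return objectives missing coverage and items with no objective."""
    covered_by_activity = {a["objective_id"] for a in activities}
    covered_by_assessment = {a["objective_id"] for a in assessments}
    obj_ids = {o["id"] for o in objectives}

    return {
        "objectives_without_activity": sorted(obj_ids - covered_by_activity),
        "objectives_without_assessment": sorted(obj_ids - covered_by_assessment),
        "items_without_objective": sorted(
            (covered_by_activity | covered_by_assessment) - obj_ids
        ),
    }

objectives = [{"id": "LO1"}, {"id": "LO2"}]
activities = [{"objective_id": "LO1"}]
assessments = [{"objective_id": "LO1"}, {"objective_id": "LO3"}]

gaps = alignment_gaps(objectives, activities, assessments)
# LO2 has neither an activity nor an assessment; LO3 is tagged on an
# assessment but never stated as an objective.
```

The check runs in both directions on purpose: uncovered objectives are gaps in the course, while orphaned activities and assessments usually signal busywork or a missing objective.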

The workflow

/needs-analysis → /learning-objectives → /assessment-design →
/course-builder → /course-quality-review → /accessibility-review →
/red-team → /course-export

/course-import enters the workflow at /learning-objectives.

Each skill feeds into the next, but any skill works on its own. No pipeline required.
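As an illustration of the prerequisite chain integrity check that /red-team performs, here is a hypothetical sketch (the module names and data layout are invented for illustration): it flags prerequisites that reference modules that don't exist, and circular chains, which no learner could ever satisfy, using Kahn's topological sort.

```python
from collections import deque

def check_prereq_chain(modules):
    """modules: {module_id: [prerequisite_ids]} -> (missing, cyclic)."""
    missing = sorted({p for ps in modules.values() for p in ps if p not in modules})

    # Kahn's algorithm: repeatedly release modules whose prerequisites
    # are all met; anything left unordered sits on a cycle.
    indegree = {m: sum(1 for p in ps if p in modules) for m, ps in modules.items()}
    dependents = {m: [] for m in modules}
    for m, ps in modules.items():
        for p in ps:
            if p in modules:
                dependents[p].append(m)

    queue = deque(m for m, d in indegree.items() if d == 0)
    ordered = []
    while queue:
        m = queue.popleft()
        ordered.append(m)
        for d in dependents[m]:
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)

    cyclic = sorted(set(modules) - set(ordered))
    return missing, cyclic

modules = {
    "intro": [],
    "m2": ["intro"],
    "m3": ["m2", "m4"],   # m4 is never defined
    "a": ["b"],
    "b": ["a"],           # circular prerequisites
}
missing, cyclic = check_prereq_chain(modules)
# missing -> ["m4"], cyclic -> ["a", "b"]
```

A dangling prerequisite and a two-module cycle are the failure modes a learner hits as "the course tells me to do something that doesn't exist" and "I can't start either module first".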


Evidence Base

Every recommendation cites its research.

11 domains. 5 evidence tiers. Stronger evidence takes precedence when tiers conflict.

Instructional Design       12 papers cited
Alignment & Objectives     10 papers cited
Needs Analysis              7 papers cited
Cognitive Load             15 papers cited
Assessment & Feedback      10 papers cited
Multimedia Learning        10 papers cited
Learner Analysis            8 papers cited
Evaluation Models           8 papers cited
Rapid Prototyping           9 papers cited
Online Quality             10 papers cited
Accessibility & UDL         9 papers cited

T1: Meta-analyses, RCTs
T2: Quasi-experimental
T3: Systematic reviews
T4: Observational
T5: Expert opinion
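The tier precedence rule above can be sketched in a couple of lines. The resolve helper and the sample findings are hypothetical, not idstack's implementation; the point is simply that a lower tier number means stronger evidence and wins a conflict.

```python
# Hypothetical sketch: when findings conflict, prefer the lowest tier
# number (T1 = meta-analyses/RCTs ... T5 = expert opinion).

def resolve(findings):
    """findings: list of (tier, recommendation); lower tier wins."""
    return min(findings, key=lambda f: f[0])

conflict = [
    (3, "Correctness-only autograded quizzes are sufficient"),
    (1, "Elaborated feedback outperforms correctness-only feedback"),
]
tier, recommendation = resolve(conflict)
# The T1 finding takes precedence over the T3 finding.
```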