Most teachers improve by accident. A lesson goes well, you vaguely remember why, and you try something similar next time. Maybe. If you remember.
I decided to stop improving by accident. For one full school year, I ran one deliberate experiment per week using the PDCA cycle (Plan-Do-Check-Act). Here’s what happened.
What Is PDCA and Why Should Teachers Care?
PDCA grew out of Walter Shewhart's work at Bell Labs and was popularized by W. Edwards Deming, the statistician who helped rebuild Japanese manufacturing after WWII [1]. Toyota used it. Samsung used it. The principle is simple: every process can be improved, and improvement requires measurement.
Applied to teaching:
- Plan: Identify one thing to change, predict the outcome
- Do: Implement in one class
- Check: Measure what actually happened
- Act: Keep it, modify it, or abandon it
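For anyone who does want a digital record alongside the shoebox, the four steps map onto a tiny tracker. This is an illustrative sketch only — the class and field names are made up, not part of any tool mentioned here:

```python
from dataclasses import dataclass

# One card per weekly experiment, mirroring the four PDCA steps.
@dataclass
class Experiment:
    number: int
    testing: str          # Plan: the one thing to change
    hypothesis: str       # Plan: the predicted outcome
    measurement: str      # Check: how it will be measured
    result: str = ""      # Check: what actually happened (filled in Friday)
    decision: str = ""    # Act: "standardize", "modify", or "abandon"

def hit_rate(cards: list[Experiment]) -> float:
    """Fraction of decided experiments that became permanent practice."""
    done = [c for c in cards if c.decision]
    return sum(c.decision == "standardize" for c in done) / len(done)

# Monday: fill in the Plan fields.
card = Experiment(17, "2-minute hook question instead of announcements",
                  "Students engaged 3 minutes faster",
                  "Time from bell to full engagement (stopwatch)")
# Friday: record the Check and Act fields.
card.result = "Engagement came roughly 2.5 minutes faster on average"
card.decision = "standardize"
print(hit_rate([card]))  # prints 1.0 with this single card
```

The point of the `decision` field is the same as writing on the back of the card: an experiment isn't finished until you've explicitly kept, modified, or abandoned it.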
My Weekly Template
Every Monday morning, I filled out a card:
Experiment #17
Testing: Starting class with a 2-minute “hook question” instead of announcements
Hypothesis: Students will be seated and engaged 3 minutes faster
Measurement: Time from bell to full class engagement (stopwatch)
Class: 3rd period (my most chaotic)
Friday, I wrote the result on the back of the card. I kept them in a shoebox. Analog on purpose — I knew I wouldn’t maintain a spreadsheet.
5 Experiments That Transformed My Teaching
#3: Brain Dumps (Week 3)
Test: 3-minute memory retrieval at start of every class.
Result: Unit test scores up 18% after 3 weeks. Became permanent. [2]
#11: Standing Desks for 10 Minutes
Test: After 25 minutes seated, everyone stands for 10 minutes of group work.
Result: Off-task behavior dropped 60%. Students self-reported feeling “less dead.” Standardized.
#19: Music During Independent Work
Test: Lo-fi instrumental playlist during solo work time.
Result: Mixed. Some students loved it, others found it distracting. Modified to: student choice with headphones.
#27: Peer Teaching Fridays
Test: Students teach the previous week’s hardest concept to a partner for 8 minutes.
Result: The students who taught scored higher than those who were taught. Consistent with research on the protégé effect [3]. Standardized every other Friday.
#34: Two Stars and a Wish
Test: Peer feedback format: 2 specific compliments + 1 improvement suggestion.
Result: Students gave more actionable feedback than with open-ended “give feedback.” Quality of revised work improved noticeably.
What the Numbers Looked Like
Out of 40 experiments:
- 22 standardized (became part of my regular practice)
- 11 modified (worked with adjustments)
- 7 abandoned (didn’t work or too resource-intensive)
That’s a 55% hit rate — 83% if you also count the modified versions. In one year, I added 22 proven strategies to my teaching toolkit. The year before, when I was improving by accident? Maybe 2 or 3.
Why Most Professional Development Fails
The research on teacher PD is damning. Most workshops produce zero lasting change in classroom practice [4]. The reason: PD tells you what to do, but it doesn’t create a feedback loop to verify whether you’re actually doing it, or whether it’s working.
PDCA is PD with the feedback loop built in. Every week, you’re learning something specific about your students, your classroom, and your practice. It’s slow — one variable at a time. But it compounds.
Getting Started
Don’t overthink it. Monday: write what you’ll try. Friday: write what happened. One experiment per week. After a year, you won’t be the same teacher.
After five years? You’ll have run 200 experiments. The compound interest on that is staggering.
References
[1] Deming WE. Out of the Crisis. MIT Press, 1986.
[2] Roediger HL, Karpicke JD. “Test-enhanced learning: Taking memory tests improves long-term retention.” Psychological Science, 17(3), 249-255, 2006. DOI: 10.1111/j.1467-9280.2006.01693.x
[3] Nestojko JF, et al. “Expecting to teach enhances learning and organization of knowledge.” Memory & Cognition, 42(7), 1038-1048, 2014. DOI: 10.3758/s13421-014-0416-z
[4] Kraft MA, Blazar D, Hogan D. “The effect of teacher coaching on instruction and achievement.” Review of Educational Research, 88(4), 547-588, 2018.