Definition of Done: The Checklist Nobody Agrees On

Your team's Definition of Done is either too vague to be useful or so detailed nobody follows it. Here's how to build one that actually works.

A developer pushes code to production. The ticket moves to "Done." Two days later, the bug reports start rolling in. QA never tested it. Documentation wasn't updated. The feature flag is still off for 90% of users.

Is it done? Well, the code merged. But is it done done?

This is the Definition of Done (DoD) problem: every team has one, but half the time it's either too vague ("all work complete") or so exhaustive nobody actually follows it.

The DoD isn't just a formality—it's your quality bar. When it's unclear, you get half-finished features, technical debt, and endless "is this ready to ship?" Slack threads. When it's too strict, you get developers checking boxes instead of thinking, and velocity grinds to a halt.

Here's how to build a DoD that your team will actually use.

The Three Levels of Done

Most teams make one critical mistake: they try to create one Definition of Done for everything. But "done" means different things at different stages.

You need three:

1. Code Done (Developer)

What must be true before code review?

2. Story Done (Team)

What must be true before moving to "Ready for Deploy"?

3. Production Done (Product)

What must be true before calling it shipped?

Most confusion happens because someone says "done" when they mean Code Done, but the PM hears Production Done. Be explicit about which level you're talking about.
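One way to force that explicitness is to put the level into the status update itself. A toy sketch in Python (the enum names mirror the three levels above; nothing here is a standard API or tracker integration):

```python
from enum import Enum


class DoneLevel(Enum):
    """Explicit 'done' levels so a status update can't be misread."""
    CODE = 1        # ready for code review
    STORY = 2       # ready for deploy
    PRODUCTION = 3  # shipped and live for users


def status_message(ticket_id: str, level: DoneLevel) -> str:
    """Build an unambiguous status line for a ticket."""
    return f"{ticket_id}: {level.name} done"


# "PROJ-142: CODE done" reads very differently from
# "PROJ-142: PRODUCTION done" -- which is exactly the point.
print(status_message("PROJ-142", DoneLevel.CODE))
```

The ticket ID and enum values are made up for illustration; the useful part is that "done" never appears without its level attached.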

Real Example: The Checkbox That Caught Fire

I once worked with a team whose DoD included "Code reviewed by two engineers." Sounds reasonable, right?

Problem: On a 4-person dev team, this meant every PR needed half the team to stop work and review. Velocity tanked. Reviews became rubber-stamps because nobody had time for deep analysis.

We changed it to: "Code reviewed by one engineer, plus automated test coverage >80%." Single reviewer for most PRs, second reviewer only for architecture changes or security-sensitive code.

Result: Review time dropped 40%, but quality didn't suffer because the tests caught what humans missed.
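A coverage gate like that only works if no human has to police it. A minimal sketch of the check, assuming the team uses coverage.py or pytest-cov, both of which emit a Cobertura-style `coverage.xml` with an overall `line-rate` attribute (the 80% bar is this team's number, not a standard):

```python
import xml.etree.ElementTree as ET

COVERAGE_BAR = 0.80  # the team's agreed line-coverage threshold


def coverage_ok(xml_path: str, threshold: float = COVERAGE_BAR) -> bool:
    """True when the coverage report meets the bar.

    Reads the overall line-rate from a Cobertura-style coverage.xml,
    the format produced by coverage.py's `coverage xml` command and
    by pytest-cov's --cov-report=xml option.
    """
    root = ET.parse(xml_path).getroot()
    return float(root.attrib["line-rate"]) >= threshold
```

A CI step can call this and fail the build, or automatically request a second reviewer, when it returns `False`, so the rule enforces itself.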

Lesson: Your DoD should enable shipping, not block it.

What Doesn't Belong in a DoD

Some things teams put in their DoD should live elsewhere:

- Process steps ("demo at sprint review," "move the ticket on the board") belong in your team's workflow docs, not a quality bar.
- Person-dependent approvals ("sign-off from the design lead") belong in a launch plan; they're the "ask someone" steps that break down across timezones.
- Estimation and planning rules belong in your Definition of Ready, which governs when work can start, not when it's finished.

Your DoD is about quality standards, not process steps. Keep it focused.

The Async DoD Template

For remote teams, your DoD needs to be self-service. No "check with Sarah before deploying" steps. Here's a template that works across timezones:


```markdown
## Story Done Checklist

**Code Quality:**
- [ ] Code reviewed and approved
- [ ] Automated tests passing (CI green)
- [ ] No linter/type errors

**Functionality:**
- [ ] Acceptance criteria met
- [ ] Tested in staging by developer
- [ ] Edge cases handled (errors, loading, empty states)

**Documentation:**
- [ ] Inline code comments for complex logic
- [ ] README updated (if public API changed)
- [ ] Changelog entry added (if customer-facing)

**Deploy Readiness:**
- [ ] Feature flag configured (or deployment plan documented)
- [ ] Monitoring/logs in place
- [ ] Rollback plan identified
```

Notice: No "ask someone" steps. Everything is verifiable by the developer or automated tools.
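Because the checklist is plain markdown checkboxes, even the "did you finish the checklist?" step can be automated. A small sketch (the bot/CI wiring around it is assumed, not shown) that pulls the still-open items out of a PR description:

```python
import re

# Matches GitHub-style task list lines: "- [ ] item" or "- [x] item"
CHECKBOX = re.compile(r"^\s*- \[( |x|X)\] (.+)$")


def unchecked_items(pr_body: str) -> list[str]:
    """Return the checklist items in a PR description that are still open.

    A pre-merge CI step or bot could post these back as a blocking
    comment, so no human has to eyeball the checklist.
    """
    open_items = []
    for line in pr_body.splitlines():
        m = CHECKBOX.match(line)
        if m and m.group(1) == " ":
            open_items.append(m.group(2))
    return open_items
```

For example, a PR body containing `- [x] Code reviewed and approved` and `- [ ] Rollback plan identified` would come back with just the rollback item, telling the developer exactly what's left.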

When Your DoD Is Working

You know your DoD is good when:

- Tickets that reach "Done" stay done: no bug reports or "wait, is this actually ready?" threads two days later.
- Every item is verifiable by the developer or an automated tool, with no waiting on a specific person's sign-off.
- Nobody has to ask which level of done a ticket is at, because the checklist makes it explicit.

You know it's broken when:

- Reviews and checks have become rubber stamps that nobody reads closely.
- Velocity stalls because the checklist blocks routine, low-risk work.
- People say "done" in standup and mean three different things.

Takeaways

- "Done" means different things at different stages. Define Code Done, Story Done, and Production Done separately, and say which one you mean.
- Make every item self-service: verifiable by the developer or an automated tool, never "check with Sarah."
- Your DoD should enable shipping, not block it. If a rule slows the team without catching defects, change the rule.


Modern Project Management for Distributed Teams

PM Squared shares practical tools, templates, and lessons for PMs navigating remote work in 2026.
