Done Criteria
A shared, measurable checklist that defines the minimum quality and completeness required for a backlog item or increment to be considered complete and potentially shippable. It is created collaboratively by the Scrum Team, including the Product Owner, and is applied consistently across Sprints, evolving through inspection and adaptation.
Key Points
- Defines the quality bar for "potentially shippable" and prevents partial or hidden work from slipping through.
- Different from acceptance criteria, which are story-specific; Done Criteria are team- or product-wide.
- Created collaboratively during release or sprint planning and refined in retrospectives.
- Acts as an input to Create Tasks and Create Deliverables, and as a gate in Demonstrate and Validate Sprint.
- Promotes transparency and consistent expectations across the Scrum Team and stakeholders.
- Should be objective, testable, and feasible within a Sprint to avoid unmet commitments.
Purpose
Done Criteria provide a single, unambiguous standard for completion that aligns the team on quality, risk controls, and stakeholder expectations. They reduce rework, support reliable forecasting, and enable the Product Owner to accept increments with confidence.
In the SBOK flow, they guide execution and validation by anchoring development, testing, and integration activities, ensuring each increment meets the same baseline before it is considered for release.
Key Terms & Clauses
- Product-level Done Criteria: A global checklist for the entire product or release.
- Team/Sprint-level Checklist: The practical, Sprint-applied form used in day-to-day work.
- Acceptance Criteria: Conditions per user story; these confirm functionality, while Done Criteria confirm overall completeness and quality.
- Nonfunctional Requirements: Performance, security, usability, and other quality attributes included in the checklist.
- Compliance/Standards: Regulatory, policy, and organizational standards the increment must meet.
- All-or-nothing: If any item is unmet, the work is not done.
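The all-or-nothing clause above can be sketched as a simple boolean gate. This is a minimal illustration in Python; the checklist items are hypothetical examples, not part of any framework.

```python
# Minimal sketch of the all-or-nothing rule: a backlog item is "done"
# only when every Done Criteria item is satisfied. Checklist contents
# here are illustrative placeholders.
done_criteria = {
    "code peer-reviewed": True,
    "unit tests passing": True,
    "documentation updated": False,  # one unmet item
}

def is_done(checklist: dict) -> bool:
    """All-or-nothing: every clause must be met for the item to be done."""
    return all(checklist.values())

print(is_done(done_criteria))  # one clause is unmet, so the item is not done
```

A single `False` entry is enough to block completion, which mirrors the rule that partially done work is simply not done.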
How to Develop/Evaluate
Developing Done Criteria:
- Gather inputs: organizational standards, architecture guidelines, regulatory needs, and product quality goals.
- Draft a concise, testable checklist that covers build, test, integration, documentation, and deployment readiness.
- Validate feasibility within a Sprint and align with the Product Owner and stakeholders.
- Automate verification where possible through CI/CD, static analysis, and test suites.
- Publish and make it visible; refine during Retrospect Sprint based on learnings.
Evaluating completion:
- Confirm story acceptance criteria are met first, then apply the Done Criteria checklist.
- Map each clause to evidence: passing tests, code review logs, coverage thresholds, security scans, and updated documentation.
- Keep the Definition of Ready (entry conditions) separate from Done Criteria (exit conditions) so it is clear when work may start and when it is complete.
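The evidence-mapping step above can be sketched as follows. This is a hypothetical illustration; the clause names and evidence strings are assumptions standing in for real sources such as CI results, review logs, and scan reports.

```python
# Sketch: map each Done Criteria clause to verifiable evidence, then
# report any clauses that lack evidence. A clause with no evidence
# (None) is treated as unmet.
evidence = {
    "peer review complete": "review log: approved",
    "unit tests >= 80% coverage on changed code": "CI run: coverage 84%",
    "security scan clean": None,  # no evidence yet, so this clause is unmet
}

# Collect every clause whose evidence is missing.
unmet = [clause for clause, proof in evidence.items() if proof is None]

if unmet:
    print("Not done. Missing evidence for:", ", ".join(unmet))
else:
    print("All Done Criteria satisfied.")
```

Listing the unmet clauses, rather than returning a bare pass/fail, makes the gate actionable: the team sees exactly which evidence is still owed before the increment can be accepted.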
How to Use
In planning, use Done Criteria to decompose user stories into tasks that include testing, integration, and documentation activities. In estimating, include the effort to satisfy each clause to avoid underestimation.
During execution, use them as a working agreement for code reviews, test coverage, and integration. In Demonstrate and Validate Sprint, apply them as the acceptance gate for the increment. In Retrospect Sprint, adapt the checklist to improve quality and flow.
Process linkage in SBOK: input to Create Tasks and Create Deliverables; applied in Demonstrate and Validate Sprint; refined as an output of Retrospect Sprint; referenced during Approve, Estimate, and Commit User Stories to calibrate scope and effort.
Example Snippet
- Code peer-reviewed and merged to main branch with no critical static analysis issues.
- Unit tests written and passing; minimum 80% coverage for changed code.
- Integrated and passing end-to-end tests in CI; no high-severity defects open.
- Security and performance checks passed against agreed thresholds.
- User-facing documentation and release notes updated.
- Product Owner acceptance recorded; feature toggle strategy defined if applicable.
Risks & Tips
- Risk: Vague criteria lead to debates in Sprint Review and unplanned carryover.
- Risk: Overly heavy criteria slow delivery and encourage workarounds.
- Risk: Inconsistent criteria across teams undermine integration in multi-team releases.
- Tip: Keep each clause measurable and evidence-based to enable fast verification.
- Tip: Automate tests and checks to make Done Criteria cheaper to satisfy.
- Tip: Regularly prune and tune the checklist in retrospectives to balance quality and flow.
PMP/SCRUM Example Question
A team finishes a user story that meets its acceptance criteria, but performance tests and documentation updates are not completed. What should guide the Product Owner's decision to accept or reject the story in the Sprint Review?
- A. Definition of Ready
- B. Done Criteria
- C. Sprint Goal
- D. Release burnup chart
Correct Answer: B (Done Criteria)
Explanation: Acceptance criteria confirm the story's functionality, while Done Criteria ensure overall completeness and quality (tests, integration, documentation). If Done Criteria are not met, the item is not done.