User Story Acceptance Criteria
User story acceptance criteria are clear, testable conditions that a user story must satisfy for the Product Owner to accept it. Created collaboratively during backlog refinement and finalized during Sprint Planning, they guide development, testing, and validation. In SBOK, they act as inputs for estimating and committing stories and as the basis for acceptance during Demonstrate and Validate Sprint.
Key Points
- Defines the minimum conditions for a user story to be accepted by the Product Owner.
- Written in clear, testable language that enables objective verification.
- Co-created by Product Owner and team during backlog refinement; confirmed in Sprint Planning.
- Acts as input for estimating, task creation, and test case design.
- Used in Sprint Review to validate that the story is done and in SBOK to support Accept Deliverables.
- Different from Definition of Done, which applies across stories; acceptance criteria are story-specific.
Purpose
The purpose of acceptance criteria is to align stakeholders and the Scrum Team on what success looks like for a specific user story. They remove ambiguity, enable precise estimation, and support effective testing and acceptance.
In SBOK-driven Scrum, they provide a clear handshake between business expectations and technical delivery, improving flow from Create User Stories through Demonstrate and Validate Sprint and Accept Deliverables.
Key Terms & Clauses
- Given-When-Then: A structured, behavior-focused way to express conditions and outcomes.
- Functional behavior: Observable outcomes when a user interacts with the system.
- Nonfunctional constraints: Performance, security, usability, and compliance conditions.
- Negative paths: Error handling, invalid inputs, or boundary cases to prevent surprises.
- Edge conditions: Limits for inputs, dates, ranges, and precision rules.
- Traceability: Each criterion maps back to the user story and test cases.
- DoD vs acceptance criteria: DoD is a shared quality bar across work; acceptance criteria are unique to a story.
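The Given-When-Then structure and the traceability requirement above can be sketched as a small data shape. This is a minimal illustration only; the story and test-case IDs are hypothetical, not taken from SBOK:

```python
from dataclasses import dataclass, field

@dataclass
class AcceptanceCriterion:
    """One story-specific, testable condition in Given-When-Then form."""
    story_id: str   # traceability: links the criterion back to its user story
    given: str      # precondition or context
    when: str       # action the user performs
    then: str       # observable, verifiable outcome
    test_ids: list = field(default_factory=list)  # traceability to test cases

# Hypothetical IDs for illustration.
criterion = AcceptanceCriterion(
    story_id="US-101",
    given="a valid user is logged in",
    when="they submit the form with required fields",
    then="the system saves the data and shows a confirmation message",
    test_ids=["TC-101-01"],
)

# A criterion with no linked test case is not yet testable.
assert criterion.test_ids, "criterion must map to at least one test case"
```

Keeping the story ID and test IDs on the criterion itself makes the traceability check mechanical rather than a matter of review-time memory.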
How to Develop/Evaluate
- Collaborate early: During backlog refinement, the Product Owner drafts criteria with input from developers and testers.
- Focus on outcomes: Describe customer-visible behavior, not internal implementation.
- Make them testable: Each criterion must be verifiable via a test case or demonstration step.
- Use structured language: Prefer Given-When-Then or clear bullet statements for consistency.
- Cover unhappy paths: Include errors, permissions, boundary inputs, and performance targets where relevant.
- Right-size for estimation: Ensure criteria are specific enough to support accurate sizing before commitment.
- Review in Sprint Planning: Confirm feasibility and shared understanding before the team commits.
How to Use
- As an Output: From Create User Stories and Refine Prioritized Product Backlog, acceptance criteria are documented alongside the story in the Product Backlog.
- As an Input: To Estimate User Stories and Commit User Stories, the team relies on criteria to understand scope and complexity.
- As an Input: To Create Tasks, criteria inform development and testing tasks and acceptance test design.
- As an Input: During the Sprint and Daily Standup, criteria guide progress checks and clarify done for the story.
- As an Input: In Demonstrate and Validate Sprint, criteria drive the Sprint Review demonstration and Product Owner acceptance.
- As Support: For Accept Deliverables and Ship Deliverables, criteria help confirm that increments meet business needs.
Example Snippet
Example acceptance criteria for a generic user story:
- Given a valid user is logged in, when they submit the form with required fields, then the system saves the data and shows a confirmation message.
- Given an invalid input in any required field, when the user submits, then the system prevents save and shows a specific error message near the field.
- Given normal network conditions, when the user submits, then the response time is under 2 seconds for 95 percent of requests.
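As a sketch of how such criteria map to tests, the first two criteria above can be checked against a hypothetical form-submission function. The `submit_form` name, its fields, and its return values are assumptions for illustration, not part of the story:

```python
def submit_form(fields):
    """Hypothetical form handler: returns (saved, message)."""
    required = {"name", "email"}
    # A required field is "missing" if absent or empty.
    missing = required - {k for k, v in fields.items() if v}
    if missing:
        # Criterion 2: prevent save and name the offending field(s).
        return False, f"error: missing {sorted(missing)}"
    # Criterion 1: save and confirm.
    return True, "confirmation: data saved"

# Criterion 1: valid required fields -> saved, confirmation shown.
saved, message = submit_form({"name": "Ada", "email": "ada@example.com"})
assert saved and message.startswith("confirmation")

# Criterion 2: empty required field -> save prevented, specific error shown.
saved, message = submit_form({"name": "Ada", "email": ""})
assert not saved and "email" in message
```

Each assertion traces back to exactly one criterion, which is the testability bar the criteria are meant to meet; the performance criterion would instead be verified by a load-test tool rather than a unit test.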
Risks & Tips
- Risk: Vague criteria cause rework and disputes in Sprint Review. Tip: Use concrete, measurable language.
- Risk: Overly technical criteria hide customer value. Tip: Describe behavior from the user’s perspective.
- Risk: Missing edge cases. Tip: Add negative paths and boundary values during refinement.
- Risk: Scope creep mid-sprint. Tip: Freeze criteria at commitment; handle new ideas as new backlog items.
- Risk: Criteria not testable. Tip: Pair with QA to ensure each criterion maps to at least one test.
- Risk: Confusion with Definition of Done. Tip: Keep a visible DoD and link story-specific criteria to it.
PMP/SCRUM Example Question
During Sprint Planning, the team cannot size a user story because expected behavior is unclear. What should they request to proceed confidently?
- A. A detailed technical design from the lead developer.
- B. User story acceptance criteria defined with the Product Owner.
- C. A longer sprint to allow for discovery and build.
- D. An updated Definition of Done specific to this story.
Correct Answer: B — User story acceptance criteria defined with the Product Owner.
Explanation: Acceptance criteria provide testable conditions that clarify scope and enable estimation and commitment. A technical design or longer sprint does not replace clear, story-level acceptance conditions; the DoD is shared, not story-specific.