Metrics and Measuring Techniques
Metrics and Measuring Techniques are quantitative and visual methods used in Scrum to track progress, quality, and value delivery across sprints and releases. They convert observations into objective data such as velocity, burndown or burnup, defects, and flow to support transparency, forecasting, and continuous improvement.
Key Points
- Tool and technique in SBOK used to quantify progress, quality, and value at sprint and release levels.
- Common measures include sprint and release burndown or burnup, velocity, defect metrics, lead and cycle time, and throughput.
- Emphasizes trends and transparency to enable inspect-and-adapt, not to evaluate individuals.
- Used continuously in Scrum events such as Daily Scrum, Sprint Review, and Sprint Retrospective.
- Supports forecasting of completion dates and scope based on empirical data.
- Lightweight data collection with simple visualizations; automate where possible.
Purpose of Analysis
The aim is to make progress and quality visible, surface bottlenecks early, and enable evidence-based decisions. This helps the Product Owner manage expectations, the team plan realistically, and stakeholders understand delivery outlook and risks.
Use these techniques throughout the project: update daily during the sprint, review trends with stakeholders at the Sprint Review, and identify improvements in the Retrospective.
Method Steps
- Set measurement goals and questions, such as predictability, flow, and product quality.
- Select metrics and define clear operational definitions, including what counts as Done.
- Plan data capture and tooling: who updates, when, and how charts are generated.
- Collect data continuously during the sprint from the Scrumboard, tests, and defect logs.
- Visualize with charts and simple dashboards: burndown or burnup, velocity, flow, and defects.
- Analyze trends and variances; distinguish backlog changes from execution issues.
- Forecast using velocity or throughput and communicate confidence ranges when needed.
- Decide improvement actions, update impediments, and track follow-through in later sprints.
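The forecasting step above can be sketched in a few lines. This is a minimal illustration, not a prescribed tool: the function name, the sample velocities, and the 78-point backlog are assumptions chosen for the example, and the min/max spread stands in for a proper confidence interval.

```python
# Hypothetical forecasting sketch: derive a sprint-count forecast with a
# crude confidence range from recent velocity history.
from statistics import mean
import math

def forecast_sprints(remaining_points, velocities):
    """Forecast sprints to completion from the last few sprints' velocity."""
    avg = mean(velocities)                         # average velocity
    low, high = min(velocities), max(velocities)   # crude best/worst spread
    return {
        "likely": math.ceil(remaining_points / avg),
        "best_case": math.ceil(remaining_points / high),
        "worst_case": math.ceil(remaining_points / low),
    }

print(forecast_sprints(78, [24, 28, 25, 27]))
# → {'likely': 3, 'best_case': 3, 'worst_case': 4}
```

Communicating the range ("3 to 4 sprints") rather than a single number keeps the forecast honest about velocity variation.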
Inputs Needed
- Product backlog with estimates, acceptance criteria, and release goals.
- Sprint backlog, task board statuses, and Definition of Done.
- Historical velocity or throughput and prior sprint metrics.
- Automated test results, defect logs, and quality checks.
- Scope changes, impediment log, and calendar of timeboxes.
- Tool exports from ALM or Scrumboard (e.g., timestamps, state changes).
Outputs Produced
- Sprint and release burndown or burnup charts showing work remaining or completed.
- Velocity trend and basic forecast of scope or date based on empirical delivery.
- Flow insights such as cumulative flow, lead time, cycle time, and work in progress.
- Quality metrics report including defect density, escaped defects, and test pass rates.
- Updated impediment log and actionable process improvements.
- Stakeholder summaries that link progress, quality, and forecast to release goals.
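The flow insights listed above come directly from state-change timestamps. A minimal sketch, assuming a Scrumboard export with created, started, and done dates (the item IDs and dates here are invented for illustration):

```python
# Illustrative flow-metrics sketch: lead time (request to delivery) and
# cycle time (active work) from hypothetical state-change timestamps.
from datetime import date

items = [
    {"id": "US-1", "created": date(2024, 5, 1), "started": date(2024, 5, 3), "done": date(2024, 5, 8)},
    {"id": "US-2", "created": date(2024, 5, 2), "started": date(2024, 5, 6), "done": date(2024, 5, 9)},
]

for it in items:
    lead = (it["done"] - it["created"]).days    # waiting + working time
    cycle = (it["done"] - it["started"]).days   # working time only
    print(it["id"], "lead:", lead, "cycle:", cycle)
# → US-1 lead: 7 cycle: 5
# → US-2 lead: 7 cycle: 3
```

A large gap between lead and cycle time usually signals queueing before work starts, which is exactly the kind of bottleneck these metrics are meant to surface.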
Interpretation Tips
- Prefer trends over single data points; use rolling averages to smooth noise.
- Do not compare velocity across teams; use it only within the same stable team.
- When scope is volatile, use burnup to show both total scope and work completed.
- Validate that Done means potentially shippable; otherwise metrics are misleading.
- Watch WIP and aging work items to find bottlenecks before they delay delivery.
- Combine numbers with qualitative feedback from Sprint Review for full context.
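The rolling-average tip above can be made concrete with a short sketch; the three-sprint window and the velocity numbers are assumptions for illustration only.

```python
# Sketch of smoothing a noisy velocity trend with a rolling average
# over a three-sprint window.
def rolling_average(values, window=3):
    return [round(sum(values[i - window + 1:i + 1]) / window, 1)
            for i in range(window - 1, len(values))]

velocities = [22, 30, 24, 28, 26, 25]
print(rolling_average(velocities))
# → [25.3, 27.3, 26.0, 26.3]
```

The smoothed series makes the underlying trend visible where sprint-to-sprint swings of several points would otherwise dominate.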
Example
A Scrum Team runs two-week sprints. Their average velocity over the last four sprints is 26 points, and the release backlog shows 78 points remaining. They create a release burnup showing 52 points completed against 130 total points of scope, which recently increased by 10 points.
Using the velocity trend, the Product Owner forecasts about three sprints to finish the current scope. The team also tracks escaped defects, which dropped from 6 to 2 per sprint after adding automated tests, indicating improved quality.
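Taking the example's figures at face value, the arithmetic checks out as follows (a quick verification sketch, using ceiling division for the sprint count):

```python
# Verifying the worked example: 52 points done, 78 remaining, velocity 26.
completed, remaining = 52, 78
avg_velocity = 26

total_scope = completed + remaining           # burnup's total-scope line
sprints_left = -(-remaining // avg_velocity)  # ceiling division
print(total_scope, sprints_left)
# → 130 3
```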
Pitfalls
- Using metrics to judge individuals instead of improving the system.
- Comparing velocities across teams or changing estimation scales mid-release.
- Ignoring scope changes, which can make burndown appear flat or misleading.
- Tracking vanity metrics like hours logged or utilization that do not reflect value.
- Over-collecting data, creating waste and confusion instead of insight.
- Gaming the numbers or redefining Done to look better rather than fixing issues.
PMP/SCRUM Example Question
During Sprint 4, the burndown chart looks flat even though the team delivered several user stories. The Product Owner added new stories mid-sprint. What measuring technique should the Scrum Master use to best communicate progress while showing the impact of scope change?
- A. Sprint burndown chart only.
- B. Burnup chart with separate lines for total scope and work completed.
- C. Total hours logged report.
- D. Team utilization graph.
Correct Answer: B — Burnup chart with separate lines for total scope and work completed.
Explanation: A burnup chart clearly shows progress and scope growth at the same time. The other options either hide scope changes or focus on activity rather than value delivered.