Manage Quality Assurance
| Governance / Executing / Manage Quality Assurance | | |
|---|---|---|
| Inputs | Tools & Techniques | Outputs |
Inputs, tools & techniques, and outputs for this process.
A proactive, process-focused practice that builds quality into the work, verifies methods and standards are followed, and drives continual improvement so deliverables meet requirements and acceptance criteria.
Purpose & When to Use
- Build quality into processes to prevent defects rather than relying on inspection to find them later.
- Confirm teams are using agreed standards, methods, and tools to achieve consistent outcomes.
- Continuously improve ways of working using data, audits, and feedback.
- Align deliverables with acceptance criteria and compliance needs before formal verification and validation.
- Use throughout delivery, with early emphasis during planning and design, and ongoing checks during execution.
- Applies to predictive, agile, and hybrid approaches (e.g., Definition of Done, peer reviews, test automation).
Mini Flow (How It’s Done)
- Review the quality management plan, standards, and metrics; confirm acceptance criteria and definition of done.
- Tailor quality methods (e.g., checklists, peer reviews, test strategy, design-for-quality) to the project context.
- Set up measurement systems and data collection (dashboards, control charts, defect logs, test coverage); see the control-limit sketch after this list.
- Embed prevention practices: requirements traceability, test-first approaches, pairing, WIP limits, automation.
- Run quality audits and process reviews to verify adherence to agreed methods and compliance obligations.
- Analyze results with root cause analysis, Pareto, trend, and capability analysis; prioritize improvements (see the Pareto sketch after this list).
- Recommend and implement preventive and corrective changes; raise change requests where needed.
- Coordinate with suppliers to align on standards, acceptance criteria, and quality responsibilities.
- Update risk responses for emerging quality risks and update the lessons and improvements backlog.
- Produce quality reports for stakeholders and adapt the plan based on evidence and feedback.
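The measurement step above can be made concrete with a simple control-limit calculation. Below is a minimal sketch, assuming an individuals-style chart (mean ± 3 standard deviations) over a made-up baseline of defects per iteration; the figures and names are illustrative, not part of a prescribed method.

```python
# Minimal control-limit sketch (assumed data; not a prescribed method).
from statistics import mean, stdev

baseline = [4, 6, 5, 7, 5, 6, 5, 6]  # hypothetical defects per iteration (stable history)
latest = 14                           # most recent iteration's defect count

centre = mean(baseline)
sigma = stdev(baseline)
ucl = centre + 3 * sigma              # upper control limit
lcl = max(0.0, centre - 3 * sigma)    # lower control limit (counts cannot be negative)

print(f"centre = {centre:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")
if latest > ucl or latest < lcl:
    print(f"latest count {latest} is outside the limits: investigate the process")
```

A point outside the limits is a signal to analyze the process, not simply to add more inspection.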
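The analysis step can likewise be sketched as a Pareto count: group defects by category and keep the "vital few" categories that account for roughly 80% of the total. The defect records and the 80% cutoff below are assumptions for illustration only.

```python
# Pareto sketch over a hypothetical defect log (categories and counts are made up).
from collections import Counter

defects = [
    {"id": 1, "category": "requirements ambiguity"},
    {"id": 2, "category": "integration error"},
    {"id": 3, "category": "requirements ambiguity"},
    {"id": 4, "category": "configuration drift"},
    {"id": 5, "category": "requirements ambiguity"},
    {"id": 6, "category": "integration error"},
]

def pareto(defect_log, cutoff=0.80):
    """Return categories, largest first, until ~cutoff of all defects are covered."""
    counts = Counter(d["category"] for d in defect_log)
    total = sum(counts.values())
    vital_few, covered = [], 0
    for category, count in counts.most_common():
        vital_few.append((category, count))
        covered += count
        if covered / total >= cutoff:
            break
    return vital_few

for category, count in pareto(defects):
    print(f"{category}: {count} defect(s)")
```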
Quality & Acceptance Checklist
- Quality objectives, metrics, and acceptance criteria are clear, measurable, and traceable to requirements.
- Standards, methods, and templates are defined, tailored, and communicated to all contributors.
- Test strategy covers unit, integration, system, and user acceptance testing, as well as nonfunctional needs (performance, security, usability).
- Peer reviews or walkthroughs are scheduled for requirements, designs, code, and key deliverables.
- Automation is planned where valuable (build, test, deployment, compliance checks).
- Measurement system is reliable and calibrated; definitions of metrics are consistent.
- Defect management process defines severity, priority, SLAs, and escalation paths (see the triage sketch after this checklist).
- Risk-based testing and sampling focus effort on high-impact and high-probability areas.
- Traceability matrix links requirements to tests and delivered work items (see the coverage-check sketch after this checklist).
- Nonconformances have documented containment, root cause, and preventive actions.
- Supplier quality criteria, handoff checks, and acceptance responsibilities are agreed.
- Compliance and regulatory needs are integrated into work methods and evidence capture.
- Configuration management controls versions, baselines, and change history.
- Definition of Done and acceptance criteria are used in planning, reviews, and completion decisions.
- Quality reports are available and understood by stakeholders; decisions use current data.
- Continuous improvement items are prioritized and regularly implemented.
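As a concrete illustration of the defect-management item above, the sketch below maps severity and priority to a response SLA and an escalation path. All categories, hours, and roles are invented for the example; a real project would agree these in the quality management plan.

```python
# Defect triage sketch; severities, SLAs, and escalation roles are assumptions.
from dataclasses import dataclass

@dataclass
class DefectPolicy:
    response_hours: int   # time allowed before work on the defect must start
    escalate_to: str      # who is informed if the SLA is breached

POLICIES = {
    ("critical", "high"): DefectPolicy(4, "sponsor"),
    ("critical", "low"):  DefectPolicy(24, "project manager"),
    ("minor", "high"):    DefectPolicy(48, "team lead"),
    ("minor", "low"):     DefectPolicy(120, "team lead"),
}

def triage(severity: str, priority: str) -> DefectPolicy:
    """Look up the agreed response SLA and escalation path for a defect."""
    return POLICIES[(severity, priority)]

print(triage("critical", "high"))
```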
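The traceability item can also be checked mechanically. Here is a minimal sketch, assuming the matrix is kept as a simple requirement-to-tests mapping; the identifiers are made up.

```python
# Traceability coverage check over a hypothetical requirement-to-tests mapping.
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],   # not yet covered by any test
}

def coverage_gaps(matrix):
    """Return requirements that have no linked test cases."""
    return [req for req, tests in matrix.items() if not tests]

gaps = coverage_gaps(traceability)
if gaps:
    print("Requirements without test coverage:", ", ".join(gaps))
else:
    print("Every requirement traces to at least one test.")
```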
Common Mistakes & Exam Traps
- Confusing Manage Quality (process-focused prevention) with Control Quality (product inspection and measurement).
- Doing audits only at the end; prevention must start early and run continuously.
- Adding more testing instead of fixing the process causing defects.
- Over-documenting without verifying that methods are actually used and effective.
- Ignoring metrics quality; poor data leads to poor decisions.
- Treating audits as punitive rather than collaborative improvement activities.
- Skipping stakeholder and supplier involvement in defining standards and acceptance criteria.
- Not tailoring quality practices to project risk, complexity, and delivery approach.
- Forgetting cost of quality; prevention is usually cheaper than rework and failure (see the worked cost figures after this list).
- Assuming “passed tests” equals acceptance; customer acceptance requires meeting explicit criteria.
- Neglecting configuration control, causing defects from uncontrolled changes.
- Failing to update risks and lessons learned based on quality findings.
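To put figures on the cost-of-quality point above, the sketch below compares conformance costs (prevention and appraisal) with nonconformance costs (internal and external failure). Every number is a made-up illustration, not a benchmark.

```python
# Cost-of-quality comparison with illustrative (assumed) figures for one release.
prevention       = 8_000    # training, peer reviews, test automation
appraisal        = 5_000    # inspections, audits, test execution
internal_failure = 20_000   # rework and retesting found before release
external_failure = 45_000   # warranty, support, and reputation costs after release

cost_of_conformance    = prevention + appraisal
cost_of_nonconformance = internal_failure + external_failure

print(f"Cost of conformance:    {cost_of_conformance:,}")
print(f"Cost of nonconformance: {cost_of_nonconformance:,}")
# Spending a little more on prevention usually shrinks the far larger failure
# costs, which is why 'prevention over inspection' pays off.
```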
PMP Example Question
Midway through delivery, defect trends are rising. The sponsor suggests adding more testers. What should the project manager do next?
- A. Reassign more team members to perform additional inspections immediately.
- B. Update the quality metrics to higher thresholds so fewer defects are reported.
- C. Conduct a quality audit and process analysis, address root causes, and implement preventive changes per the plan.
- D. Notify the customer about potential delays and wait for their guidance.
Correct Answer: C — Conduct a quality audit and process analysis, address root causes, and implement preventive changes per the plan.
Explanation: Manage Quality focuses on prevention and improving the process. Auditing and fixing causes is better than adding inspection or changing metrics.