Manage Quality Assurance


A proactive, process-focused practice that builds quality into the work, verifies methods and standards are followed, and drives continual improvement so deliverables meet requirements and acceptance criteria.

Purpose & When to Use

  • Build quality into processes to prevent defects rather than relying on inspection to find them later.
  • Confirm teams are using agreed standards, methods, and tools to achieve consistent outcomes.
  • Continuously improve ways of working using data, audits, and feedback.
  • Align deliverables with acceptance criteria and compliance needs before formal verification and validation.
  • Use throughout delivery, with early emphasis during planning and design, and ongoing checks during execution.
  • Applies to predictive, agile, and hybrid approaches (e.g., Definition of Done, peer reviews, test automation).

Mini Flow (How It’s Done)

  • Review the quality management plan, standards, and metrics; confirm acceptance criteria and the Definition of Done.
  • Tailor quality methods (e.g., checklists, peer reviews, test strategy, design-for-quality) to the project context.
  • Set up measurement systems and data collection (dashboards, control charts, defect logs, test coverage).
  • Embed prevention practices: requirements traceability, test-first approaches, pairing, WIP limits, automation.
  • Run quality audits and process reviews to verify adherence to agreed methods and compliance obligations.
  • Analyze results with root cause analysis, Pareto analysis, and trend and capability analysis; prioritize improvements (a small Pareto sketch follows this list).
  • Recommend and implement preventive and corrective changes; raise change requests where needed.
  • Coordinate with suppliers to align on standards, acceptance criteria, and quality responsibilities.
  • Update risk responses for emerging quality risks, and capture lessons learned and improvement items in the backlog.
  • Produce quality reports for stakeholders and adapt the plan based on evidence and feedback.
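
As a minimal illustration of the Pareto step above, the sketch below ranks defect causes from a hypothetical defect log and reports each cause's cumulative share, so the "vital few" causes stand out. The categories and counts are invented for the example.

```python
from collections import Counter

# Hypothetical defect log: each entry records the suspected root-cause category of one defect.
defect_log = [
    "unclear requirements", "unclear requirements", "missing unit test",
    "environment drift", "unclear requirements", "missing unit test",
    "handoff error", "unclear requirements", "environment drift",
    "missing unit test", "unclear requirements", "handoff error",
]

counts = Counter(defect_log)
total = sum(counts.values())

# Rank causes by frequency and print the cumulative percentage of defects they explain.
print(f"{'Cause':<22}{'Count':>6}{'Cum %':>8}")
cumulative = 0
for cause, count in counts.most_common():
    cumulative += count
    print(f"{cause:<22}{count:>6}{cumulative / total:>8.0%}")
```

Causes at the top of the ranking become candidates for preventive process changes rather than extra inspection.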

Quality & Acceptance Checklist

  • Quality objectives, metrics, and acceptance criteria are clear, measurable, and traceable to requirements.
  • Standards, methods, and templates are defined, tailored, and communicated to all contributors.
  • Test strategy covers unit, integration, system, and user acceptance testing, plus nonfunctional needs (performance, security, usability).
  • Peer reviews or walkthroughs are scheduled for requirements, designs, code, and key deliverables.
  • Automation is planned where valuable (build, test, deployment, compliance checks).
  • Measurement system is reliable and calibrated; definitions of metrics are consistent.
  • Defect management process defines severity, priority, SLAs, and escalation paths.
  • Risk-based testing and sampling focus effort on high-impact and high-probability areas.
  • Traceability matrix links requirements to tests and delivered work items (a minimal example follows this checklist).
  • Nonconformances have documented containment, root cause, and preventive actions.
  • Supplier quality criteria, handoff checks, and acceptance responsibilities are agreed.
  • Compliance and regulatory needs are integrated into work methods and evidence capture.
  • Configuration management controls versions, baselines, and change history.
  • Definition of Done and acceptance criteria are used in planning, reviews, and completion decisions.
  • Quality reports are available and understood by stakeholders; decisions use current data.
  • Continuous improvement items are prioritized and regularly implemented.
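
The traceability item above can be kept as simple structured data. Here is a minimal sketch with hypothetical requirement and test IDs, flagging requirements that have no verifying test and those whose linked tests are failing.

```python
# Hypothetical requirement-to-test links; IDs and results are illustrative only.
trace_matrix = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],            # coverage gap: no verifying test yet
    "REQ-004": ["TC-105"],
}

test_results = {"TC-101": "pass", "TC-102": "fail", "TC-103": "pass", "TC-105": "pass"}

# A requirement counts as verified only when it has at least one linked test and all of them pass.
for req, tests in trace_matrix.items():
    if not tests:
        status = "NOT COVERED"
    elif all(test_results.get(t) == "pass" for t in tests):
        status = "verified"
    else:
        status = "failing"
    print(f"{req}: {status} (tests: {', '.join(tests) or 'none'})")
```

The same structure scales up in a spreadsheet or test-management tool; the point is that every requirement maps to at least one test and every test maps back to a requirement.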

Common Mistakes & Exam Traps

  • Confusing Manage Quality (process-focused prevention) with Control Quality (product inspection and measurement).
  • Doing audits only at the end; prevention must start early and run continuously.
  • Adding more testing instead of fixing the process causing defects.
  • Over-documenting without verifying that methods are actually used and effective.
  • Ignoring metrics quality; poor data leads to poor decisions.
  • Treating audits as punitive rather than collaborative improvement activities.
  • Skipping stakeholder and supplier involvement in defining standards and acceptance criteria.
  • Not tailoring quality practices to project risk, complexity, and delivery approach.
  • Forgetting the cost of quality; prevention is usually cheaper than rework and failure (a worked example follows this list).
  • Assuming “passed tests” equals acceptance; customer acceptance requires meeting explicit criteria.
  • Neglecting configuration control, causing defects from uncontrolled changes.
  • Failing to update risks and lessons learned based on quality findings.
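
To make the cost-of-quality point concrete, here is a back-of-the-envelope comparison with invented figures; the categories follow the usual split between conformance (prevention and appraisal) and nonconformance (internal and external failure) costs.

```python
# Illustrative cost-of-quality figures (all amounts invented for the example).
prevention = 20_000        # training, standards, design reviews, automation
appraisal = 35_000         # testing, audits, inspections
internal_failure = 60_000  # rework, retesting, scrap found before delivery
external_failure = 45_000  # defects found by the customer: escalations, warranty, goodwill

cost_of_conformance = prevention + appraisal
cost_of_nonconformance = internal_failure + external_failure

print(f"Cost of conformance:    {cost_of_conformance:>8,}")
print(f"Cost of nonconformance: {cost_of_nonconformance:>8,}")
print(f"Total cost of quality:  {cost_of_conformance + cost_of_nonconformance:>8,}")

# If an extra 10,000 of prevention eliminated 40,000 of failure cost,
# the net saving would be 30,000, which is the usual argument for prevention over rework.
```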

PMP Example Question

Midway through delivery, defect trends are rising. The sponsor suggests adding more testers. What should the project manager do next?

  A. Reassign more team members to perform additional inspections immediately.
  B. Update the quality metrics to higher thresholds so fewer defects are reported.
  C. Conduct a quality audit and process analysis, address root causes, and implement preventive changes per the plan.
  D. Notify the customer about potential delays and wait for their guidance.

Correct Answer: C — Conduct a quality audit and process analysis, address root causes, and implement preventive changes per the plan.

Explanation: Manage Quality focuses on prevention and process improvement. Auditing the process and addressing root causes fixes the source of the rising defect trend, whereas adding more inspection or relaxing metrics only treats the symptoms.
