Multicriteria decision analysis

Multicriteria decision analysis (MCDA) is a structured technique for comparing alternatives against multiple weighted criteria to reach a transparent, balanced decision. It converts qualitative and quantitative inputs into comparable scores that support stakeholder-aligned choices.

Key Points

  • Evaluates options using several criteria that reflect project objectives and constraints.
  • Uses agreed criteria definitions, weights, and scoring scales to ensure consistency.
  • Combines qualitative and quantitative data by normalizing scores before aggregation.
  • Improves transparency and stakeholder buy-in by making assumptions and weights explicit.
  • Includes sensitivity analysis to test how changes in weights or scores affect rankings.
  • Produces a ranked list and a clear rationale that can be recorded in the decision log.
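The normalization mentioned in the key points can be done several ways; a common choice is min-max scaling. The sketch below is illustrative rather than taken from any specific MCDA tool, and the benefit-versus-cost flag is an assumption about how each criterion is defined.

```python
def min_max_normalize(values, lower_is_better=False):
    """Scale raw criterion values to a comparable 0-1 range.

    For cost-type criteria (where lower is better), the scale is
    inverted so that 1.0 always means "best" on every criterion.
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)  # all alternatives tie on this criterion
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1.0 - s for s in scaled] if lower_is_better else scaled
```

For example, annual license costs of 200, 350, and 500 normalize to 1.0, 0.5, and 0.0, making the cheapest option score highest on the cost criterion.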

Purpose of Analysis

To select the most suitable option when multiple factors matter and trade-offs are necessary. It helps teams justify choices such as vendors, solutions, features, locations, risk responses, or make-or-buy decisions by showing how each alternative performs against the criteria that matter most.

Method Steps

  • Define the decision statement, objective, and any constraints or thresholds.
  • List realistic alternatives to evaluate, including the status quo if relevant.
  • Identify and clearly define evaluation criteria aligned to objectives.
  • Assign weights to criteria to reflect relative importance, ensuring weights sum to 100 percent.
  • Select or design scoring scales for each criterion and define what each score value means.
  • Gather data and expert judgments needed to score each alternative per criterion.
  • Score each alternative consistently, documenting assumptions and sources.
  • Normalize scores if scales differ, then calculate weighted totals for each alternative.
  • Review results with stakeholders, check for biases, and validate the logic.
  • Perform sensitivity analysis on weights and key scores to test result stability.
  • Decide, document the rationale, and update the decision register.
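The scoring and aggregation steps above can be sketched as a small function. The data shapes and the 1-5 scale are illustrative assumptions; a real template would carry the agreed criteria definitions and sources alongside the numbers.

```python
def weighted_totals(scores, weights, scale_max=5):
    """Compute a weighted total per alternative from per-criterion scores.

    scores:  {alternative: {criterion: raw score on a 1..scale_max scale}}
    weights: {criterion: weight}; weights should sum to 1.0 (100 percent)
    """
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("criteria weights must sum to 100 percent")
    return {
        alt: sum(weights[c] * per_crit[c] / scale_max for c in weights)
        for alt, per_crit in scores.items()
    }
```

Dividing each score by `scale_max` normalizes the common 1-5 scale to 0-1, so totals stay comparable even if one criterion later switches to a different scale.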

Inputs Needed

  • Decision statement, objectives, and success criteria.
  • Defined list of alternatives to consider.
  • Evaluation criteria with clear definitions and measurement approaches.
  • Criteria weights and justification for their relative importance.
  • Scoring scales and normalization or utility functions, if used.
  • Data sources, estimates, and expert judgments for scoring.
  • Assumptions, constraints, and acceptance thresholds.
  • Stakeholder preferences and any policy or regulatory requirements.
  • Tool or template for recording scores and calculations.

Outputs Produced

  • Ranked list of alternatives with total weighted scores.
  • Per-criterion score breakdown for each alternative.
  • Selected option with documented rationale and decision conditions.
  • Sensitivity analysis results and notes on stability of the outcome.
  • Assumptions, data sources, and issues log for traceability.
  • Updates to the decision register and related plans, as needed.

Interpretation Tips

  • Do not over-interpret small score differences; check if they are within estimation error.
  • Examine weightings for bias and validate they reflect stakeholder priorities.
  • Ensure criteria are independent to avoid double-counting similar factors.
  • Look for score patterns across criteria to understand strengths and weaknesses.
  • Use sensitivity analysis to see if the ranking holds under plausible changes.
  • Combine quantitative results with informed judgment and risk considerations.
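The sensitivity check suggested above can be sketched as follows: shift one criterion's weight by a few plausible deltas, rescale the weights to sum to 1, and see whether the top-ranked alternative changes. The function and data shapes are hypothetical, mirroring the simple weighted model used throughout.

```python
def top_choice_under_weight_shifts(scores, weights, criterion, deltas,
                                   scale_max=5):
    """Report the top-ranked alternative as one criterion's weight shifts.

    Each delta is added to `criterion`'s weight, then all weights are
    rescaled to sum to 1 before totals are recomputed. A ranking that
    flips under small deltas should be treated as unstable.
    """
    winners = {}
    for d in deltas:
        w = dict(weights)
        w[criterion] = max(0.0, w[criterion] + d)
        total = sum(w.values())
        w = {c: v / total for c, v in w.items()}
        totals = {
            alt: sum(w[c] * sc[c] / scale_max for c in w)
            for alt, sc in scores.items()
        }
        winners[d] = max(totals, key=totals.get)
    return winners
```

If the same alternative wins across all tested deltas, the ranking can be reported as stable; if not, the crossover points are worth discussing with stakeholders before deciding.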

Example

A team must choose a project collaboration tool. Criteria include cost, usability, security, integration, and support. Weights are set as cost 20 percent, usability 25 percent, security 25 percent, integration 20 percent, support 10 percent. Three tools are scored on a 1–5 scale per criterion, normalized if needed, and weighted totals are calculated. The tool with the highest stable score after sensitivity checks is recommended, with notes explaining why it leads under key criteria.
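The calculation in this example can be reproduced in a few lines. The weights come from the paragraph above; the three tools' 1-5 scores are hypothetical, invented only to show the arithmetic.

```python
weights = {"cost": 0.20, "usability": 0.25, "security": 0.25,
           "integration": 0.20, "support": 0.10}

# Hypothetical 1-5 scores per tool (not from a real evaluation).
scores = {
    "Tool A": {"cost": 4, "usability": 3, "security": 5,
               "integration": 4, "support": 3},
    "Tool B": {"cost": 3, "usability": 5, "security": 4,
               "integration": 4, "support": 4},
    "Tool C": {"cost": 5, "usability": 3, "security": 3,
               "integration": 4, "support": 5},
}

totals = {
    tool: sum(weights[c] * s[c] / 5 for c in weights)  # normalize 1-5 to 0-1
    for tool, s in scores.items()
}
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

With these invented scores, Tool B ranks first at 0.81, ahead of Tool A at 0.78, a gap small enough that a sensitivity check on the usability and security weights would be warranted before recommending Tool B.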

Pitfalls

  • Using vague criteria or undefined scoring scales, leading to inconsistent scoring.
  • Choosing weights without stakeholder input, creating bias and low buy-in.
  • Mixing incomparable data without normalization, distorting results.
  • Including overlapping criteria that double-count the same benefit or risk.
  • Ignoring uncertainty and not testing sensitivity to key assumptions.
  • Letting the tool or template drive the decision instead of the objectives.

PMP Example Question

A project manager must select a vendor using multiple factors such as cost, quality, delivery time, and risk. Stakeholders disagree on which factor matters most. What should the project manager do first to ensure a fair multicriteria decision analysis?

  A. Score each vendor using the team’s initial impressions.
  B. Assign equal weights to all criteria to avoid conflict.
  C. Facilitate agreement on criteria definitions and weights with key stakeholders.
  D. Pick the lowest-cost vendor and document the rationale.

Correct Answer: C — Facilitate agreement on criteria definitions and weights with key stakeholders.

Explanation: MCDA depends on clear, agreed criteria and weightings before scoring. Establishing these with stakeholders improves consistency and buy-in.

Agile Project Management & Scrum — With AI

Ship value sooner, cut busywork, and lead with confidence. Whether you’re new to Agile or scaling multiple teams, this course gives you a practical system to plan smarter, execute faster, and keep stakeholders aligned.

This isn’t theory—it’s a hands-on playbook for modern delivery. You’ll master Scrum roles, events, and artifacts; turn vision into a living roadmap; and use AI to refine backlogs, write clear user stories and acceptance criteria, forecast with velocity, and automate status updates and reports.

You’ll learn estimation, capacity and release planning, quality and risk management (including risk burndown), and Agile-friendly EVM—plus how to scale with Scrum of Scrums, LeSS, SAFe, and more. Downloadable templates and ready-to-use GPT prompts help you apply everything immediately.

Learn proven patterns from real projects and adopt workflows that reduce meetings, improve visibility, and boost throughput. Ready to level up your delivery and lead in the AI era? Enroll now and start building smarter sprints.



Launch your career!

HK School of Management provides world-class training in Project Management, Lean Six Sigma, and Agile Methodologies. For the price of a lunch, you can transform your career and reach new heights. With a 30-day money-back guarantee, there is no risk.

Learn More