Bohanec, Marko – (DEXi 5.1) – 2015

01a – DEXi: A Program for Multi-Attribute Decision Making Version 5.01

Purpose

DEXi is a computer program for multi-attribute decision making. It is aimed at interactive development of qualitative multi-attribute decision models and the evaluation of options. This is useful for supporting complex decision-making tasks, where there is a need to select a particular option from a set of possible ones so as to satisfy the goals of the decision maker. A multi-attribute model is a hierarchical structure that represents the decomposition of the decision problem into subproblems, which are smaller, less complex and possibly easier to solve than the complete problem.

Further information on DEXi:

Functionality
Screenshots
Documentation
Development and history
Typical applications

Download

DEXi is implemented in Delphi and runs on Microsoft Windows platforms. It can be used free of charge.

The latest DEXi version is 5.01 and is available in two languages:

Slovene: DEXi501si_setup.exe
English: DEXi501en_setup.exe

Related software

  • DEX is the predecessor of DEXi.
  • JDEXi is an open-source Java library implementing: parsing of DEXi models and evaluation of options.
  • DEXiTree: a program for pretty drawing of DEXi trees.
  • DEXiEval: a command-line utility program for batch evaluation of options using a DEXi model.

01b – DEXi Functionality

DEXi supports two basic tasks:

  1. the development of qualitative multi-attribute models;
  2. the application of models for the evaluation and analysis of options.

The models are developed by defining:

  • attributes: qualitative variables that represent decision subproblems,
  • scales: ordered or unordered sets of symbolic values that can be assigned to attributes,
  • tree of attributes: a hierarchical structure representing the decomposition of the decision problem,
  • utility functions: rules that define the aggregation of attributes from bottom to the top of the tree of attributes.

In the evaluation and analysis stage, DEXi facilitates:

  • description of options: defining the values of basic attributes (terminal nodes of the tree),
  • evaluation of options: a bottom up aggregation of option values based on utility functions,
  • analysis of options: what-if analysis, “plus-minus-1” analysis, selective explanation and comparison of options,
  • reporting: graphical and textual presentation of models, options and evaluation results.

DEXi differs from most conventional multi-attribute decision modeling tools in that it uses qualitative (symbolic) attributes instead of quantitative (numeric) ones. Also, aggregation (utility) functions in DEXi are defined by if-then decision rules rather than numerically by weights or some other kind of formula. (However, DEXi does support weights indirectly.)
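
To make the rule-based aggregation concrete, here is a minimal sketch, not DEXi code and not tied to any actual DEXi model, of a qualitative utility function represented as a lookup table: the value of an aggregate attribute CAR is read off from if-then rules over its child attributes PRICE and SAFETY. The attribute names and rules are hypothetical, loosely inspired by the car-selection demo (Car.dxi) mentioned later.

import java.util.HashMap;
import java.util.Map;

// A minimal illustration (not DEXi code) of a qualitative utility function
// defined by if-then decision rules: the aggregate value of CAR is looked up
// from the values of its two child attributes, PRICE and SAFETY (hypothetical).
public class RuleTableDemo {
    public static void main(String[] args) {
        Map<String, String> carRules = new HashMap<>();
        // if PRICE = high and SAFETY = low then CAR = unacc, and so on
        carRules.put("high,low", "unacc");
        carRules.put("high,high", "acc");
        carRules.put("low,low", "acc");
        carRules.put("low,high", "good");

        // Bottom-up aggregation: the child values determine the parent value.
        String price = "low";
        String safety = "high";
        String car = carRules.get(price + "," + safety);
        System.out.println("CAR = " + car);   // prints: CAR = good
    }
}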

In comparison with its predecessor DEX, DEXi has a more modern and more convenient user interface. It also has better graphical and reporting capabilities, and facilitates the use of weights to represent and assess qualitative utility functions. On the other hand, DEXi is somewhat less powerful than DEX in dealing with incomplete option descriptions: DEX employs probabilistic and fuzzy distributions of values, while DEXi supports only crisp or unknown option values.

01c – DEXi Screenshots

Editing a decision model

Model Editing

Editing a qualitative attribute scale

Scale Editing

Defining decision rules

Rules Editing

Editing option descriptions

Options Editing

Option evaluation and analysis

Option Evaluation

Displaying charts

Charts

Report preview

Report Preview

01d – DEXi Documentation and Publications

Both DEXi installation packages, Slovene and English, include an English help file.

Documentation in Slovene

An early DEXi User’s Manual is available as:

Jereb, E., Bohanec, M., Rajkovič, V.: DEXi: Računalniški program za večparametrsko odločanje, Moderna organizacija, Kranj, 2003.

Further information on decision analysis, multi-attribute modeling, fundamental DEXi concepts and underlying methods is available in:

Bohanec, M.: Odločanje in modeli. DMFA – založništvo, 1. ponatis, 2012. [O knjigi…]

Documentation in English

The English help file, which is distributed with the installation, is up-to-date and describes DEXi version 4.01.

The DEXi 5.00 User’s Manual in English is available as:

Bohanec, M.: DEXi: Program for Multi-Attribute Decision Making, User’s Manual, Version 5.00. IJS Report DP-11897, Jožef Stefan Institute, Ljubljana, 2015.
[Also available: a printer-friendly version without hyperlinks.]

Selected Publications

Bohanec, M., Rajkovič, V.: Večparametrski odločitveni modeli. Organizacija 28(7), 427-438, 1995.

Bohanec, M., Rajkovič, V.: Multi-attribute decision modeling: Industrial applications of DEX. Informatica 23, 487-491, 1999.

Bohanec, M., Zupan, B., Rajkovič, V.: Applications of qualitative multi-attribute decision models in health care, International Journal of Medical Informatics 58-59, 191-205, 2000.

Cestnik, B., Bohanec, M.: Decision support in housing loan allocation: A case study, IDDM-2001: ECML/PKDD-2001 Workshop Integrating Aspects of Data Mining, Decision Support and Meta-Learning: Positions, Developments and Future Directions (eds. Giraud-Carrier, C., Lavrač, N., Moyle, S., Kavšek, B.), Freiburg, 21-30, 2001.

Mladenić, D., Lavrač, N., Bohanec, M., Moyle, S. (eds.): Data mining and decision support: Integration and collaboration. Kluwer Academic Publishers, 2003. Chapters:

  • Bohanec, M.: Decision support. 23-35.
  • Bohanec, M., Rajkovič, V., Cestnik, B.: Five decision support applications. 177-189.
  • Moyle, S., Bohanec, M., Ostrowski, E.: Large and tall buildings: A case study in the application of decision support and data mining. 191-202.

Moyle, S., Ostrowski, E., Bohanec, M.: Knowledge development using data mining: A specific application in the construction industry. Leveraging corporate knowledge (ed. Truch, E.), Gower, 181-197, 2004.

Vintar, M., Grad, J. (ur.): E-uprava: Izbrane razvojne perspektive, Univerza v Ljubljani, Fakulteta za upravo, 2004.:

  • Leben, A., Bohanec, M.: Vrednotenje portalov življenjskih situacij, 123-140.
  • Bohanec, M.: Odločanje in večparametrsko modeliranje, 205-219.

Bohanec, M., Džeroski, S., Žnidaršič, M., Messéan, A., Scatasta, S., Wesseler, J.: Multi-attribute modeling of economic and ecological impacts of cropping systems, Informatica 28, 387-392, 2004.

Leben, A., Kunstelj, M., Bohanec, M., Vintar, M.: Evaluating public administration e-portals. Information Polity 21(3/4), 207-225, 2006.

Bohanec, M., Messéan, A., Angevin, F., Žnidaršič, M.: SMAC Advisor: A decision-support tool on coexistence of genetically-modified and conventional maize. Proc. Information Society IS 2006, Ljubljana, 9-12, 2006

Verdev, M., Bohanec, M., Džeroski, S.: Decision support for a waste electrical and electronic equipment treatment system. Proc. Information Society IS 2006, Ljubljana, 89-92, 2006.

Taškova, K., Stojanova, D., Bohanec, M., Džeroski, S.: A qualitative decision-support model for evaluating researchers. Informatica 31(4), 479-486, 2007.

Omerčević, D., Zupančič, M., Bohanec, M., Kastelic, T.: Intelligent response to highway traffic situations and road incidents. Proc. TRA 2008, Transport Research Arena Europe 2008, 21-24 April 2008, Ljubljana, Slovenia (ed. A. Žnidarič). Ljubljana: DDC svetovanje inženiring: ZAG, Zavod za gradbeništvo Slovenije: DRC, Družba v cestni in prometni stroki Slovenije, 1-6, 2008.

Žnidaršič, M., Bohanec, M., Kok, E.J., Prins, T.W.: Qualitative risk assessment for adventitious presence of unauthorized genetically modified organisms. Proceedings of ISIT 2009, 1st International Conference on Information Society and Information Technologies, Novo mesto: Faculty of information studies. 12.-13.10.2009, Dolenjske Toplice, 7 p., 2009.

Žnidaršič, M., Bohanec, M., Lavrač, N., Cestnik, B.: Project self-evaluation methodology: The Healthreats project case study. Proc. Information Society IS 2009, Ljubljana, 85-88, 2009.

Bohanec, M., Žnidaršič, M.: Izkušnje z večparametrskimi odločitvenimi modeli pri podpori odločanja o gensko spremenjenih organizmih. DAES 2010: Sodobni izzivi menedžmenta v agroživilstvu (ur. Č. Rozman, S. Kavčič), Pivola, 18.-19.3.2010, 29-37, 2010.

Marinič, S., Bohanec, M.: Večparametrsko vrednotenje variant v odvisnosti od konteksta: Model za vrednotenje strešnih kritin. Proceedings of the 15th International Conference Information Society IS 2012, 8.-12.10.2012, Ljubljana, 76-79, 2012.

Bohanec, M., Rajkovič, V., Bratko, I., Zupan, B., Žnidaršič, M.: DEX methodology: Three decades of qualitative multi-attribute modelling. Informatica 37, 49-54, 2013.

Alić, I., Siering, M., Bohanec, M.: Hot stock or not? A qualitative multi-attribute model to detect financial market manipulation. eInnovation: Challenges and impacts for individuals, organizations and society, Proceedings of 26th Bled eConference (ed. D.L.Wigand), June 9-13, 2013, Bled, Slovenia, Kranj: Moderna organizacija, 64-77, 2013.

Trdin, N., Bohanec, M., Janža, M.: Decision support system for management of water sources. Proceedings of the 16th International Conference Information Society IS 2013, 7.-11.10.2013, Ljubljana, 118-121, 2013.

Bohanec, M., Aprile, G., Costante, M., Foti, M., Trdin, N.: A hierarchical multi-attribute model for bank reputational risk assessment. DSS 2.0 — Supporting Decision Making with New Technologies (eds. Phillips-Wren, G., Carlsson, S., Respício, A., Brézillon, P.), Amsterdam: IOS Press, ISBN 978-1-61499-398-8, 92-103, 2014.

Mileva Boshkoska, B., Bohanec, M., Boškoski, P., Juričić, Đ.: Copula-based decision support system for quality ranking in the manufacturing of electronically commutated motors. Journal of Intelligent Manufacturing 26, 281-293, 2015.

Bohanec, M., Delibašić, B.: Data-mining and expert models for predicting injury risk in ski resorts. Decision Support Systems V – Big Data Analytics for Decision Making, First International Conference ICDSST 2015, Belgrade, Serbia, May 27-29, 2015, Springer, 46-60, 2015.

01e – DEXi Applications

DEXi is particularly suitable for solving complex decision problems, which typically involve:

  • many (say, 15 or more) attributes,
  • many options (10 or more),
  • judgment that predominantly requires qualitative reasoning rather than numerical evaluation,
  • inaccurate and/or missing data,
  • group decision making, which requires communication and explanation.

For successful application, DEXi requires sufficient resources, particularly expertise and time for developing a DEXi model.

Some typical application areas and decision problems, in which DEX and DEXi have been used so far, are the following:

  1. Information technology
    • evaluation of computers
    • evaluation of software
    • evaluation of Web portals
  2. Projects
    • evaluation of projects
    • evaluation of proposals and investments
    • product portfolio evaluation
  3. Companies
    • business partner selection
    • performance evaluation of companies
  4. Personnel Management
    • personnel evaluation
    • selection and composition of expert groups
    • evaluation of job applications
  5. Medicine and Health-Care
    • risk assessment
    • diagnosis and prognosis
  6. Other Areas
    • assessment of technologies
    • assessments in ecology and environment
    • granting personal/corporate loans

02 – JDEXi: Open-source DEXi Java Library Version 3.0

Purpose

JDEXi3.zip contains a library of open-source Java classes that implement the evaluation of decision alternatives based on qualitative multi-attribute models produced by DEXi software.

JDEXi (version 3) supports:

  • parsing and reading DEXi models from .dxi files or strings (XML format) [constructor Model()]
  • obtaining information about model attributes and attribute scales [methods getAttribute*(), getScale*(), …]
  • obtaining information about utility functions and decision rules [methods getRule*(), rule*(), function*(), getFunctionString(), …]
  • clearing and setting model input values [methods setInputValue(s), …]
  • carrying out the evaluation [methods evaluate(), …]
  • obtaining evaluation results [methods getOutputValue(s), …]
  • modification of decision rules [method getFunctionString()]

JDEXi3 supports only a fairly limited modification of decision rules. DEXi software should be used for any more extensive modification of models.
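
The snippet below is a hedged sketch of how the features listed above might be exercised from Java. The exact class names, package and method signatures are assumptions, so the corresponding calls are shown only as comments; consult the javadoc in doc/ and the test sources in src/test/ for the real API.

import java.nio.file.Files;
import java.nio.file.Paths;

public class JDexiSketch {
    public static void main(String[] args) throws Exception {
        // Read the DEXi model (XML format) from a .dxi file into a string.
        String xml = new String(Files.readAllBytes(Paths.get("Car.dxi")));

        // Hypothetical calls mirroring the feature list above (signatures assumed):
        // Model model = new Model(xml);                     // parse the model
        // model.setInputValue("BUY.PRICE", "low");          // set input values
        // model.setInputValue("SAFETY", "high");
        // model.evaluate();                                 // bottom-up evaluation
        // System.out.println(model.getOutputValue("CAR"));  // read the result

        System.out.println("Loaded model XML, " + xml.length() + " characters");
    }
}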


Authors: Marko Bohanec, Dušan Omerčević, Andrej Kogovšek

This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version.

This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

Contents of JDEXi3.zip

bin/ – contains compiled java files
src/ – contains java source files
doc/ – contains javadoc
JDEXi3.jar – JDEXi classes; see javadoc for the list of all classes
JDEXi3Eval.jar – runnable jar for running JDEXi (command-line)
TestJDEXi3.jar and DumpFile.jar – runnable jars for showing the use of JDEXi library (see the sources in src/test/)
Licence.txt
lesser.txt – GNU LESSER GENERAL PUBLIC LICENSE
Car.dxi – a sample DEXi file (car selection demo)
readme.html – this file

JDEXi3Eval.jar

Runnable jar for running the JDEXi application from the command line. It takes 2 mandatory parameters:

  1. DEXi_file_name: file containing a DEXi model (.dxi)
  2. Variables: a “;”-separated list of name=value pairs

Example

java -jar JDEXi3Eval.jar Car.dxi "BUY.PRICE=low;MAINT.PRICE=low;#PERS=more;#DOORS=more;LUGGAGE=big;SAFETY=high"

TestJDEXi3.jar

Loads a DEXi model, displays some of its elements (attributes, value scales, decision rules) and evaluates a random decision alternative.

java -jar TestJDEXi3.jar Car.dxi

DumpFunctions.jar

Loads a DEXi model and prints out all utility functions it contains.

java -jar DumpFunctions.jar Car.dxi

03 – DEXiTree: A Program for Pretty Drawing of DEXi Trees Version 0.94

Purpose

DEXiTree is a companion program to DEXi, aimed at making nice drawings of DEXi’s trees of attributes. Actually, DEXiTree is quite a general and powerful tree-drawing program that:

  • offers four different tree-drawing algorithms (called “Distribute”, “Align”, “Walker”, and “QP”);
  • draws trees in four different directions (Top-Down, Left-Right, Bottom-Up and Right-Left);
  • provides an extensive set of parameters for controlling the appearance of trees and their components.

Please see some DEXiTree screenshots.

Download

DEXiTree is implemented in Delphi and is available for Microsoft Windows. The latest version is 0.94 and is compatible with DEXi 4.01 and later.

Download DEXiTree Version 0.94: DEXiTree094.zip

No installation is required; just unpack the zip file and run DEXiTree.exe.

Usage

DEXiTree is typically used in the following steps:

  1. load (File/Open) a DEXi model from a .dxi file;
  2. interactively alter DEXiTree’s drawing parameters until you are satisfied with the drawing;
  3. save the drawing to a file (File/Save as…) or transfer it to other applications through the clipboard (Edit/Copy).

Drawings can be saved and/or transferred in two different graphic formats:

  • Windows Enhanced Metafile (.emf) [vector graphic format], or
  • Windows Bitmap (.bmp) [bitmap graphic format].

DEXiTree uses its own XML-based “DEXiTree file” format (.dxt) for representing the currently drawn tree of attributes and the corresponding drawing parameters. DEXiTree can both load (File/Open) and save (File/Save as…) such files. Loading can be selective, so that only the tree structure or only the drawing parameters are loaded from the file, leaving the other component intact.

See the file DEXiTree.txt for more detailed instructions and conditions of use.

If you use this software for any purpose, an acknowledgment/citation in your product and an informative message to the author would be appreciated.

Copyright © 2007-2015 Marko Bohanec. All rights reserved.

Notes

Editing of trees is not supported in DEXiTree. Use DEXi to modify the structure of trees.

DEXiTree has been provided in the hope that it will be useful, but without warranty of any kind.

Any feedback on DEXiTree will be greatly appreciated. Please send your comments, suggestions, bug reports, etc., to Marko Bohanec.

04 – DEXiEval: Command-Line Utility for Batch Evaluation of DEXi Options

Purpose

DEXiEval is a command-line utility program for batch evaluation of options (decision alternatives) using a DEXi model. Basically, DEXiEval reads a DEXi model from a DEXi file and loads option data from another input file. It evaluates these options and writes the evaluation results to output option data files. In a single run, DEXiEval can create several output files in different formats.

Download

DEXiEval is implemented in Delphi and is available for Microsoft Windows and Linux. The latest version is 4.0 and is compatible with DEXi 2.0, DEXi 3.0, DEXi 4.0 and possibly later.

Windows: DEXiEval40.zip
Linux: DEXiEval40.tgz

No installation is required; just unpack the archive and run DEXiEval.

Usage

See the file DEXiEval.txt for detailed instructions and conditions of use.

If you use this software for any purpose, an acknowledgment in your product and an informative message to the author would be appreciated.

Ramik, Jaroslav – Decision Analysis Module for Excel (DAME) – 2014

01 – DAME Tool – Group Decision Making and Evaluation Made Simple

Group evaluation techniques are many, and group decision-making is always challenging. There are whole software packages today, such as Expert Choice, that specialize in just that. But what if you don’t want to shell out thousands of dollars and still achieve a comparable result? Believe it or not, there is a way, and it is rather simple and virtually free. DAME stands for Decision Analysis Module for Excel, and it is quite a useful solution.

Take Expert Choice as an example. Within the space of two decades it grew from a niche decision-making application into a major group decision-making tool. It is based on AHP (Analytic Hierarchy Process), originally developed in Pittsburgh by Prof. Thomas Saaty. It is good, and you can certainly try it for free. However, for “production” purposes it will cost you (or your company) thousands of dollars a year.

Of course, the calculations can all be done “by hand” but, truly, who would have time for that? You’re looking for a tool, after all, to save time.

A small university team developed a simple but very useful Excel add-in that can do just that. Courtesy of Prof. Jaroslav Ramik, who was behind the project, it can be downloaded here. The tool is called DAME (Decision Analysis Module for Excel), and this article aims to show some of its very useful features. In other words, by using it you save not only time but money too.

So how does it work? The best start would be to consider a simple comparison of 4 products that was part of one of my previous articles. The products have the following prices:

A = 190, B = 230
C = 320, D = 290

It is the simplest evaluation task possible but even in this case hand calculations would still be quite time-consuming. You’re going to get exactly the same results (a final rank of evaluated alternatives) with DAME without using any calculations whatsoever in a matter of minutes.

All in all, DAME is an extremely neat solution.

It can be used for much more advanced tasks like requirements prioritization.

02 – Theory behind the AHP method (Analytic Hierarchy Process)

Analytic Hierarchy Process can be useful as a decision-support method in project management in instances where a few options (be they requirements, risks or other “alternatives”) need to be prioritized or selected. Let’s take a look at some of the key theoretical concepts behind it.

There are two fundamental principles used in general decision-making theory: that of deduction and that of induction (sometimes referred to as a system approach). Deduction arrives at particulars from the general by applying logic. In other words, it goes top-down, narrowing down more general truths into detailed conclusions.

The system approach, on the other hand, is based on the premise that particulars are not as important. You go bottom-up, as if trying to grasp a general truth.

The essence of the Analytic Hierarchy Process lies in combining both approaches. It is a method that first decomposes a complex problem into individual components, and these components (sometimes referred to as variables) are arranged into a hierarchy. They are then given numerical values: each variable gets a value according to its importance relative to the other variables (how this is done depends on whether the variable is quantitative or qualitative). What follows is a synthesis of the values, which determines the weight each variable has in influencing the overall evaluation of the problem. Eventually, every evaluated outcome (alternative/variant) receives a total numerical value, and these values form a ranking.

Many decision-making situations entail both a physical and a psychological aspect. The physical aspect can be regarded as objective, as it belongs to the “tangible” realm: something that can be taken hold of, or at least measured. Price as a criterion, for example, can be quantified in money units; size or distance is quantifiable by units of measurement, and so on. The psychological aspect of a decision-making problem, on the other hand, is trickier. It is “intangible” in essence, and there is no scale or range that can sufficiently, universally and unambiguously express it. Such judgments are often a product of subjective ideas, gut feeling or assumptions of an individual, a group or the whole society; take the design qualities of a product as an example. The AHP deals with both of these aspects and is able to incorporate them as equal inputs into a unified decision-support system.

Breaking down a seemingly complex problem into a clear hierarchy, and only then focusing on the different aspects of the decision, substantially expands the possibilities of those who make the decision.

The Analytic Hierarchy Process was developed in Pittsburgh, USA, in the 1970s by Dr. Thomas L. Saaty, an internationally recognized scholar and innovator in decision-making theory. It has since become one of the most successful and widely used decision-support methods and has been implemented in comprehensive software tools used in collaborative corporate decision-making.

Stages of AHP Using Expert Choice Software

As mentioned above, the largest contribution of AHP is its support of the decision-making process by taking both subjective and objective factors into account when evaluating different alternatives. Unlike many other methods, it accepts both quantitative data (clearly represented by numbers) and qualitative data (often regarded as subjective) as inputs. These are then assessed according to the importance the decision maker has assigned to them, and also according to the layer of the hierarchy in which they were placed. In a few steps, apparently advanced and complex decision-making problems can be solved in a relatively simple way.

The whole process can be divided into 5 stages:

1. Breaking down the problem into a hierarchy (analysis)
2. Evaluating criteria and decision alternatives on different levels of the hierarchy (setting priorities)
3. Measuring consistency of evaluation (finding consistency ratio)
4. Synthesis – generating overall weight for each evaluated decision alternative and their ranking
5. Sensitivity analysis

Breaking Down the Problem Into a Hierarchy (Analysis)

Decomposing the problem into a hierarchy is the first basic step of the Analytic Hierarchy Process. A hierarchy is a system of several levels, each containing a finite number of elements. There is a mutual relationship between any two vertically neighbouring levels: the higher the level, the more general the role it plays. Elements placed higher in the hierarchy control and manage the elements immediately underneath them. The element at the very top of the hierarchy is always the Goal of the decision-making process. The Goal has a weight that equals 1. This weight is then divided among the elements of the second level of the hierarchy; the evaluations of elements on the second level are in turn “dissolved” into the third level, and so on.

The hierarchy chosen depends on the character of the decision-making problem. There are a few common types of hierarchy:

  • Goal – Criteria – Alternatives
  • Goal – Criteria – Subcriteria – Alternatives
  • Goal – Experts – Criteria – Alternatives
  • Goal – Criteria – Intensity Levels – Multiple Alternatives

In most decision-making problems we can make do with the first type of hierarchy, that is Goal – Criteria – Alternatives, but it can always be extended.
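
As a rough illustration of how the Goal’s weight of 1 is divided down such a hierarchy, here is a small sketch with hypothetical criteria and weights (it is not part of any AHP software); it merely shows that a node’s global weight is its local weight multiplied by the global weight of its parent.

import java.util.ArrayList;
import java.util.List;

public class HierarchyDemo {
    String name;
    double localWeight;   // weight relative to siblings; sums to 1 within each level
    List<HierarchyDemo> children = new ArrayList<>();

    HierarchyDemo(String name, double localWeight) {
        this.name = name;
        this.localWeight = localWeight;
    }

    void print(double parentGlobalWeight) {
        double globalWeight = parentGlobalWeight * localWeight;
        System.out.printf("%-12s global weight = %.2f%n", name, globalWeight);
        for (HierarchyDemo child : children)
            child.print(globalWeight);
    }

    public static void main(String[] args) {
        HierarchyDemo goal = new HierarchyDemo("Goal", 1.0);
        HierarchyDemo price = new HierarchyDemo("Price", 0.6);       // hypothetical weights
        HierarchyDemo quality = new HierarchyDemo("Quality", 0.4);
        quality.children.add(new HierarchyDemo("Design", 0.7));
        quality.children.add(new HierarchyDemo("Durability", 0.3));
        goal.children.add(price);
        goal.children.add(quality);
        // Goal 1.00, Price 0.60, Quality 0.40, Design 0.28, Durability 0.12
        goal.print(1.0);
    }
}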

Evaluating Criteria and Decision Alternatives on Hierarchy Levels (Setting Priorities)

To set priorities for individual elements of the hierarchy, one must first know whether the data is quantitative or qualitative in nature. If quantitative, it may be governed either by a maximizing or by a minimizing rule. The maximizing rule regards the highest value as the best, while the minimizing rule regards the lowest value as the best.

Evaluating by Quantitative Criteria

Price is a quantitative criterion that typically has a minimizing rule when considered from the consumer’s point of view: the less it costs, the better. Let’s evaluate 4 products, A, B, C and D, by price, with the goal of assigning each one a numeric weight. The prices are:

Product        Price
A              190
B              230
C              320
D              290

Because the criterion is minimizing (lower values are considered better), the first step in calculating the weights is to convert the values into a coefficient kj using the following formula:

\begin{equation}k_{j}=\frac{100}{\text{Price}_{j}}\end{equation}

This coefficient effectively converts a minimizing criterion into a maximizing one (the higher the coefficient, the better, whereas with price it is the other way round):

Product kj
A 0.526
B 0.436
C 0.313
D 0.349

The resulting quantitative pj weights are calculated by a normalization formula:

\begin{equation}p_{j}=\frac{k_{j}}{\sum_{j} k_{j}}\end{equation}

Product kj pj
A 0.526 0.324
B 0.436 0.268
C 0.313 0.193
D 0.349 0.215
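
The same two-step calculation (coefficient conversion followed by normalization) can be sketched in a few lines of code. The prices are those from the example above; the printed values may differ from the table in the last digit, since the article appears to round the coefficients before normalizing.

public class PriceWeights {
    public static void main(String[] args) {
        String[] products = {"A", "B", "C", "D"};
        double[] prices = {190, 230, 320, 290};

        // Step 1: convert the minimizing criterion into coefficients k_j = 100 / price_j.
        double[] k = new double[prices.length];
        double sum = 0;
        for (int j = 0; j < prices.length; j++) {
            k[j] = 100.0 / prices[j];   // lower price -> higher coefficient
            sum += k[j];
        }
        // Step 2: normalize the coefficients into weights p_j = k_j / sum(k_j).
        for (int j = 0; j < prices.length; j++) {
            double p = k[j] / sum;
            System.out.printf("%s  k=%.3f  p=%.3f%n", products[j], k[j], p);
        }
    }
}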



Evaluating by Qualitative Criteria – Pairwise Comparisons

Pairwise comparison is one of the most basic concepts of the Analytic Hierarchy Process. It is used for the evaluation of criteria that are not clearly quantifiable but play a crucial part in the overall decision-making process. Because it is very difficult to assign weights to qualitative assessments by guessing and intuition, the AHP derives this information by comparing all the alternatives among themselves on every level of the hierarchy. In other words, it slices the overall information into pairs, and these pairwise judgments then serve as the basis for calculating the numerical weight of each alternative.
Each pair of elements being compared is assessed on a 9-degree numerical scale that was developed specifically for this purpose:

Numeric Scale Description Explanation
1 Equal Both elements having same importance
3 Moderate Moderate importance of one over another
5 Strong Strong/essential importance of one over another
7 Very strong Very strong or demonstrated importance
9 Extreme Extreme importance of one over another

Besides the grades mentioned, intermediate values of 2, 4, 6 and 8 can also be used.

The relationship between elements in pairwise comparisons is called the ‘importance’ of one over another, but it can just as well be referred to as ‘preference’ or ‘likelihood’ of occurrence; it always depends on the type of problem being solved.

After pairwise comparison of k elements, a pairwise comparison matrix (also known as Saaty’s matrix) is constructed. It is a reciprocal matrix consisting of k² elements, with 1’s on the diagonal and reciprocal values mirrored across it. A typical pairwise comparison of 3 evaluated “Options” by a qualitative criterion can look as follows:

Option A Option B Option C
Option A 1 2 8
Option B ½ 1 6
Option C ⅛ ⅙ 1

For better clarity, usually only the values above the diagonal are shown. The number of those values is given by the following formula:

\begin{equation}\frac{n(n-1)}{2}\end{equation}

While AHP appears rather straightforward at first sight, the mathematics needed to calculate numerical weights from pairwise comparisons is not: eigenvalues and eigenvectors are involved, and computer software such as Expert Choice does the hard work.

There is however an approximation method – an algorithm that can calculate rough weights in three steps without using a computer.

Algorithm for Calculating Approximate Weights

Step 1: Add up the values in each column of the pairwise comparison matrix

Option A Option B Option C
Option A 1 2 8
Option B ½ 1 6
Option C ⅛ ⅙ 1
Total 13/8 19/6 15


Step 2: Each item is divided by the total of its column, giving a normalized matrix

Option A Option B Option C
Option A 8/13 12/19 8/15
Option B 4/13 6/19 6/15
Option C 1/13 1/19 1/15
Total 1 1 1



Step 3: The total of each row is divided by the number of items in the row

Option A (8/13 + 12/19 + 8/15) / 3
Option B (4/13 + 6/19 + 6/15) / 3
Option C (1/13 + 1/19 + 1/15) / 3

The calculated means will then serve as the approximate weights of each alternative (called Options here). From the numbers below we can see that Option A “won”.

Approximate weight

Option A        0.593

Option B        0.341

Option C        0.066

Total            1
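
The three-step approximation can also be sketched in code, using the Option A/B/C matrix from the text; it prints roughly 0.593, 0.341 and 0.065 (the last value appears as 0.066 in the text after rounding).

public class ApproxAhpWeights {
    public static void main(String[] args) {
        double[][] m = {
            {1.0,       2.0,       8.0},
            {0.5,       1.0,       6.0},
            {1.0 / 8,   1.0 / 6,   1.0}
        };
        int n = m.length;

        // Step 1: sum each column of the pairwise comparison matrix.
        double[] colTotal = new double[n];
        for (int j = 0; j < n; j++)
            for (int i = 0; i < n; i++)
                colTotal[j] += m[i][j];

        // Steps 2 and 3: normalize each entry by its column total, then average each row.
        for (int i = 0; i < n; i++) {
            double weight = 0;
            for (int j = 0; j < n; j++)
                weight += m[i][j] / colTotal[j];
            weight /= n;
            System.out.printf("Option %c  %.3f%n", (char) ('A' + i), weight);
        }
    }
}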

Measuring Consistency of the Evaluation

Consistency of the measurements is a way of expressing a certain ‘compactness’ of the preferences created during pairwise comparisons. It shows to what extent the data fed into the computer are logically cohesive. If, for example, Alternative 1 is twice as important as Alternative 2 and Alternative 2 is three times as important as Alternative 3, then Alternative 1 must be 6 times (2 × 3) as important as Alternative 3. This would be a case of perfect consistency; in other words, the consistency coefficient would equal 0.

Inconsistency of judgement is inherent in human thinking. Humans don’t rely on logic alone when drawing conclusions; intuition, emotions and experience all influence their attitudes and sway their decisions. If someone prefers apples to oranges and at the same time likes oranges better than bananas, shouldn’t it automatically follow that apples will be preferred over bananas? And yet the same person may still go for bananas rather than apples, because there are other things to consider (say, the time of day or the season), all of which can lead them to choose ‘inconsistently’.

In practical applications perfect consistency is rare, because new information is constantly being added during evaluation and changes the previous relationships. Pairwise comparisons therefore permit a certain amount of inconsistency in the preferences. The AHP works with the so-called Consistency Ratio, with the rule of thumb that if the inconsistency is more than 10 percent, the evaluation should be revisited.

High inconsistency (a bad consistency ratio) implies one of the following problems:

  • Flawed logic in pairwise comparisons
  • Badly structured hierarchy
  • Errors/typos during data input

Expert Choice calculates consistency ratios automatically with each pairwise comparison. If the inconsistency is too high, it even has a feature that points out the elements where the inconsistency is highest. The comparison can then be revised and repeated until the inconsistency returns to an acceptable level.

For illustration, let’s look at how an approximate consistency ratio can be calculated without using a computer. The previous example is used:

Option A Option B Option C
Option A 1 2 8
Option B ½ 1 6
Option C ⅛ ⅙ 1

If Option A is preferred twice over Option B and Option B is preferred 6 times over Option C, then Option A should be preferred 12 times over Option C. That would be an ideal situation, or a perfectly consistent evaluation. However, because Option A is only 8 times more preferred than Option C, the consistency ratio needs to be found.
Working out the approximate consistency ratio is dealt with in the next section.

Algorithm of Calculating Approximate Consistency Ratio

Step 1: Each column element of the original pairwise comparison matrix is multiplied by the resulting weight of its alternative, and the rows are then summed:

Option A Option B Option C
(0.593) (0.341) (0.066)
Option A 1 2 8
Option B 0.5 1 6
Option C 0.125 0.167 1

will become

Option A Option B Option C
Option A 0.593 0.682 0.528
Option B 0.297 0.341 0.396
Option C 0.074 0.057 0.066

Resulting totals then are:

Option A        1.803

Option B        1.034

Option C         0.197

Step 2: Each resulting total is divided by its weight:

Option A        1.803 / 0.593 = 3.04

Option B        1.034 / 0.341 = 3.032

Option C         0.197 / 0.066 = 2.985

Step 3: Mean is calculated:
\begin{equation}\lambda_{max}=\frac{3.04+3.032+2.985}{3}=3.019\end{equation}

Step 4: The Consistency Index is then calculated as

\begin{equation}CI=\frac{\lambda_{max}-n}{n-1}\end{equation}

where n is the number of elements being compared:

\begin{equation}CI=\frac{3.019-3}{3-1}=0.0095\end{equation}

Step 5: The Consistency Ratio is calculated using the so-called Random Index (RI), which is the average consistency index of a randomly generated n × n matrix:

\begin{equation}CR=\frac{CI}{RI}\end{equation}

Random Index (RI) doesn’t need to be calculated as it is already provided in the following chart:

n        RI
2        0
3        0.58
4        0.9
5        1.12
6        1.24
7        1.32
8        1.41

For n = 3, RI is 0.58 and the Consistency Ratio is:

\begin{equation}CR=\frac{0.0095}{0.58}=0.016\end{equation}

The approximate Consistency Ratio of the three evaluated Options equals 0.016 and meets the previously stated expectation of CR < 0.1.

The evaluations are therefore considered to be sufficiently consistent.
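
Steps 1 to 5 can be sketched in code as follows, using the matrix and the approximate weights from above; it reproduces CI ≈ 0.0095 and CR ≈ 0.016.

public class ConsistencyRatio {
    public static void main(String[] args) {
        double[][] m = {
            {1.0,   2.0,   8.0},
            {0.5,   1.0,   6.0},
            {0.125, 0.167, 1.0}
        };
        double[] w = {0.593, 0.341, 0.066};   // approximate weights from above
        int n = m.length;

        // Steps 1-3: weighted row sums divided by the corresponding weight, then averaged.
        double lambdaMax = 0;
        for (int i = 0; i < n; i++) {
            double rowSum = 0;
            for (int j = 0; j < n; j++)
                rowSum += m[i][j] * w[j];
            lambdaMax += rowSum / w[i];
        }
        lambdaMax /= n;                          // approximately 3.019

        double ci = (lambdaMax - n) / (n - 1);   // Step 4: Consistency Index
        double ri = 0.58;                        // Random Index for n = 3
        double cr = ci / ri;                     // Step 5: Consistency Ratio
        System.out.printf("lambda_max=%.3f  CI=%.4f  CR=%.3f%n", lambdaMax, ci, cr);
        // CR is well below 0.1, so the judgements are sufficiently consistent.
    }
}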

Practical Application of Expert Choice

The Analytic Hierarchy Process and Expert Choice have wide application in real-life business situations. They are widely used in marketing for product comparisons, but they can just as well support decision-making in planning, investing, conflict resolution, forecasting or risk management, to name a few areas.
IBM used Expert Choice when applying for the Malcolm Baldrige National Quality Award. General Motors used it in its design projects to evaluate prototypes of new products. Xerox used it for portfolio management, for evaluating new technologies and as a support tool in marketing decisions. It has been used by governments to rate buildings by historic significance, and to assess the condition of highways so that engineers could determine the optimum scope of a project and justify the budget to lawmakers.

In project management it can be used in the scope management knowledge area, for example to estimate the cost of work packages at the control account level and then aggregate them into the overall project cost estimate.
It also has wide usage in Human Resources, in the Acquire Project Team process, to evaluate employees or potential team members from a large number of applicants against a set of defined criteria. They are quickly rated and scored to select the ones that best suit the criteria.

Portfolio management is another area of application, where it helps decision-makers rate the business value of their potential projects. AOL’s project portfolio management is one example.
AHP can also be used in risk management, identification and prioritization where both subjective inputs (qualitative risk analysis) and quantitative data (quantitative risk analysis) need to be assessed.

03 – Selecting Seller by Source Selection Criteria using DAME

In procurement management it is often necessary to evaluate sellers based on proposals they have sent. Evaluation is crucial for the award decision.

Proposal evaluation is an assessment of the proposal and the offeror’s ability to perform the prospective contract successfully. Here is a simplified example of how this process can be done with a little-known tool called DAME.

Let’s say you’re evaluating 4 Sellers according to 4 evaluation criteria, which are:

Bid ($) – money asked to carry out the job -> “MIN” (more about “minimizing” and “maximizing” here)

Past Performance – Excellent (2), Satisfactory (1.5), N/A (1), Bad (0.5) -> “MAX”

Know-how possessed by sellers – Pairwise comparison (we’re comparing sellers against one another) -> PAIRWISE COMPARISON

Own resources available – Pairwise comparison (we’re comparing sellers against one another) -> PAIRWISE COMPARISON

There are 2 key authorities submitting their judgements, which means 2 “scenarios”: one of them is the CEO and the other is the PM (Project Manager).

Download the whole example from the link below.
Download
Selecting Seller by Source Selection Criteria using DAME

04 – Requirement Prioritisation using DAME

Sometimes little means more. This little-known and free Excel add-in can do exactly what many powerful proprietary software packages do. DAME stands for Decision Analysis Module for Excel.

If you haven’t worked with DAME before, here is a short instruction manual.
Why not make your life easier? Using this little-known free Excel add-in you can save thousands.

Let’s say you lead a development team and you prioritize the development features. Based on the Pareto principle you know that 80% of the effort will be spent on only 20% of the features. In other words, you need to spend the bulk of your money on one fifth of the most important features. You narrow the feature list down to, say, the 10 most important ones: those that are an absolute Must Have for the customer. However, you need a better picture when it comes to prioritizing those 10 key requirements. The evaluation is carried out by 5 experts, including a customer representative. Their estimates must be synthesized into a final outcome to give a better idea of how the budget should be shaped for the next ‘iteration’, sprint, run, or whatever agile terminology is used at your organisation.

One way of doing this could be by using DAME.

Let’s say, with a certain amount of simplification, that you already have a complete and up-to-date requirements breakdown list with, say, 12 categories of user requirements. The requirements are coded by a number, representing the category, and a letter, standing for the actual requirement, e.g. 1a, 4c, etc.

You know that the 10 most important requirements for the moment are:
1=> 1a – users able to record, view, edit info of all clients who have ever entered service
2=> 1b – records will be modeled on the government reporting tool currently in use
3=> 3a – users can record clinic activity (prerequisite in applying for gov grants)
4=> 3b – clinic info will be categorized
5=> 5a – information is confidential (stored on a secure server, accessible only to the right eyes)
6=> 6a – communications book ready with no editing allowed (24/7 shifts need to pass critical information)
7=> 8a – client exclusion list – a list of clients currently banned from service available from anywhere to everybody
8=> 10b – assigned staff will be able to edit and delete client information
9=> 11d – Client will have a special ID that matches the currently used government reporting tool
10=> 11f – One database will be shared by two physical facilities (on different physical addresses). Each organisation will only see data related to it.

Now let’s assume you think that those requirements should be ranked as follows (1 = most important; 10 = least important):
Rank; Requirement Code

1. 11d
2. 5a
3. 8a
4. 3a
5. 6a
6. 11f
7. 10b
8. 1a
9. 3b
10. 1b

That’s your evaluation. However there are 4 more stakeholders involved and they might see the whole thing differently.

This problem (when being resolved in DAME) calls for the following input:
10 variants (10 requirements being evaluated)
5 scenarios (5 decision-makers or evaluators)
1 criterion (the likelihood that the customer will be happy if a particular requirement is met by the end of the next development cycle; it is a kind of expert judgement). The important thing here is that it is a minimizing criterion: you rank the requirements by your judgement from 1 to 10, where 1 is most important and 10 is least important, which means the less the better.

You decide that the weights (decision-making power) of the stakeholders, based on their positions, are as follows:

Customer Rep = 50%
Project Mngr = 20%
Developer = 20%
Tester = 5%
Coordinator = 5%

The results from the 5 decision-making parties are then synthesized into one final outcome for each requirement. This way you receive the final ranks for your ten most important requirements:

1. 11f = 0.206557377
2. 11d = 0.133911394
3. 10b = 0.133518493
4. 3b = 0.105744479
5. 5a = 0.093889717
6. 1a = 0.059748002
7. 8a = 0.056868988
8. 1b = 0.05641512
9. 6a = 0.053082238
10. 3a = 0.050264192

If your budget is $100,000 you split it against the weights to see how much money should go towards meeting each requirement:

1. 11f = 0.206557377 -> $20,656
2. 11d = 0.133911394 -> $13,391
3. 10b = 0.133518493 -> $13,352
4. 3b = 0.105744479 -> $10,574
5. 5a = 0.093889717 -> $9,389
6. 1a = 0.059748002 -> $5,975
7. 8a = 0.056868988 -> $5,687
8. 1b = 0.05641512 -> $5,642
9. 6a = 0.053082238 -> $5,308
10. 3a = 0.050264192 -> $5,026
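
The budget split itself is just a proportional allocation of the $100,000 against the synthesized weights; a small sketch (with the weights copied from the list above) is shown below.

public class BudgetSplit {
    public static void main(String[] args) {
        String[] req = {"11f", "11d", "10b", "3b", "5a", "1a", "8a", "1b", "6a", "3a"};
        double[] weight = {0.206557377, 0.133911394, 0.133518493, 0.105744479,
                           0.093889717, 0.059748002, 0.056868988, 0.05641512,
                           0.053082238, 0.050264192};
        double budget = 100_000;

        // Each requirement receives a share of the budget proportional to its weight.
        for (int i = 0; i < req.length; i++) {
            double share = weight[i] * budget;
            System.out.printf("%-4s $%,.0f%n", req[i], share);
        }
    }
}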

The above example is, of course, simplified for demonstration purposes only. It is meant to show that using a tool such as DAME is elegant, quick and neat, and most importantly, it doesn’t cost anything.

Optimizing Ontario’s investments in a “basket” of core mental health services for children and youth – background

Some background

The Ministry of Children and Youth Services (MCYS) in Ontario funds service providers to deliver community-based child and youth mental health (CYMH) services under the authority of the Child and Family Services Act, R.S.O. 1990, c.C.11 (CFSA). The paramount purpose of the CFSA is to promote the best interests, protection and well-being of children.

Some terms

Client

The MCYS defines a client  as “the intended recipient of the child and youth mental health core service.” The client is a child or youth under 18 years of age who is experiencing mental health problems. In addition to mental health needs, clients may also be experiencing additional challenges related to their development or have specific impairments and/or diagnoses, including a developmental disability, autism spectrum disorder or substance use disorder. Other conditions or diagnoses do not preclude clients from receiving mental health services, but may add to the complexity of their needs, and the services they require. Similarly, where children and youth are involved in other sectors (e.g. youth justice and child welfare) these circumstances do not preclude them from receiving core services. Where children and youth have additional needs and are receiving a range of services, the focus must be on how the services connect. A coordinated approach to service delivery must be supported. Families (including parents, caregivers, guardians, siblings and other family members) may also receive services from a core service provider, in order to address the identified needs of the child or youth client. This may occur when the participation in treatment is recommended to support the child or youth’s service plan.

Continuum of needs-based mental health services

Children, youth and their families can benefit from access to a flexible continuum of timely and appropriate mental health services and supports, within their own cultural, environmental and community context. Mental health promotion, prevention, and the provision of services to address mental health problems represent different points along the continuum. Children, youth and their families may enter the continuum of needs-based services and supports at any point. The actual services a child/youth needs will vary. For example, some children/youth may benefit from targeted prevention services, while others will require more specialized mental health services. In addition, a child or youth’s mental health service needs may change over the course of their treatment.

The following schematic outlines the full continuum of needs-based mental health services and supports, and shows how core services fit within this continuum. It also represents the relative demand for services – level one reflects all children and youth, while level four focuses on a smaller subset of the child/youth population with the most severe, complex needs. This schematic is for service planning only – it is not used for diagnosis or for determining the appropriateness of specific mental health interventions.

MCYS - Continuum of core mental health services
Continuum of CYMH Needs-Based Services and Supports. *Includes members of a group that share a significant risk factor for a mental health problem(s).

Service areas

After a thorough review – including an assessment of Statistics Canada’s census divisions – the MCYS has identified 34 service areas in Ontario for the purpose of:

  • ensuring that all clients across the province will be able to access the same core services
  • facilitating planning, and
  • creating pathways to care.

The defined service areas are not barriers to service. Clients will be able to access service from any service area.

Core services

The MCYS has defined a set of core children and youth mental health services (“core services”) to be available within every service area and has established minimum expectations for how core services are planned, delivered and evaluated. Core services may not be available in every service area immediately – the expectation is that they will be made available over time as lead agencies assume their roles and responsibilities.

Core services represent the range of MCYS-funded CYMH services that lead agencies are responsible for planning and delivering across the continuum of mental health needs within each service area. It is recognized that children and youth in receipt of core mental health services may also require other services and supports. For example, children and youth may receive more than one core service as part of a service plan, as well as other services funded by MCYS or other partners.

Seven core services are to be available across all service areas:

  • Targeted Prevention
  • Brief Services
  • Counselling and Therapy
  • Family Capacity Building and Support
  • Specialized Consultation and Assessments
  • Crisis Support Services, and
  • Intensive Treatment Services.

The MCYS funds providers of core services through the following detail codes:

  • A348 – Brief Services
  • A349 – Counselling and Therapy
  • A350 – Crisis Support Services
  • A351 – Family/Caregiver Capacity Building and Support
  • A352 – Coordinated Access and Intake
  • A353 – Intensive Treatment Services
  • A354 – Case Management and Service Coordination
  • A355 – Specialized Consultation and Assessment
  • A356 – Targeted Prevention Term

The MCYS has identified a target population for each core service. This is the population for whom the service is designed, and for whom the service is intended to provide better mental health outcomes. The act of defining a target population is not meant to be exclusionary. Rather, it is a means to support planning and delivery in a way that benefits the children and youth who are in greatest need of the mental health service. In general, the target population for core services includes those children and youth under 18 years of age and their families who are experiencing mental health problems along levels two, three and four of the CYMH continuum. Additional target populations may also be identified within specific core services.

Lead agency

In every service area, the MCYS has identified a lead agency that will be responsible for the planning and delivery of high-quality core services across the continuum of mental health services in the service area.

A lead agency may either directly deliver core services or work with other providers of core services to deliver the full range of core services within the service area. Lead agencies are responsible for engaging cross-sectoral partners in the health and education sectors, including the relevant Local Health Integration Network (LHIN) and school boards. Lead agencies will connect with other providers to plan and enhance mental health service pathways for children and youth and improve transparency, so that everyone will know what to expect.

Providers of core services are required to comply with the Program Guidelines and Requirements #01 (PGR #01): Core Services and Key Processes.

The core services, key processes and functioning of the CYMH service sector will require refinement from time to time as other provincial initiatives and activities are developed and implemented. Within the broader context of these new initiatives, it is important that the roles and responsibilities of all core service providers are made clear and that the linkages between these services are transparent.

Planning to transform child and youth mental health services in Ontario

A key driver of Moving on Mental Health is the need to build a system in which children, youth and their families:

  • Have access to a clearly defined set of core child and youth mental health services
  • Know what services are available in their communities and how they are connected to one another, and
  • Have confidence in the quality of care and treatment.

In a mature system, one of the ways in which this vision will be realized is through identification of lead agencies with planning and funding accountability for core child and youth mental health services within defined service areas.

Within each defined service area, the lead agency will be responsible for:

  • Delivering and/or contracting for the range of defined core CYMH services
  • Making them accessible to parents, youth, and children, and
  • Establishing inter-agency and inter-sectoral partnerships, protocols and transparent pathways to care.

These responsibilities fall into two broad categories:

  • Core Service responsibilities – which relate to the defined core services delivered by the community-based child and youth mental health sector, as well as the key processes that enable high-quality service, and
  • Local System responsibilities – which relate to the collaboration of the community-based sector with other parts of the service continuum such as those supports and services delivered by health care providers, schools and others.

The Core Services Delivery Plan and the Community Mental Health Plan for children and youth will set out how the lead agency carries out these responsibilities. The lead agency will be responsible for developing these plans, and is expected to work collaboratively with other mental health service providers and with all sectors that support children and youth and respond to their mental health needs.

The Core Services Delivery Plan will, together with the Community Mental Health Plan, provide critical insight into each service area and guide activities as we move forward with transforming the experience of children, youth and families. The intent is that over time, both of these plans will have a three-year horizon and will be updated annually, since they inform one another. They will also provide content for the Accountability Agreement entered into between the lead agency and MCYS.

Core Services Delivery Plan

The development of a Core Services Delivery Plan (CSDP) is a key planning and communication tool that will document expectations, obligations and commitments for the provision of core services and associated key processes in each defined service area. This reflects the need to establish a consistent approach that will support critical insights into local and provincial child and youth mental health service issues, while recognizing the unique circumstances of lead agencies and service areas. The Core Services Delivery Plan documents how core services will be delivered in the defined service area. It consists of three areas of content:

  • Service Commitments
  • Continuous Improvement Priorities, and
  • Budget.

In developing the plan, the lead agency and child and youth mental health service providers should ask themselves some key questions:

  • Can we demonstrate that the full range of core services is available in our service area, and that minimum expectations set out in the Service Framework are being met?
  • Can we show how our services are getting better at meeting the mental health needs of children and youth in our communities?
  • Are we making the best possible use of limited resources to deliver high-quality services?
A. Service Commitments

This section of the plan will:

  • Identify, with specific activities and time frames:
    • How the lead agency and other child and youth mental health service providers in the service area will address the expectations set out in the Service Framework, including who will deliver what services over the projected three-year time horizon
    • Where changes to services or service providers are proposed, the plan will document how the changes will result in improvement to child and youth mental health outcomes, service quality and efficiencies
  • Indicate how, if a change in service providers or in contracted relationships is proposed, it will be handled in a transparent manner with due regard to minimizing disruption to service
  • Set out how services will be designed and delivered in a culturally responsive manner to address diverse populations including francophone and Aboriginal populations
  • Document how a clear, stable point of contact for children and youth with mental health needs and their families, as well as those seeking services on their behalf, will be established and/or maintained
  • Report on the reach and efficacy of programs and services, including how input from parents and youth has been incorporated to ensure that what has been developed works for them, and
  • Describe the process by which the lead agency has engaged and will continue to engage respectfully with all core child and youth mental health service providers in the service area.
B. Continuous Improvement Priorities

This section of the plan will:

  • Monitor and report on the impact of current programs and services
  • Identify improvement priorities, taking into account priorities established by MCYS and the expectations set out in the Service Framework, in areas such as service quality and outcomes, a purposeful approach to wait list and wait time management, and others over the three-year horizon of the plan
  • Set out specific activities and time frames that will support continuous improvement goals and priorities, and
  • Address matters such as data sharing protocols between the lead agency and other child and youth mental health agencies in the service area, that will support monitoring and reporting on performance indicators in order to enable tracking of trends, challenges and opportunities for continuous improvement.
C. Budget

This section of the plan will:

  • Forecast activities, resource allocations and budget over the three-year horizon, including financial implications of planned changes to service delivery.

Community Mental Health Plan for children and youth

System responsibilities are built on key partnerships and collaborations developed at the local level to support young people and their families across the full continuum of needs. Although service areas may differ in terms of their service profile, service patterns, as well as the degree of pre-existing cooperation and collaboration across systems and sectors, the lead agency will be responsible for bringing partners together to create coherence for children, youth and their families. MCYS is working, together with the Ministry of Health and Long-Term Care and the Ministry of Education, to put in place conditions that will support this important work.

The Community Mental Health Plan for children and youth will be a public document that is developed by the lead agency and describes the processes by which:

  • The lead agency has engaged and will continue to engage respectfully with sector partners such as organizations funded by Local Health Integration Networks, District School Boards, public health units, hospitals, primary health care providers, those delivering MCYS-funded services (e.g., child welfare, autism services) and others, and
  • Input from parents and youth has been incorporated to ensure that what has been developed works for them.

The plan will cover the following topic areas:

  • Understanding current needs and services
  • Collaborative planning, and
  • Pathways to, through and out of care.

In developing the plan, the lead agency, child and youth mental health service providers and partners from all sectors involved with child and youth mental health should ask themselves some key questions:

  • Are all those who serve children and youth working together systematically to address mental health needs in the service area?
  • Are the roles and responsibilities of everyone across the continuum of needs and services clear to parents, youth and those seeking services on their behalf, including how services are accessed?
  • Are there shared commitments to address service gaps and expand on opportunities to better meet identified needs?
A. Understanding current needs and services
  • Report on a needs assessment of the current state of child and youth mental health services across the service area, identifying gaps and opportunities for meeting needs across the continuum, and
  • Identify and maintain an inventory of who is providing which services to meet the needs identified.
B. Collaborative planning
  • Establish mechanisms to explore, on an ongoing basis, opportunities to leverage resources, reduce duplication, enhance outcomes, and create added value for children and youth with mental health needs through collaboration and joint planning, and
  • Identify and document commitments and actions to be taken to address shared and agreed upon priorities, together with associated timelines and measures to assess results.
C. Pathways to, through and out of care
  • Develop and document the protocols, processes and partnerships, existing or to be developed, that will streamline and strengthen clear pathways to, through and out of care across sectors.

Next: Applying Multi-Criteria Decision Analysis to the “basket” of core mental health services for children and youth in Ontario

An invitation to Portfolio Decision Analysis

Source: Salo, A., Keisler, J. and Morton, A. – An Invitation to Portfolio Decision Analysis – Ch. 1 in Portfolio Decision Analysis: Improved Methods for Resource Allocation (2011).

Organizations and individuals have goals that they seek to attain by allocating resources to actions. These scenarios involve one or several decision makers who are faced with alternative courses of action which, if implemented, consume resources and enable consequences. The availability of resources is typically limited by constraints, while the desirability of consequences depends on preferences concerning the attainment of multiple objectives. Furthermore, the decision may affect several stakeholders who are impacted by it even if they are not responsible for it. There can be uncertainties as well: at the time of decision making, for instance, it may be impossible to determine what consequences the actions will lead to or how many resources they will consume.

These, in short, are the key concepts that characterize decision contexts where the aim is to select a subset of several actions so as to contribute to the realization of consequences that are aligned with the decision maker’s preferences.

Portfolio Decision Analysis (PDA)
A body of theory, methods, and practice which seeks to help decision makers make informed multiple selections from a discrete set of alternatives through mathematical modeling that accounts for relevant constraints, preferences, and uncertainties.
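
One common way to make this definition concrete (a simplified sketch, not notation taken from the chapter) is to write portfolio selection as a 0-1 optimization problem, here assuming an additive value measure and a single budget constraint:

\[
\max_{z_1,\dots,z_n \in \{0,1\}} \; \sum_{j=1}^{n} v(x_j)\, z_j
\qquad \text{subject to} \qquad \sum_{j=1}^{n} c_j\, z_j \le B,
\]

where \(z_j = 1\) if alternative \(x_j\) is included in the portfolio, \(v(x_j)\) is its (possibly multi-attribute) value, \(c_j\) the resources it consumes, and \(B\) the available budget. Preferences enter through \(v\), constraints through the budget and any further restrictions, and uncertainties can be accommodated, for example, by using expected values or scenario-based formulations.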

PDA differs from the standard decision analysis paradigm in its focus on portfolio choice as opposed to the choice of a single alternative from a set. There are analytical arguments as to why the pooling of several single choice problems into a more encompassing portfolio choice problem can be beneficial.

  1. The solution to the portfolio problem will be at least as good: the single choice problems, when considered together, constitute a portfolio problem with the added constraint that exactly one alternative must be chosen from each of them. Removing these (possibly redundant) single-choice constraints may therefore lead to a better solution (a small sketch after this list illustrates the point).
  2. If the single choice problems are interconnected – for instance, due to the consumption of shared resources or interactions among alternatives in different subsets – the portfolio frame may provide a more realistic problem representation and consequently better decision recommendations.
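
To make the first argument concrete, the following minimal sketch (in Python, with invented numbers; not an example taken from the chapter) compares the two frames under a shared budget: choosing exactly one alternative from each of two hypothetical subproblems versus choosing any affordable subset of all alternatives.

```python
"""Minimal sketch of the pooling argument above (all numbers invented).

Two hypothetical "single choice" subproblems share one budget. Forcing exactly
one alternative per subproblem can give a weaker result than treating all
alternatives as a single portfolio problem (any subset within the budget).
"""
from itertools import chain, combinations, product

# (name, value, cost) -- hypothetical alternatives for two subproblems
group_a = [("a1", 6, 4), ("a2", 5, 4)]
group_b = [("b1", 2, 4)]
budget = 8

def total(selection):
    """Return (total value, total cost) of a selection of alternatives."""
    value = sum(v for _, v, _ in selection)
    cost = sum(c for _, _, c in selection)
    return value, cost

# Single-choice frame: pick exactly one alternative from each subproblem.
single_choice_best = max(
    (combo for combo in product(group_a, group_b) if total(combo)[1] <= budget),
    key=lambda combo: total(combo)[0],
)

# Portfolio frame: any subset of all alternatives that fits within the budget.
all_alts = group_a + group_b
subsets = chain.from_iterable(
    combinations(all_alts, r) for r in range(len(all_alts) + 1)
)
portfolio_best = max(
    (s for s in subsets if total(s)[1] <= budget),
    key=lambda s: total(s)[0],
)

print("one per subproblem:", [n for n, _, _ in single_choice_best], total(single_choice_best))
print("portfolio frame:   ", [n for n, _, _ in portfolio_best], total(portfolio_best))
```

With these numbers the one-per-subproblem frame reaches a total value of 8 at cost 8, while the portfolio frame is free to take both alternatives from the first subproblem and reaches a value of 11 at the same cost.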

A key question in PDA is therefore what alternatives can be meaningfully analyzed as belonging to the “same” portfolio. While PDA methods do not impose inherent constraints on what alternatives can be analyzed together, there are nevertheless considerations which suggest that some alternatives can be more meaningfully treated as a portfolio:

  • when the alternatives consume resources from the same shared pool
  • when the alternatives are of the same “size” (measured, e.g., in terms of cost or of the characteristics of anticipated consequences)
  • when the future performance of alternatives is contingent on decisions about what other alternatives are selected, or
  • when the consideration of alternatives together as part of the same portfolio seems justified by shared responsibilities in organizational decision making.

The fact that there are more alternatives in portfolio choice also suggests that the stakes may be higher than in single choice problems. Thus, the adoption of a systematic PDA approach may lead to particularly substantial improvements in the attainment of desired consequences.

But apart from the actual decision recommendations, there are other rationales that can be put forth in favor of PDA-assisted decision processes. For example, PDA enhances the transparency of decision making, because the structure of the decision process can be communicated to stakeholders and the process leaves an auditable trail of the evaluation of alternatives with regard to the relevant criteria. This, in turn, is likely to enhance the efficiency of later implementation phases and the accountability of decision makers.

Evolution of Portfolio Decision Analysis (FYI)

  • financial portfolio optimization
  • capital budgeting models
  • quantitative models for project selection
  • decision analysis
  • from decision analysis to portfolio decision analysis

Embedding PDA in organizational decision making (FYI)

  • embedding PDA in organizational decision making
  • extending PDA theory, methods and tools
  • expanding the PDA knowledge base