# Technique of Order Preference by Similarity to the Ideal Solution

In 1981, Hwang and Yoon [HwangYoon1981] introduced the Technique of Order Preference by Similarity to the Ideal Solution (TOPSIS), a multicriteria decision analysis method based on comparing the relative “distances” of alternatives from a theoretical best solution and a theoretical worst solution. The optimal alternative has the shortest geometric distance from the best, or positive ideal, solution and the longest geometric distance from the worst, or negative ideal, solution. The method is a compensatory aggregation that compares a set of alternatives by identifying weights for each criterion, normalizing the scores for each criterion, and calculating the distance between each alternative and the theoretical ideal alternative based on the best score in each criterion. TOPSIS requires that the criteria be monotonically increasing or decreasing. Normalization is usually required because the criteria often have incompatible dimensions. Compensatory methods allow trade-offs between criteria: a poor result in one criterion can be offset by a good result in another. This compensation provides a more realistic form of modeling than non-compensatory methods, which often include or exclude alternatives based on strict cut-off values.

A 2012 survey by Behzadian et al.⁴ finds the main areas of application of TOPSIS include

• Supply Chain Management and Logistics,
• Design, Engineering and Manufacturing Systems,
• Business and Marketing Management,
• Health, Safety and Environment Management,
• Human Resources Management,
• Energy Management,
• Chemical Engineering and
• Water Resources Management.

We begin with a brief discussion of the framework of TOPSIS as a method of decomposing a problem into sub-problems. Typically, a decision maker must choose from many alternatives each having a set of attributes or characteristics that can be measured subjectively or objectively. The attributes can relate to any tangible or intangible aspect of the decision problem. Attributes can be

⁴Behzadian, Khanmohammadi, Yazdani, and Ignatius, “A state-of-the-art survey of TOPSIS applications,” Expert Systems with Applications, 39 (2012): 13051–13069.

carefully measured or roughly estimated, or be well or poorly understood. Basically, anything that applies to the decision at hand can be used in the TOPSIS process.

Methodology

The TOPSIS process is carried out as follows.

Step 1. Create an evaluation matrix $X = [x_{ij}]$ consisting of $m$ alternatives and $n$ criteria, where $x_{ij}$ is alternative $i$’s value for criterion $j$.

Step 2. Normalize $X$ to form $R = [r_{ij}]_{m \times n}$ using the vector normalization
$$r_{ij} = \frac{x_{ij}}{\sqrt{\sum_{k=1}^{m} x_{kj}^2}}$$
for $i = 1..m$ and $j = 1..n$.

Step 3. Calculate the weighted normalized decision matrix $T$. The weights must total 1 (100%), and can come either from the decision maker directly or from a computation such as the eigenvector of a pairwise comparison matrix using Saaty’s nine-point scale. $T$ is given by
$$T = [t_{ij}] = [w_j \, r_{ij}],$$
i.e., multiply each column of $R$ by its criterion’s weight $w_j$.

Step 4. Determine each criterion’s best alternative $A_b$ and worst alternative $A_w$. Examine each criterion’s column and select the largest and smallest values. If the criterion’s values imply larger is better (e.g., profit), the best value is the largest; if the values imply smaller is better (such as cost), the best value is the smallest. (Whenever possible, define all criteria in terms of positive impacts.) Separate the index set of the criteria into two classes:
$$J_{+} = \{\, j : \text{larger values of criterion } j \text{ are better} \,\} \quad \text{and} \quad J_{-} = \{\, j : \text{smaller values of criterion } j \text{ are better} \,\}.$$
Now define the best, the ideal positive alternative, as
$$A_b = \{\, \max_i t_{ij} \mid j \in J_{+} \,\} \cup \{\, \min_i t_{ij} \mid j \in J_{-} \,\} = \{\, t_{bj} : j = 1..n \,\},$$
and the worst, the ideal negative alternative, as
$$A_w = \{\, \min_i t_{ij} \mid j \in J_{+} \,\} \cup \{\, \max_i t_{ij} \mid j \in J_{-} \,\} = \{\, t_{wj} : j = 1..n \,\}.$$

Step 5. Calculate the Euclidean distances between each alternative and the ideal positive alternative,
$$d_{ib} = \sqrt{\sum_{j=1}^{n} (t_{ij} - t_{bj})^2},$$
then the ideal negative alternative,
$$d_{iw} = \sqrt{\sum_{j=1}^{n} (t_{ij} - t_{wj})^2}.$$

Step 6. Calculate each alternative’s similarity to the worst condition,
$$s_{iw} = \frac{d_{iw}}{d_{ib} + d_{iw}}.$$
Note that $0 \le s_{iw} \le 1$ for each $i$: $s_{iw} = 1$ exactly when the alternative coincides with the ideal positive alternative, and $s_{iw} = 0$ exactly when it coincides with the ideal negative alternative.

Step 7. Rank the alternatives by their $s_{iw}$ values, largest first.
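The seven steps above can be sketched directly in pure Python. The data, weights, and function name below are illustrative, not from the text:

```python
# A minimal sketch of Steps 1-7 of TOPSIS in pure Python.
# The matrix X, the weights, and the benefit flags are illustrative only.
import math

def topsis(X, weights, benefit):
    """Score alternatives; X is m x n, benefit[j] is True if larger is better."""
    m, n = len(X), len(X[0])
    # Step 2: vector-normalize each column.
    norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(m))) for j in range(n)]
    R = [[X[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Step 3: multiply each column by its criterion's weight.
    T = [[weights[j] * R[i][j] for j in range(n)] for i in range(m)]
    # Step 4: ideal best and worst value per criterion.
    best = [max(T[i][j] for i in range(m)) if benefit[j]
            else min(T[i][j] for i in range(m)) for j in range(n)]
    worst = [min(T[i][j] for i in range(m)) if benefit[j]
             else max(T[i][j] for i in range(m)) for j in range(n)]
    # Steps 5-6: Euclidean distances, then similarity to the worst condition.
    scores = []
    for i in range(m):
        d_best = math.sqrt(sum((T[i][j] - best[j]) ** 2 for j in range(n)))
        d_worst = math.sqrt(sum((T[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Illustrative data: 3 alternatives, criteria = (profit, cost).
X = [[250, 16], [200, 12], [300, 20]]
scores = topsis(X, weights=[0.6, 0.4], benefit=[True, False])
```

Step 7 is then a sort of the alternatives by `scores` in decreasing order.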

Normalization

Wojciech Sałabun presents four methods of normalization: three linear normalization methods and the vector method we used in Step 2. Each method has variants for “profit” (larger is better) and “cost” (smaller is better) criteria. Vector normalization, which uses nonlinear distances between single-dimension scores and ratios, should produce smoother trade-offs [HwangYoon1981].
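As a sketch, here are a linear (max) variant and the vector variant, each with a profit and a cost form. These are the common textbook forms; the exact variants Sałabun presents may differ in detail:

```python
# Sketch of two normalization variants, assuming the common textbook forms:
# linear (max): profit r = x/max, cost r = 1 - x/max
# vector: divide by the column's Euclidean norm, as in Step 2.
import math

def linear_max(column, profit=True):
    """Linear (max) normalization of one criterion's column."""
    hi = max(column)
    return [x / hi if profit else 1 - x / hi for x in column]

def vector_norm(column, profit=True):
    """Vector normalization of one criterion's column."""
    norm = math.sqrt(sum(x * x for x in column))
    r = [x / norm for x in column]
    return r if profit else [1 - v for v in r]

costs = [16, 12, 20]
print(linear_max(costs, profit=False))  # smaller cost -> larger score
```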

Strengths and Limitations

TOPSIS is based on the concept that the chosen alternative should have the shortest geometric distance from the positive ideal solution and the longest geometric distance from the negative ideal solution. See Figure 8.5.

FIGURE 8.5: TOPSIS with Two Criteria

Two main advantages of TOPSIS are its ease of use and ease of implementation. A standard spreadsheet can handle the computations, and deep mathematical expertise is not required.

The main weakness of TOPSIS is that subjectivity in setting criteria and weights can inordinately influence the rankings produced. As always, sensitivity analysis is a must.

Sensitivity Analysis

Sensitivity analysis is essential to good modeling. The criterion weights are the main target of sensitivity analysis to determine how they affect the final ranking. The same procedures discussed for AHP are all useful for TOPSIS. We will again use Equation 8.4 (pg. 364) to adjust weights in our sensitivity analysis [AlinezhadAmini2011].
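The weight adjustment can be sketched as follows, using the proportional rescaling scheme of Alinezhad and Amini: change one criterion's weight, then scale the remaining weights so the total stays at 1. (This is an assumption about the form of Equation 8.4, which is not reproduced here; the text's equation may differ in detail.)

```python
# Sketch of the weight-adjustment step for sensitivity analysis, assuming
# the proportional rescaling scheme of Alinezhad and Amini: set weight p to
# a new value and scale the other weights so the total remains 1.
def adjust_weights(weights, p, new_wp):
    """Return a new weight vector with weights[p] = new_wp, others rescaled."""
    scale = (1 - new_wp) / (1 - weights[p])
    return [new_wp if j == p else w * scale for j, w in enumerate(weights)]

# Illustrative example: lower the first weight from 0.5 to 0.45.
w = adjust_weights([0.5, 0.3, 0.2], p=0, new_wp=0.45)
```

Re-running TOPSIS with each adjusted weight vector, and watching for rank reversals, is the sensitivity procedure used in the examples below.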

Examples using TOPSIS

We’ll revisit examples from AHP and SAW so as to compare results, method efficacy, and computational complexity.

Example 8.8. Selecting a Car Redux.

The decision maker’s weights used for AHP will be used here, slightly modified: use the same pairwise comparison matrix (PCM), but with cost not inverted. The input data for the alternatives must be in the same order as the prioritized criteria. Use the TOPSIS program from the book’s TbMv2 package. The ranked order of alternatives is: Camry (0.9087), Sonata (0.8098), Fusion (0.7159), Prius (0.6156), Leaf (0.1766), and Volt (0.06138). How does this compare to the AHP rankings?

To begin sensitivity analysis, reduce the weight of cost in steps of 0.05, modifying the other weights linearly to keep $\sum w_i = 1$, until cost is overtaken as the highest weighted criterion. The rank-ordering of the alternatives is stable until the 10th step, where some alternatives switch places.

Modifying the other weights and determining the results’ sensitivity to those changes is left to the exercises.

Example 8.9. The Kite Network Redux.

Revisit analyzing the Kite Network, this time with TOPSIS, to find the main influencers in the network.

Use the same four criteria as Example 8.7 (pg. 370):

• 1. Total Centrality (TC)
• 2. Betweenness (BTW)
• 3. Eigenvector Centrality (EC)
• 4. Closeness Centrality (CC)

Use the same weights as in AHP, which have a good CR of 0.01. To begin sensitivity analysis, change the weight of the largest criterion, Total Centrality, adjusting the other weights linearly to keep $\sum w_i = 1$. The plot of the results from sensitivity analysis shows that two sets of alternatives change place. This shift is a significant change and again emphasizes the importance of sensitivity analysis.

Exercises

• 1. Redo Section 8.3’s Exercises (pg. 359) using TOPSIS. Compare with your previous results using SAW and using AHP.
• 2. In each of the problems above, perform sensitivity analysis by changing the weight of your dominant criterion until it is no longer the highest. Did the change affect your rankings?
• 3. In each of the problems above, find break points, if any exist, for the weights.
• 4. Suppose the dominant criterion has no break point. What does this indicate about the sensitivity of the solution to that criterion’s weight?
• 5. Suppose the least weighted criterion has no break point. Can this criterion be eliminated without affecting the rankings?

Projects

Project 1. Write a program using the technology of your choice to implement:

• (a) SAW,
• (b) AHP, and
• (c) TOPSIS.

Project 2. Enhance your program to perform sensitivity analysis.

Project 3. Perform and discuss a comparative analysis using the Kite network and each MADM method.

• W. Sałabun, “Normalization of attribute values in TOPSIS method,” Chapter 4 in Nowe Trendy w Naukach Inżynieryjnych, CREATIVETIME, 2012.
