Methodology for Prioritizing Asset Management Activities for Complex Asset Portfolios

Abstract

Given climate change, aging assets and increasing demands for performance and profitability, it has become critical for organizations with complex asset portfolios to prioritize asset management investments. This research provides a methodological framework to support this decision-making. The framework takes into account the presence of complex systems and the need to reduce the uncertainty associated with these systems. A case study from an electric utility validates the framework, showing that it effectively directs resources to the most critical systems and equipment. The AHP (Analytic Hierarchy Process) and the BWM (Best-Worst Method), each combined with the WSM (Weighted Sum Method), are compared in terms of their efficiency in weighting criteria and assessing the impact of projects. Results show that AHP is the more effective method for weighting criteria so as to align investments with the strategic objectives of the organization, while taking the inherent uncertainties of complex systems into account.

Share and Cite:

Biard, G., Nour, G. A., & Komljenovic, D. (2025). Methodology for Prioritizing Asset Management Activities for Complex Asset Portfolios. American Journal of Industrial and Business Management, 15, 592-609. doi: 10.4236/ajibm.2025.154028.

1. Introduction

In recent years, organizations managing large fleets of assets have faced several unusual challenges, including aging infrastructure, labour shortages, changing working practices, disruption of the global economy and inflation. Unusual weather phenomena have also highlighted the vulnerabilities of infrastructure, networks and industries in general. For companies with large fleets of assets, the consequences of these challenges are undeniable. Indeed, the challenge of aging assets is even greater because the related risks affect a larger number of elements. The investments required to meet these challenges are therefore substantial. In most cases, the financial and human resources available are limited and cannot keep pace with this increase. It then becomes necessary to optimize the allocation of resources according to their impact on the organization’s overall medium- and long-term objectives. However, optimizing the allocation of resources is a challenge for companies whose asset portfolios span multiple categories and technologies. In this type of organization, there are several heterogeneous asset categories with different maintenance and replacement strategies, which raises the issue of dependency in terms of performance and resource allocation. In this context, advanced analysis methods must be used for decision-making and investment prioritization, as conventional performance measures are incompatible across all assets (Petchrompo & Parlikad, 2019).

Companies in the electricity generation, transmission and distribution sector fall into this category. The current context imposes several challenges on these organizations. Firstly, the aging of power grid assets will be amplified by climate change, which also increases the risk of extreme weather events with impacts on network reliability (Khaliq, Mahmood, & Das, 2015). Secondly, the migration of businesses towards Industry 4.0 technological tools, as well as the electrification of transport, will affect customer demand for electricity and impose more stringent grid reliability requirements (Mamun & Islam, 2016). To address these challenges, electricity generation, transmission and distribution networks require optimized asset management interventions in order to meet growing demands in terms of electricity consumption, reliability, environmental protection, etc. Power grids are also recognized as large complex systems given the set of interrelationships between their component parts (Mahmood, Kausar, Sarjoughian, Malik, & Riaz, 2019; Xu, Jia, & He, 2010). Consequently, prioritizing asset management investments in utilities is a challenge and must attempt to limit the effect of uncertainty.

Furthermore, efficient Asset Management helps organizations to align with sustainable development goals by providing a structured approach to managing assets that optimizes their value throughout their lifecycle. In fact, by incorporating social responsibility elements into the decision-making process, Asset Management ensures that these goals are part of the success criteria. This alignment helps organizations contribute to sustainable development (ISO/TC 251 WG3, 2018).

The question then becomes: how do we prioritize resource sharing among asset management investment projects to achieve the various objectives of organizations? To answer this question, the main objective of this research is to develop a methodological framework to support asset management investment decision-making in a complex system. The research specifically targets electricity industries.

In addition, the proposed methodological framework considers factors relating to the emerging context in which organizations holding multi-class portfolios operate. This context includes, but is not limited to:

  • The presence of complex systems and the need to reduce related uncertainty by developing robust methodologies for prioritizing investment projects;

  • Interdependencies between limited resources;

  • The multiplicity of distinct objectives to be met.

A case study of a company that produces, transmits and distributes electricity is used to validate the proposed methodological framework for prioritizing projects. The implementation steps are also presented. The results of the case study confirm that the proposed methodological framework optimizes resource allocation to the most critical activities in order to achieve corporate objectives.

2. Literature Review

In complex systems, there is a significant level of uncertainty regarding decision-making (Dezfuli, Stamatelatos, Maggio, Everett, & Youngblood, 2010). Complex systems are characterized by their unintuitive behaviour and nonlinear dynamics. A complex system is defined not just by the many elements that make it up, but also by their interrelationships. The overall performance of the system therefore reflects the unpredictable outcome of the interactions between the constituent elements, rather than the sum of each element’s individual outcomes (Mahmood, Kausar, Sarjoughian, Malik, & Riaz, 2019). Therefore, asset management decision-making with regard to complex systems must consider the notion of uncertainty. Three types of uncertainties (EPRI, 2006) apply in the context of this research:

  • Parametric uncertainty: Related to failures or random events. We know these events will happen and the rate at which they happen, but we do not know precisely when they will happen.

  • Epistemic uncertainty: Related to the lack of a sufficient quality or quantity of data, the limitations of analytical or modelling methods, and the level of knowledge of the phenomena under study. In the case of this type of uncertainty, research and development projects aim to mitigate its impact and improve the quality of knowledge, data and methods.

  • Completeness uncertainty: Related to scope limitations and unknown (and therefore unrepresented) elements.

In this context, Catrinu and Nordgård (Catrinu & Nordgård, 2011) developed a methodology for managing risks in the face of uncertainty and security measures. They used multi-criteria analysis to support asset management decision-making in electric power distribution systems by prioritizing asset maintenance and renewal. Multi-criteria analysis makes it possible to overcome the limitation of using risk-based methods. Risk-based methods require knowledge of probabilities or frequencies. Since data are not always available in sufficient quantity and quality, multi-criteria analysis methods supported by an expert committee can be used.

Furthermore, multi-criteria analysis methods can help bring together all elements of corporate objectives and prioritize the projects that best fit them. For example, authors propose the use of multi-criteria analysis to align asset management strategies with organizations’ overall objectives. Publications on electric utilities demonstrate significant gains in grid reliability (Soares, Abaide, & Bernardon, 2014).

In addition, traditional performance measures are not compatible across all systems in multi-class asset portfolios, so multi-criteria analysis must be used. In this regard, authors (Soares, Abaide, & Bernardon, 2014) have studied the efficient use of resources to improve the performance of electric power distribution networks in Brazil, proposing a methodology based on the combination of two multi-criteria analysis methods (AHP and PROMETHEE) to prioritize investment projects. The results show improved network reliability.

However, while multi-criteria decision-making methods can address uncertainty, they do not address the full range of issues associated with complex systems (Komljenovic, Abdul-Nour, & Boudreau, 2019a). Multi-criteria analysis methods are subject to sources of uncertainty that can have significant effects on the results. Possible uncertainties, in addition to differences in methods used, include variations due to the parameters chosen, i.e. thresholds or weights. To understand the effect of uncertainty, a sensitivity analysis must be performed. This involves quantifying the variability for which the outcome may be sensitive enough that it affects the final prioritization conclusion (EPRI, 2008).

In short, a number of publications exist with objectives similar to those of this research. Their results show improvements in grid reliability (Soares, Abaide, & Bernardon, 2014; Gómez, Fernández, Guillén, & Márquez, 2019) and availability, as well as lower asset lifecycle costs (Gómez, Fernández, Guillén, & Márquez, 2019; Cahyo, 2017). A literature review (Chong, Mohammed, Abdullah, & Rahman, 2019) identifies the methods used to prioritize maintenance activities only. The authors conclude that the Analytic Hierarchy Process (AHP) is one of the most commonly used multi-criteria analysis methods.

Research Contribution

The contribution of this research is a proposed methodology for prioritizing asset management activities that addresses a broader framework than the prioritization of maintenance activities only or a single equipment class. The scope of the study covers all asset management activities. The literature presents a limited number of methods to prioritize the activities of a multi-class asset portfolio throughout their lifecycle. Another contribution of this research is that the prioritization method includes, but is not limited to, achieving a reliability threshold and a minimal level of expenditure. It also compares the effectiveness of two multi-criteria analysis methods adapted to asset portfolios to prioritize investment projects in the context of complex systems and the presence of uncertainty.

In short, this research is characterized by its scope, but also by the integration of complex systems and many distinct and difficult-to-compare objectives of organizations. It is also set apart by its ability to reduce uncertainty in decision-making for complex systems, as well as its ability to adapt to available data and organizational objectives.

3. Method

According to the literature review, some authors (Biard, Abdul-Nour, Komljenovic, & Pelletier, 2022) have determined that AHP and BWM (Rezaei, 2015) are the most appropriate multi-criteria prioritization methods for weighting criteria. Therefore, these methods were compared in terms of efficiency. The BWM was identified as the most appropriate choice after the AHP for the study context. However, it appears less well suited to aggregating the opinions of several people. To address the limitation regarding the time required to apply the method to multiple projects, it is proposed to combine each of these two methods with the weighted sum method (WSM) to assess the impact of projects on the weighted criteria. The research therefore analyzes the effectiveness of the BWM and AHP methods, each combined separately with the WSM, for prioritizing asset management activities.

To do so, a questionnaire is used to collect data on the preferences of decision makers represented by the expert committee. Criteria assessment results are obtained using both the Delphi method and the geometric mean method; the authors mainly use the geometric mean method (Rivest, 2019). Examples of AHP application can be found in several publications (Komljenovic, 2008; US Nuclear Regulatory Commission, 2003). In the case of the Delphi method, the expert panel and decision makers first compare the criteria individually. The compiled results are then presented to them with the objective of reaching consensus over several meetings, consensus being reached when the whole committee agrees with the final assessments. With the geometric mean, compiled individual results are used without seeking consensus. Thus, the two methods can produce different results. Figure 1 shows the entire proposed methodology.
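The two aggregation routes can be illustrated with a short sketch. The snippet below shows only the geometric mean aggregation of individual pairwise judgments (the Delphi route is a facilitated consensus process, not an algorithm); the matrices and the function name `aggregate_judgments` are illustrative, not taken from the case study.

```python
import numpy as np

def aggregate_judgments(matrices):
    """Element-wise geometric mean of individual pairwise-comparison
    matrices, a standard way to aggregate the AHP judgments of a group."""
    stacked = np.array(matrices, dtype=float)            # (k experts, n, n)
    return stacked.prod(axis=0) ** (1.0 / len(matrices))

# Two hypothetical experts comparing three criteria on Saaty's scale
expert_a = [[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]]
expert_b = [[1, 1, 3], [1, 1, 1], [1/3, 1, 1]]
group = aggregate_judgments([expert_a, expert_b])
```

Because the geometric mean preserves reciprocity (the aggregate of a_ij stays the inverse of the aggregate of a_ji), the aggregated matrix can be fed directly into the AHP weighting step.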

The proposed method also includes assumptions. First, the methodology assumes the independence of the criteria. Second, when optimizing asset management activities, several types of decisions need to be made (Institute of Asset Management, 2015). These decision types can then be associated with each phase of the asset life cycle (British Standard Institute, 2015). The decision types, according to the phases of the asset life cycle, are presented in Table 1. It should be noted that this list is not exhaustive but represents the types of asset management activities targeted by the proposed prioritization method.

Table 1. Decision types according to asset lifecycle phases.

| Lifecycle phase | Type of decision |
| --- | --- |
| Acquisition | Demand response strategy |
| Use or operation | Strategy for operating and utilizing assets |
| Maintenance | Maintenance strategy |
| Renewal or disposal | End-of-life replacement strategy |

Figure 1. Proposed methodological framework.

Finally, the research hypothesis is that the multicriteria analysis method provides a solution to the problem of prioritizing asset management activities for complex systems. This is demonstrated by applying the proposed method to a case study in a company that generates, transmits and distributes electricity. The methodological framework is adaptable based on available data, organizational objectives and the views of decision makers.

4. Case Study

The case study is on an electric utility. The following sections detail the completion of steps 1 to 9, as well as the relevance of each step.

Step 1: Setting Up the Expert and Decision-Maker Committee

In the context of this research, data may not be available in sufficient quantity and quality to provide an accurate assessment of each of the criteria. There is also a significant level of uncertainty in decision-making for complex systems (Dezfuli, Stamatelatos, Maggio, Everett, & Youngblood, 2010). These two issues may be adequately addressed by setting up an expert panel and using the Delphi method to assess criteria and projects.

The committee must be composed of recognized internal experts and asset management decision makers for the utility under study. One of the committee’s roles is to validate and weight the criteria. Another role is to identify, as experts, the characteristics of investment projects. To assess the investment projects, committee members may consult other experts on the technical aspects of the projects.

Step 2: Identifying the Projects to Compare

To confirm that identified investment projects to be compared are already validated and deemed relevant, they must first have:

  • Undergone a techno-economic analysis confirming their validity;

  • Received an expert recommendation validating their relevance.

As part of the case study, projects selected to test the proposed methodology were chosen so as to cover all criteria. The objective of this exercise is to confirm the applicability of the proposed methodological framework to all the criteria. The projects were also approved by the expert committee.

Step 3: Defining Comparison Criteria

The following three steps were taken to define the comparison criteria:

1) A literature review was conducted to target the project comparison criteria used by companies in similar industries, in reference (Biard, Abdul-Nour, Komljenovic, & Pelletier, 2022). This literature review specifies the comparison criteria applicable to the electrical domain.

2) Asset management objectives were defined based on the organization’s strategic plan, and criteria to guide investment prioritization were determined.

3) A detailed analysis was performed of all criteria currently used for asset management activities by the utility under study. To do this, it was necessary to identify the criteria used for activities such as:

  • Asset management objectives

  • Asset management strategy

  • Prioritization of investment projects

  • Prioritization of asset management activities

  • Risk assessment

These lists of criteria were then combined and analyzed in order to identify the criteria selected and validated by the committee of experts and decision makers. If necessary, criteria can be grouped under similar themes to avoid redundancy or overvaluation of certain elements.

Step 4: Defining Criteria Calculation Parameters

To assess investment projects, parameters for calculating criteria had to be identified. Based on the criteria listed in the preceding step, we were able to identify the elements that correspond to the calculation parameters associated with these criteria. To do this, we:

1) Conducted a literature review to target calculation parameters or methods for calculating identified criteria. This literature review is presented in reference (Biard, Abdul-Nour, Komljenovic, & Pelletier, 2022).

2) Performed a detailed analysis of all parameters currently used by the business under study for each of the criteria identified by examining the following elements:

  • Calculation factors for prioritizing asset management activities

  • Calculation factors and risk assessment scales

Calculation parameters must be defined so that the same parameter is not counted under multiple comparison criteria. This is required to avoid overvaluation. The calculation parameters had to be validated by the expert committee.

Step 5: Defining the Assessment Scale by Criterion

Calculation parameters identified in the preceding step helped define the assessment scale by criterion. The assessment scale was used to rate investment projects and the extent to which they met identified criteria. Within this proposed methodological framework, each criterion is assessed on a 5-level ordinal scale. In order to respect the risk tolerance currently in effect in the organization, distinguishing factors were considered in terms of severity for the different criteria used in the organization. If the policy directions of the organization under study do not provide sufficient data, scales may be proposed by the expert committee or found in the literature. Assessment scales must also be validated by the expert committee.

Step 6: Weighting the Criteria

The AHP and BWM were used to weight comparison criteria based on the opinions of several decision makers. The comparison matrix was then used to identify the contribution of one criterion compared to another in achieving the overall objective.

The scale used for the comparison is defined by the author of the method (Saaty, 1987):

  • 1: Equally important

  • 3: A little more important

  • 5: More important

  • 7: Much more important

  • 9: Clearly the most important.

Despite the challenges involved in choosing a scale, Saaty’s scale remains the preferred option (Franek & Kresta, 2014). However, the intermediate levels (2, 4, 6, 8) were removed to enable more distinctive differences in the weights of the criteria.

For the AHP method, the results obtained using the Delphi method and the geometric mean are presented in Table 2 and Table 3, respectively.

Table 2. Pairwise assessment matrix based on Delphi method with AHP method.

| Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| C1 | 1 | 3 | 3 | 1/7 | 3 | 5 | 5 |
| C2 | 1/3 | 1 | 3 | 1/7 | 3 | 5 | 5 |
| C3 | 1/3 | 1/3 | 1 | 1/5 | 1 | 3 | 5 |
| C4 | 7 | 7 | 5 | 1 | 7 | 7 | 7 |
| C5 | 1/3 | 1/3 | 1 | 1/7 | 1 | 1 | 3 |
| C6 | 1/5 | 1/5 | 1/3 | 1/7 | 1 | 1 | 1/3 |
| C7 | 1/5 | 1/5 | 1/5 | 1/7 | 1/3 | 3 | 1 |

Table 3. AHP geometric mean pairwise assessment matrix.

| Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| C1 | 1.00 | 1.93 | 1.53 | 0.42 | 1.11 | 1.63 | 1.69 |
| C2 | 0.52 | 1.00 | 1.11 | 0.31 | 0.90 | 1.53 | 1.53 |
| C3 | 0.65 | 0.90 | 1.00 | 0.25 | 1.00 | 1.84 | 2.37 |
| C4 | 2.37 | 3.27 | 4.08 | 1.00 | 4.36 | 5.16 | 3.94 |
| C5 | 0.90 | 1.11 | 1.00 | 0.23 | 1.00 | 1.55 | 1.25 |
| C6 | 0.61 | 0.65 | 0.54 | 0.19 | 0.64 | 1.00 | 0.64 |
| C7 | 0.59 | 0.65 | 0.42 | 0.25 | 0.80 | 1.55 | 1.00 |

Once the preference evaluations have been carried out, the priority vector (normalized eigenvector) of each comparison matrix is generated. This vector represents the weight of each element at a given level of the structure in relation to the element above it in the hierarchy. The priority vector V is obtained using equation (1), where a_ij is the normalized evaluation value of criterion i with respect to criterion j and n is the number of criteria compared:

$$V_i = \frac{1}{n}\sum_{j=1}^{n} a_{ij} \qquad (1)$$
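Equation (1) corresponds to normalizing each column of the comparison matrix by its sum and then averaging each row. A minimal sketch, using an illustrative 3×3 matrix rather than the case-study data (the function name `priority_vector` is an assumption):

```python
import numpy as np

def priority_vector(A):
    """Approximate the priority vector of a pairwise comparison matrix:
    normalize each column by its sum, then average each row (equation (1))."""
    A = np.asarray(A, dtype=float)
    normalized = A / A.sum(axis=0)   # each column now sums to 1
    return normalized.mean(axis=1)   # row averages are the criterion weights

# Illustrative 3x3 comparison matrix on Saaty's scale
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = priority_vector(A)               # weights sum to 1
```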

Table 4 and Table 5 present the standardized matrices associated with the results of these two methods.

Table 4. Delphi standardized matrix and priority vector with AHP.

| Standardized Matrix | C1 | C2 | C3 | C4 | C5 | C6 | C7 | Priority Vector (PV) | Weighted sum vector (WV) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| C1 | 0.11 | 0.25 | 0.22 | 0.07 | 0.18 | 0.20 | 0.19 | 0.17 | 1.51 |
| C2 | 0.04 | 0.08 | 0.22 | 0.07 | 0.18 | 0.20 | 0.19 | 0.14 | 1.11 |
| C3 | 0.04 | 0.03 | 0.07 | 0.10 | 0.06 | 0.12 | 0.19 | 0.09 | 0.67 |
| C4 | 0.74 | 0.58 | 0.37 | 0.52 | 0.43 | 0.28 | 0.27 | 0.46 | 4.09 |
| C5 | 0.04 | 0.03 | 0.07 | 0.07 | 0.06 | 0.04 | 0.11 | 0.06 | 0.49 |
| C6 | 0.02 | 0.02 | 0.02 | 0.07 | 0.06 | 0.04 | 0.01 | 0.04 | 0.27 |
| C7 | 0.02 | 0.02 | 0.01 | 0.07 | 0.02 | 0.12 | 0.04 | 0.04 | 0.32 |

Table 5. Geometric mean standardized matrix and priority vector with AHP.

| Standardized Matrix | C1 | C2 | C3 | C4 | C5 | C6 | C7 | Priority Vector (PV) | Weighted sum vector (WV) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| C1 | 0.15 | 0.20 | 0.16 | 0.16 | 0.11 | 0.11 | 0.14 | 0.15 | 1.05 |
| C2 | 0.08 | 0.10 | 0.11 | 0.12 | 0.09 | 0.11 | 0.12 | 0.10 | 0.75 |
| C3 | 0.10 | 0.09 | 0.10 | 0.09 | 0.10 | 0.13 | 0.19 | 0.12 | 0.82 |
| C4 | 0.36 | 0.34 | 0.42 | 0.38 | 0.44 | 0.36 | 0.32 | 0.37 | 2.68 |
| C5 | 0.14 | 0.12 | 0.10 | 0.09 | 0.10 | 0.11 | 0.10 | 0.11 | 0.77 |
| C6 | 0.09 | 0.07 | 0.06 | 0.07 | 0.07 | 0.07 | 0.05 | 0.07 | 0.48 |
| C7 | 0.09 | 0.07 | 0.04 | 0.10 | 0.08 | 0.11 | 0.08 | 0.08 | 0.57 |

Then, to evaluate consistency, we first divide the weighted sum vector by the priority vector. The weighted sum vector is obtained by multiplying each column of the comparison matrix by the weight of the corresponding criterion and summing row-wise. The consistency ratio (CR) compares the consistency index (CI) with the random consistency index (RI) of the matrices. The consistency ratio must not exceed 10% (Saaty, 1987). The random consistency index relative to the number of criteria is presented in Table 6. Equations (2) and (3) are used to evaluate CI and CR, where λ is the average value of the weighted sum vector divided by the priority vector and n is the number of criteria compared:

$$CI = \frac{\lambda - n}{n - 1} \qquad (2)$$

$$CR = \frac{CI}{RI} \qquad (3)$$
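Equations (2) and (3) can be checked with a short sketch. The helper below takes a comparison matrix and its priority vector, computes λ as the average of the weighted sum vector divided element-wise by the priority vector, then CI and CR using the random index values of Table 6; the function name is an assumption.

```python
import numpy as np

# Random consistency index by number of criteria (Table 6)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def consistency_ratio(A, w):
    """CR per equations (2) and (3): lambda is the mean of the weighted
    sum vector (A @ w) divided element-wise by the priority vector w."""
    A = np.asarray(A, dtype=float)
    n = len(w)
    lam = (A @ w / w).mean()          # average consistency measure
    ci = (lam - n) / (n - 1)          # equation (2)
    return ci / RI[n]                 # equation (3)

# A perfectly consistent 3x3 matrix gives a CR close to 0
A = np.array([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
w = (A / A.sum(axis=0)).mean(axis=1)  # priority vector per equation (1)
cr = consistency_ratio(A, w)
```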

Table 6. Random consistency index.

| Number of criteria | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RI | 0 | 0 | 0.58 | 0.90 | 1.12 | 1.24 | 1.32 | 1.41 |

The consistency ratio for the Delphi method with AHP is 0.12, which exceeds the maximum consistency limit. The consistency ratio for the geometric mean method with AHP is 0.02. The geometric mean method thus provides a lower consistency ratio than the Delphi method and complies with the maximum limit.

For the BWM method, Table 7 shows the pairwise comparison of the best and worst criteria against the other criteria, and Table 8 shows the resulting weighting of the criteria. The assessment is based on the Delphi method and the geometric mean. The Excel solver (Rezaei, BWM Solvers, n.d.) developed by the method’s author is used to weight the criteria and calculate the consistency ratio.

Table 7. Pairwise comparison of best and worst criteria using Delphi method.

|  | Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Best | C4 | 7 | 7 | 5 | 1 | 7 | 7 | 7 |
| Worst | C7 | 1/5 | 1/5 | 1/5 | 1/7 | 1/3 | 1 | 1 |

Table 8. Weighting of criteria with the BWM method.

| Method | C1 | C2 | C3 | C4 | C5 | C6 | C7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Delphi | 0.09 | 0.09 | 0.12 | 0.47 | 0.09 | 0.05 | 0.09 |
| Geometric mean | 0.19 | 0.14 | 0.11 | 0.24 | 0.11 | 0.09 | 0.12 |

The evaluation of the consistency index is similar to that of the AHP method. That is, a consistency ratio (CR) compares the consistency index (CI) with the maximum matrix consistency index (CM), as per equation (4):

$$CR = \frac{CI}{CM} \qquad (4)$$

Using the solver facilitates the evaluation of the consistency index. The consistency ratio is a value between 0 and 1; the closer it is to 0, the more consistent the results. The solver (Rezaei, BWM Solvers, n.d.) also indicates a maximum ratio as a function of the number of criteria used, as presented in Table 9.

Table 9. Maximum consistency index by number of criteria.

| Number of criteria | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CM | 0 | 0.44 | 1.00 | 1.63 | 2.30 | 3.00 | 3.73 | 4.47 |

The consistency ratios obtained are 0.15 and 0.21 for the Delphi and geometric mean methods, respectively. With the BWM method, there is a slight inconsistency, since the lowest weight is not assigned to the “worst” criterion; this is confirmed by a higher consistency ratio. Although the expert committee agreed that the worst criterion is C7, the preference distribution shows that C6 has a lower weighting and is therefore the criterion that should have been identified as the least important. This negatively influences the level of consistency.
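For reference, BWM weights can also be reproduced outside the Excel solver. The sketch below uses the linear BWM model (minimize ξ subject to |w_B − a_Bj·w_j| ≤ ξ and |w_j − a_jW·w_W| ≤ ξ with the weights summing to 1), a common linearization of Rezaei’s original min-max model rather than the exact solver used in the study; it assumes SciPy is available and reads the others-to-worst vector as the reciprocal of the worst row of Table 7.

```python
import numpy as np
from scipy.optimize import linprog

def bwm_weights(best, worst, best_to_others, others_to_worst):
    """Linear BWM: minimize xi s.t. |w_best - a_Bj*w_j| <= xi,
    |w_j - a_jW*w_worst| <= xi, sum(w) = 1, w >= 0."""
    n = len(best_to_others)
    c = np.zeros(n + 1); c[-1] = 1.0           # objective: minimize xi
    A_ub, b_ub = [], []
    for j in range(n):
        row = np.zeros(n + 1)                  # w_best - a_Bj*w_j - xi <= 0
        row[best] += 1.0; row[j] -= best_to_others[j]; row[-1] = -1.0
        A_ub.append(row); b_ub.append(0.0)
        A_ub.append(np.append(-row[:-1], -1.0)); b_ub.append(0.0)
        row = np.zeros(n + 1)                  # w_j - a_jW*w_worst - xi <= 0
        row[j] += 1.0; row[worst] -= others_to_worst[j]; row[-1] = -1.0
        A_ub.append(row); b_ub.append(0.0)
        A_ub.append(np.append(-row[:-1], -1.0)); b_ub.append(0.0)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[np.append(np.ones(n), 0.0)], b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[:n], res.x[-1]                # weights, xi (consistency value)

# Delphi judgments from Table 7: best = C4 (index 3), worst = C7 (index 6)
w, xi = bwm_weights(best=3, worst=6,
                    best_to_others=[7, 7, 5, 1, 7, 7, 7],
                    others_to_worst=[5, 5, 5, 7, 3, 1, 1])
```

As in Table 8, the resulting weights put C4 well ahead of the other criteria; the exact values may differ slightly from the Excel solver since the optimization model is not identical.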

Step 7: Applying Weighted Sum Method

Once decision-maker preferences are aggregated and represented by weights (Step 6) and the assessment scale is set by criterion (Step 5), the weighted sum method is used to compare projects based on their assessed impact on each criterion. To do this, the final score for each alternative is calculated using equation (5), where m_i is the project’s assessed level on criterion i according to the scale developed, w_i is the weight of criterion i in achieving the overall objective, and c is the number of criteria.

$$\text{Project score} = \sum_{i=1}^{c} w_i m_i \qquad (5)$$

Thus, the score for each project is calculated by multiplying the weight of each criterion by the level at which the project was assessed on the corresponding scale. The criteria weights are those identified by the AHP method as well as by the BWM method (each with the Delphi method and the geometric mean).

The assessed level of each project against the identified scale is presented in Table 10. Project scores are presented in Table 11. For example, equation (6) details the score of project T1 with the AHP method using the geometric mean:

$$T_1\ \text{score} = (4 \times 0.15) + (2 \times 0.10) + (0 \times 0.12) + (4 \times 0.37) + (0 \times 0.11) + (0 \times 0.07) + (5 \times 0.08) = 2.68 \qquad (6)$$
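The worked example for T1 can be reproduced with a one-line weighted sum; the weights are the AHP geometric-mean priority vector (Table 5) and the ratings are project T1’s row in Table 10 (the function name is illustrative):

```python
import numpy as np

def wsm_score(weights, ratings):
    """Weighted sum method, equation (5): sum of w_i * m_i."""
    return float(np.dot(weights, ratings))

# AHP geometric-mean weights (Table 5) and project T1 ratings (Table 10)
weights_ahp_geo = [0.15, 0.10, 0.12, 0.37, 0.11, 0.07, 0.08]
t1_ratings = [4, 2, 0, 4, 0, 0, 5]
score = wsm_score(weights_ahp_geo, t1_ratings)   # 2.68, as in equation (6)
```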

Table 10. Project assessment by rating scale.

| ID | C1 | C2 | C3 | C4 | C5 | C6 | C7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| T1 | 4 | 2 | 0 | 4 | 0 | 0 | 5 |
| T2 | 4 | -4 | 0 | 4 | 0 | 1 | 1 |
| P1 | 1 | 0 | 0 | 4 | 2 | 0 | 1 |
| P2 | 1 | 0 | 0 | 4 | 1 | 3 | 1 |
| D1 | 0 | 5 | 3 | 2 | 0 | 0 | 1 |
| D2 | 0 | 0 | 0 | 3 | 5 | 4 | 1 |

Table 11. Project scores.

| ID | BWM—Geometric Mean | BWM—Delphi | AHP—Geometric Mean | AHP—Delphi |
| --- | --- | --- | --- | --- |
| T1 | 2.60 | 2.87 | 2.68 | 3.00 |
| T2 | 1.37 | 2.02 | 1.83 | 2.04 |
| P1 | 1.49 | 2.24 | 1.93 | 2.17 |
| P2 | 1.65 | 2.30 | 2.03 | 2.23 |
| D1 | 1.63 | 1.84 | 1.68 | 1.93 |
| D2 | 1.75 | 2.15 | 2.02 | 1.88 |

Step 8: Identifying Priority Projects for the Asset Management Plan

Once project scores are calculated, a hierarchy of projects in order of importance is obtained. This step identifies projects to be prioritized in the asset management plan. Projects with the highest scores contribute the most to the organization’s objectives. The resulting project prioritization is presented in Table 12. Project ranking varies based on the method chosen.

Table 12. Project priority hierarchy.

| BWM—Geometric Mean | BWM—Delphi | AHP—Geometric Mean | AHP—Delphi |
| --- | --- | --- | --- |
| T1 | T1 | T1 | T1 |
| D2 | P2 | P2 | P2 |
| P2 | P1 | D2 | P1 |
| D1 | D2 | P1 | T2 |
| P1 | T2 | T2 | D1 |
| T2 | D1 | D1 | D2 |

Once the project prioritization hierarchy has been completed, available resources determine the number of projects that will be completed. The asset management plan may contain as many projects as available resources permit. Expected performance levels can also help determine whether available resources are sufficient to achieve acceptable or optimal performance thresholds.

Step 9: Performing Sensitivity Analysis

To understand the effect of uncertainty, it is necessary to quantify the sensitivity of the final results. This step involves varying the weighting of the criteria to determine the weighting at which the preferred project (ranked first, highest score) is replaced by the second preferred project (ranked second).

To complete this step, the weightings of the criteria for which the second-ranked project is assessed higher than the highest-ranked project are iteratively increased. The weightings of the other criteria are then proportionally adjusted to keep the sum of all weightings at 1. For example, for BWM with the Delphi method, project T1 is ranked first and project P2 is ranked second. Based on the data in Table 10, project P2 receives a higher assessment than project T1 for criteria C5 and C6. For project P2 to exceed the project T1 score, the following changes must be made (independently) to these two criteria:

  • C5: an increase of 0.15 points for a total weight of 0.24, a 267% increase over the current weight of 0.09;

  • C6: an increase of 0.15 points for a total weight of 0.20, a 390% increase over the current weight of 0.05.
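The iterative search described above can be sketched as follows. The weights are the BWM-Delphi values from Table 8 and the ratings are the T1 and P2 rows of Table 10; the function name and step size are assumptions.

```python
import numpy as np

def weight_threshold(weights, leader, runner_up, crit, step=0.001):
    """Raise the weight of one criterion (rescaling the others so all
    weights still sum to 1) until the runner-up's weighted sum score
    overtakes the leader's; return the switching weight, or None."""
    weights = np.asarray(weights, dtype=float)
    others = weights.sum() - weights[crit]
    for w_c in np.arange(weights[crit], 1.0, step):
        w = weights * (1.0 - w_c) / others   # proportional rescaling
        w[crit] = w_c
        if np.dot(w, runner_up) > np.dot(w, leader):
            return round(float(w_c), 3)
    return None

# BWM-Delphi weights (Table 8); T1 leads, P2 is runner-up (Table 10)
w_bwm_delphi = [0.09, 0.09, 0.12, 0.47, 0.09, 0.05, 0.09]
t1 = [4, 2, 0, 4, 0, 0, 5]
p2 = [1, 0, 0, 4, 1, 3, 1]
threshold_c6 = weight_threshold(w_bwm_delphi, t1, p2, crit=5)
# ~0.20, matching the total weight of 0.20 reported for C6
```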

The sensitivity analysis of all the methods is presented in Table 13. It was noted that weightings need to more than double in order to shift project priorities. The results are therefore considered robust.

Table 13. Sensitivity analysis by method.

| Method | Criterion | Required Increase | Original Weighting | Difference |
| --- | --- | --- | --- | --- |
| AHP—Delphi | C5 | 0.26 | 0.04 | 750% |
| AHP—Delphi | C6 | 0.23 | 0.06 | 483% |
| AHP—Geometric Mean | C5 | 0.14 | 0.07 | 343% |
| AHP—Geometric Mean | C6 | 0.13 | 0.11 | 218% |
| BWM—Delphi | C5 | 0.15 | 0.05 | 400% |
| BWM—Delphi | C6 | 0.15 | 0.09 | 267% |
| BWM—Geometric Mean | C5 | 0.22 | 0.09 | 344% |
| BWM—Geometric Mean | C6 | 0.13 | 0.11 | 218% |

5. Discussion

The purpose of this section is to assess the robustness of the results and compare methods. In order to compare the effectiveness of the methods, we first need to assess the consistency ratio. The consistency ratios obtained by each method are as follows:

  • AHP—geometric mean: 2%

  • AHP—Delphi: 12%

  • BWM—geometric mean: 21%

  • BWM—Delphi: 15%

For the BWM method, the solver (Rezaei, BWM Solvers, n.d.) indicates that a ratio below 31.44% satisfies the consistency requirement; the BWM consistency ratios are therefore within the limit. For the AHP, the geometric mean is the only variant whose result meets the consistency ratio criterion of less than 10%. To obtain a lower consistency ratio, the intermediate levels (2, 4, 6, 8) must be used in the pairwise comparison of criteria. These intermediate levels make it possible to capture the slight difference specified by the expert committee for criterion C2 compared to criterion C3. Figure 2 shows the weightings of the criteria by method and their variants.

Figure 2. AHP and BWM method criteria weightings.

Based on the figure above, we see that in all cases C4 is the most important criterion. Furthermore, the results show that the criterion with the lowest weight is C6. This weighting confirms the level of disagreement among decision makers regarding the “worst” criterion determined by the BWM method.

Indeed, in the case of the BWM method, even though committee members identified a "best" and a "worst" criterion, the assessments point to a different "worst" criterion. When the method was used with the expert committee, decision makers had difficulty choosing between two criteria as the "worst" criterion; this is reflected in the consistency level as well. The BWM method also yields similar weightings for several criteria, which limits the ability to distinguish between project assessments and priorities.
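For reference, the BWM derives weights from only two judgment vectors, best-to-others and others-to-worst. The following is a minimal sketch of Rezaei's linear BWM model as a small linear program (using SciPy; the judgment vectors are hypothetical, not the committee's).

```python
import numpy as np
from scipy.optimize import linprog

def bwm_weights(best_to_others, others_to_worst, best, worst):
    """Linear BWM model: minimize xi subject to
    |w[best] - a_Bj * w[j]| <= xi and |w[j] - a_jW * w[worst]| <= xi,
    with the weights summing to 1. Variables: w[0..n-1] and xi."""
    n = len(best_to_others)
    c = np.zeros(n + 1)
    c[-1] = 1.0                                  # objective: minimize xi
    A_ub, b_ub = [], []
    for j in range(n):
        for sign in (1.0, -1.0):
            # +/-(w[best] - a_Bj * w[j]) - xi <= 0
            row = np.zeros(n + 1)
            row[best] += sign
            row[j] -= sign * best_to_others[j]
            row[-1] = -1.0
            A_ub.append(row)
            b_ub.append(0.0)
            # +/-(w[j] - a_jW * w[worst]) - xi <= 0
            row = np.zeros(n + 1)
            row[j] += sign
            row[worst] -= sign * others_to_worst[j]
            row[-1] = -1.0
            A_ub.append(row)
            b_ub.append(0.0)
    A_eq = [[1.0] * n + [0.0]]                   # weights sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[:n], res.x[-1]

# Hypothetical judgments for 3 criteria: C1 is best, C3 is worst
w, xi = bwm_weights(best_to_others=[1, 2, 4],
                    others_to_worst=[4, 2, 1],
                    best=0, worst=2)
```

The optimal xi doubles as a consistency indicator: it is zero only when the two judgment vectors are perfectly coherent, which is precisely what the committee's hesitation over the "worst" criterion undermines.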

The choice of method evidently influences project prioritization. For the AHP, prioritization of the first two projects remains the same regardless of the variant and even despite the limited consistency of the Delphi method. For the BWM, the prioritization differs by variant despite the acceptable consistency of the two variants. The prioritization is also different from that of the AHP method. So, results obtained using the BWM method seem inconsistent.

In short, although the BWM method saves time when assessing criteria, its results do not ensure an accurate and distinct representation of the decision makers' preferences. Moreover, the time saved in the pairwise assessment of criteria is offset by the longer Delphi discussion needed to reach consensus on the "best" and, especially, the "worst" criterion.

For the AHP, in addition to quickly yielding a low consistency ratio, the geometric mean takes much less time to compare criteria than the Delphi method, since it avoids the many meetings the Delphi method requires. Despite those meetings, Delphi consistency is not guaranteed: a second round of adjustments is needed to reach an acceptable consistency ratio, which still remains higher than that of the geometric mean.

Thus, based on these observations, the AHP with each decision maker's judgments compiled by geometric mean is the preferable method for prioritizing asset management activities for complex asset portfolios. It is also the method whose results best match the project prioritization expectations of both the expert committee and the decision-making committee, while remaining simple to apply.

However, to ensure the effectiveness of the method and a sufficient level of precision, clear and documented explanations and definitions must be provided for each criterion. A preliminary meeting with the expert committee is also required to ensure a common and accurate understanding of each criterion.

6. Conclusions

This research presents only a high-level methodological framework. The proposal defines an initial asset management decision support model for a multi-class asset portfolio. The proposed methodological framework is therefore intended to define a structure for combining alternative approaches and methodologies in this single decision support model in a follow-up phase. The goal of this modular design is to allow for greater adaptability in future research. Additional research can therefore improve the proposed decision support method.

Regarding limitations related to the scope and hypotheses of the research, further research should incorporate the notion of dependency in developing criteria (Komljenovic, Delourme, & Lavoie, 2019b). In addition, this research includes projects already proposed and documented by the experts. Further research could aim to develop a tool to identify investment projects prior to expert technical and economic analysis. In this sense, modelling and simulation could identify projects based on the probability of asset failure given their condition, as well as the potential impact of a failure. To do this, expected performance levels and risk tolerance thresholds must first be set in order to identify assets that do not meet, or will not meet, these thresholds in the short, medium or long term.

Along the same lines, additional research could add system modelling to assess the impact of choosing one investment project over another on a given corporate objective. Alternative methods of assessing criteria, such as multiple correspondence analysis or frequency tables, could also be compared in terms of their effectiveness within the proposed framework. The effectiveness of data envelopment analysis (DEA) for setting weights could likewise be considered.

Furthermore, the assessment of performance parameters for each alternative has been shown to be uncertain when making decisions in the context of complex systems. Further research could expand our understanding of the impact of uncertainty on decision-making. Methods that mitigate the effects of uncertainty for each criterion (and overall) must also be assessed. Considering current technological developments, each criterion could be supported by a complementary analysis using Industry 4.0 tools and, more specifically, AI for big data processing. The methodological framework is designed to allow the inclusion of inputs from technological developments and the results of artificial intelligence algorithms in the evaluation of criteria, but this has not been clearly demonstrated. It would therefore be interesting to investigate its effectiveness, and the gains associated with including this type of input in the proposed framework, mainly for criteria evaluation.

Finally, it would also be interesting to assess the effectiveness of the proposed methodology in other business areas or sectors holding asset portfolios of multiple categories.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Biard, G., Abdul-Nour, G., Komljenovic, D., & Pelletier, S. (2022). Multi-Criteria Prioritization of Asset Management Investments in the Power Industry. IFAC-PapersOnLine, 55, 1804-1809.
https://doi.org/10.1016/j.ifacol.2022.09.660
[2] British Standards Institution (2015). PAS 55-2:2008: Guidelines for the Application of PAS 55-1. British Standards Institution.
[3] Cahyo, W. N. (2017). A Modelling Approach for Maintenance Resources-Provisioning Policies in a Wind Farm Maintenance System. In 2017 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM) (pp. 1261-1265). IEEE.
https://doi.org/10.1109/ieem.2017.8290095
[4] Catrinu, M. D., & Nordgård, D. E. (2011). Integrating Risk Analysis and Multi-Criteria Decision Support under Uncertainty in Electricity Distribution System Asset Management. Reliability Engineering & System Safety, 96, 663-670.
https://doi.org/10.1016/j.ress.2010.12.028
[5] Chong, A. K. W., Mohammed, A. H., Abdullah, M. N., & Rahman, M. S. A. (2019). Maintenance Prioritization—A Review on Factors and Methods. Journal of Facilities Management, 17, 18-39.
https://doi.org/10.1108/jfm-11-2017-0058
[6] Dezfuli, H., Stamatelatos, M., Maggio, G., Everett, C., & Youngblood, R. (2010). Risk-Informed Decision Making Handbook. NASA.
https://ntrs.nasa.gov/api/citations/20100021361/downloads/20100021361.pdf
[7] EPRI (2006). Guideline for the Treatment of Uncertainty in Risk-Informed Applications: Applications Guide. Electric Power Research Institute.
https://www.epri.com/research/products/000000000001013491
[8] EPRI (2008). Treatment of Parameter and Model Uncertainty for Probabilistic Risk Assessments. Electric Power Research Institute.
https://www.epri.com/research/products/1016737
[9] Franek, J., & Kresta, A. (2014). Judgment Scales and Consistency Measure in AHP. Procedia Economics and Finance, 12, 164-173.
https://doi.org/10.1016/s2212-5671(14)00332-3
[10] Gómez, J. F., Fernández, P. M., Guillén, A. J., & Márquez, A. C. (2019). Risk-Based Criticality for Network Utilities Asset Management. IEEE Transactions on Network and Service Management, 16, 755-768.
https://doi.org/10.1109/tnsm.2019.2903985
[11] Institute of Asset Management (2015). Asset Management—An Anatomy. Institute of Asset Management.
[12] ISO/TC 251 WG3 (2018). Asset Management Achieving the UN Sustainable Development Goals.
[13] Khaliq, S. A., Mahmood, M. N., & Das, N. (2015). Towards a Best Practice Asset Management Framework for Electrical Power Distribution Organisations. In 2015 IEEE PES Asia-Pacific Power and Energy Engineering Conference (APPEEC) (pp. 1-5). IEEE.
https://doi.org/10.1109/appeec.2015.7381065
[14] Komljenovic, D. (2008). An Analysis to Determine Industry’s Preferred Option for an Initial Generic Reliability Database for Candu. In Canadian Nuclear Society (Ed.), 29th Annual Conference of the Canadian Nuclear Society and 32nd CNS/CNA Student Conference 2008. Canadian Nuclear Society.
https://doi.org/10.13140/2.1.2647.9360
[15] Komljenovic, D., Abdul-Nour, G., & Boudreau, J. F. (2019a). Risk-Informed Decision-Making in Asset Management as a Complex Adaptive System of Systems. International Journal of Strategic Engineering Asset Management, 3, 198-238.
https://doi.org/10.1504/ijseam.2019.108468
[16] Komljenovic, D., Delourme, B., & Lavoie, M. (2019b). Ice Storm Canada, January 1998, Case Study Series of Extreme Weather, Dynamic Resilience to Extreme Weather, Learning from Best Practice Examples. World Energy Council.
[17] Mahmood, I., Kausar, T., Sarjoughian, H. S., Malik, A. W., & Riaz, N. (2019). An Integrated Modeling, Simulation and Analysis Framework for Engineering Complex Systems. IEEE Access, 7, 67497-67514.
https://doi.org/10.1109/access.2019.2917652
[18] Mamun, K. A., & Islam, F. R. (2016). Reliability Evaluation of Power Network: A Case Study of Fiji Islands. In 2016 Australasian Universities Power Engineering Conference (AUPEC) (pp. 1-6). IEEE.
https://doi.org/10.1109/aupec.2016.7749359
[19] Petchrompo, S., & Parlikad, A. K. (2019). A Review of Asset Management Literature on Multi-Asset Systems. Reliability Engineering & System Safety, 181, 181-201.
https://doi.org/10.1016/j.ress.2018.09.009
[20] Rezaei, J. (2015). Best-Worst Multi-Criteria Decision-Making Method. Omega, 53, 49-57.
https://doi.org/10.1016/j.omega.2014.11.009
[21] Rezaei, J. (n.d.). BWM Solvers.
https://bestworstmethod.com/software/
[22] Rivest, R. (2019). Techniques de Simulation pour la Recherche sur le Perfectionnement de la Méthode AHP. HEC Montréal.
http://biblos.hec.ca/biblio/memoires/m2019a612764.pdf
[23] Saaty, R. W. (1987). The Analytic Hierarchy Process—What It Is and How It Is Used. Mathematical Modelling, 9, 161-176.
https://doi.org/10.1016/0270-0255(87)90473-8
[24] Soares, B. N., da Rosa Abaide, A., & Bernardon, D. (2014). Methodology for Prioritizing Investments in Distribution Networks Electricity Focusing on Operational Efficiency and Regulatory Aspects. In 2014 49th International Universities Power Engineering Conference (UPEC) (pp. 1-6). IEEE.
https://doi.org/10.1109/upec.2014.6934727
[25] US Nuclear Regulatory Commission (2003). Formal Methods of Decision Analysis Applied to Prioritization of Research and Other Topics. US Nuclear Regulatory Commission.
[26] Xu, Q., Jia, X., & He, L. (2010). The Control of Distributed Generation System Using Multi-Agent System. In 2010 International Conference on Electronics and Information Engineering (pp. V1-30-V1-33). IEEE.
https://doi.org/10.1109/iceie.2010.5559832

Copyright © 2025 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.