Cultural Difference and Cognitive Biases as a Trigger of Critical Crashes or Disasters
—Evidence from Case Studies of Human Factors Analysis

Abstract

On the basis of analyses of past crashes and disasters, this paper clarifies how cultural differences and cognitive biases can become triggers of serious crashes or disasters. Heuristic-based biases such as confirmation bias, groupthink, and social loafing clearly appeared in the processes by which crashes or disasters broke out. Overconfidence-based biases such as the illusion of control, the planning fallacy, and optimistic bias are also ubiquitous on the route to a critical crash or disaster. Moreover, framing biases contribute to distorted decision making and can eventually become a main cause of a critical crash or disaster. Therefore, in addition to human factors and ergonomics approaches to designing man-machine systems, preventing and eliminating cognitive biases is indispensable for keeping serious crashes or disasters from occurring. Until now, the distortion of decision making has not been discussed from the perspective of cultural differences in ways of thinking. Like the variety of cognitive biases, cultural differences in behavior are expected to be important for understanding the root causes of critical crashes or disasters. Through case studies of critical crashes and disasters, we found that cultural difference distorted judgment. It was also demonstrated that considering cultural difference, as well as cognitive biases, is important for preventing irrational and biased decision making in safety management.

Citation: Murata, A. (2017) Cultural Difference and Cognitive Biases as a Trigger of Critical Crashes or Disasters—Evidence from Case Studies of Human Factors Analysis. Journal of Behavioral and Brain Science, 7, 399-415. doi: 10.4236/jbbs.2017.79029.

1. Introduction

Unlike traditional economics, behavioral economics commonly assumes bounded rationality [1] - [6] . Owing to bounded rationality, we generally cannot make decisions rationally and thus suffer from the cognitive biases pointed out by Kahneman [7] , Tversky and Kahneman [8] , and Kahneman and Tversky [9] . Our cognitive information processing is conducted either by System 1, which operates quickly, automatically, and intuitively with little or no effort, or by System 2, which requires effortful, demanding, and deliberate mental activity. Although heuristic approaches are based on System 1, such approaches constantly expose us to cognitive biases.

One of the major causes of the Challenger space shuttle disaster [10] [11] is regarded to be groupthink [12] . Although the manufacturer recognized the risk of O-ring malfunction at severely cold temperatures, it agreed to the launch of the Challenger space shuttle because of the illusion of unanimity. After a serious crash occurs, people tend to overestimate the probability of such an event recurring. We hesitate to board a plane immediately after a serious crash because we overestimate the likelihood of a fatal crash (hindsight bias). Such a bias hinders the objective investigation of a crash.

Murata and Nakamura [13] and Murata, Nakamura, and Karwowski [14] discussed how cognitive biases distort decision making and induce serious crashes or disasters. However, these studies did not discuss the importance of understanding decision making across different cultures and how this leads to distorted decision making. It is possible that cultural differences in thinking, behavior, or decision making, as well as a variety of cognitive biases, contribute to the root causes of critical collisions, crashes, or disasters. Therefore, it is necessary to gain insight into how cultural differences, as well as well-known cognitive biases such as the optimistic, outcome, and confirmation biases, distort judgment and eventually lead to mistaken behavior [15] . The aim of this study was to demonstrate, through case studies of serious crashes and disasters, how the potential risk posed by cultural difference leads to biased cognitive judgment and induces a critical crash or disaster, so that cultural difference is recognized as an important risk factor in safety management (detection of a variety of safety-threatening errors, extraction of lessons from errors, planning of countermeasures, and prevention of serious crashes, collisions, or disasters).

2. Cognitive Biases as a Trigger of Critical Crashes or Disasters

Before proceeding to how cultural differences distort judgment and induce apparently irrational behavior, it is briefly shown how cognitive biases distort decision making. As shown in Figure 1, it is hypothesized that distorted decision making or behavior caused by cognitive biases leads to human errors in judgment, decision making, and behavior and, in the worst case, eventually triggers serious crashes or disasters if commitment to the biased judgment, decision making, and behavior is escalated.

Figure 1. Relational model between cognitive biases and unsafe behaviors or accidents [15].

Figure 2. Mechanism of cognitive biases due to heuristics, overconfidence, and framing [15].

Based on Bazerman and Moore [3] , it is assumed that heuristics such as availability, representativeness, confirmation, or affect give rise to biases such as the confirmation bias and anchoring and adjustment. In Figure 2, not only heuristics but also overconfidence and framing are shown as causes of cognitive biases. Moreover, it is hypothesized that our bounded awareness and uncertain situations form the basis of heuristics, overconfidence, and framing. Owing to such bounded rationality, we humans cannot always behave rationally. We frequently behave irrationally and are in most cases unaware of how, and to what extent, these irrational tendencies influence us. Such tendencies are bound to distort our decisions and, in the worst case, become a trigger of serious crashes or disasters, as outlined in Figure 1 and Figure 2. Without taking our bounded rationality (irrationality) into consideration, we can neither approach the prevention of serious crashes or disasters nor analyze their root (genuine) causes.

As shown in Figure 3, it must be explored, on the basis of case studies of such events, how cognitive biases distort decision making, induce preconceptions, and become triggers of critical collisions, crashes, or disasters. To do this, we must further clarify why we suffer from cognitive biases, what types of cognitive bias are potentially dangerous, and when or how cognitive biases distort decision making and become triggers of serious errors, violations, and crashes or disasters. Moreover, we need to identify what is common to the undesirable stream in which cognitive biases induce errors or violations of regulations or safety rules, and how this leads to unsafe behavior or serious crashes or disasters.

Figure 3. Necessity of effective countermeasures to cut the undesirable stream from cognitive biases, through human error and violation of rules and regulations, to unsafe behaviors and accidents. Further understanding of the mechanism of cognitive biases is necessary.

Using two examples, it is demonstrated how the cognitive biases in Figure 2 are related to critical crashes. It must be noted that cultural factors are not involved in these two examples. Brafman and Brafman [16] pointed out that loss aversion strongly contributed to the KLM Flight 4805 crash. KLM Flight 4805, a Boeing 747, had left Amsterdam bound for Las Palmas Airport in the Canary Islands. Because a terrorist bomb had exploded in the flower shop at Las Palmas Airport, the flight was forced to divert to Tenerife airport.

In this crash, the potential losses facing the captain included: exceeding the mandated duty-time limit because of the delay, the cost of accommodating the passengers in a hotel until the situation improved and the flight became possible, the knock-on effects of a delayed flight such as the time pressure imposed on the captain, and a blot on the captain's reputation for punctuality. The more meaningful a potential loss becomes, the more loss averse we tend to be. The captain must therefore have been preoccupied with the urge to get back as early as possible, lost his sense of flight safety, and initiated takeoff without takeoff clearance from air traffic control. For no apparently logical reason, we tend to fall into the trap of such a cognitive bias. Loss aversion evidently and unexpectedly affected the decision making of the seasoned captain and induced a serious crash.

Gladwell [17] discussed the Challenger disaster from the viewpoint of groupthink. One of the major causes of the Challenger disaster is regarded to be groupthink, especially the illusion of unanimity. Groupthink stems from the confirmation heuristic. Although the manufacturer of the O-ring recognized the risk of O-ring malfunction at severely cold temperatures, it agreed to the launch of the Challenger space shuttle because of the illusion of unanimity [12] . This eventually led to the disaster. However, it should be noted that there were an enormous number of shuttle components that NASA deemed to be as risky as the O-ring. Referring to the concept of the normal accident proposed by Perrow [18] , Gladwell [17] concluded that such a disaster is unavoidable as long as we continue developing high-technology, large-scale, high-risk systems for the benefit of human beings. The only thing to be borne in mind is that we must continue to pursue safety.

3. Case Studies of Crashes and Disasters to Which Cultural Differences Contributed

In this section, the proposed cross-cultural model of safety culture is validated by showing that cultural difference can potentially lead to biased cognitive judgment and to a serious crash or disaster.

3.1. Korean Air Flight 801 Crash [19]

Cultural differences in pilots' behavior in the airplane cockpit are demonstrated through an analysis of the Korean Air crash at Guam International Airport (Flight 801, Boeing 747-3B5, August 1997). The root cause of this crash is speculated to be absolute obedience to those of higher rank and disregard for those of lower rank (a high power distance culture) peculiar to Korean culture. Unlike in western countries, such a culture is also observable in other Asian countries such as China and Japan.

The landing approach to Guam is usually not especially difficult. Guam airport has a glide slope, which emits a beam stretching up into the sky so that the pilot can land safely. Unfortunately, the glide slope was under repair. What was worse, the crew did not know that the glide slope was out of service. Moreover, because of the bad weather, the crew had to use a complicated VOR (VHF Omnidirectional Range)/DME (Distance Measuring Equipment) approach, which required the pilots to make many coordinated settings for landing. A VOR is like a beacon that sends a signal so that pilots can calculate the aircraft's altitude as they approach an airport, and pilots must keep working with this system until they land safely. The captain mistook a different VOR for the one installed at the airport. These two factors clearly contributed to the crash, but neither alone would have been sufficient to cause it. The airplane was, however, about to descend toward the wrong place.

Analysis of the flight recorder transcript revealed that a cultural factor in Korea [20] [21] [22] was a main contributor to the crash, namely absolute obedience to those of higher rank and disregard for those of lower rank. The captain was in charge of everything and could do whatever he wanted, while the other crew members sat by and could do nothing. In this culture, crew members other than the captain were not permitted to challenge what the captain did. Such a culture appears even in table manners at dinner: a low-ranking person must wait until a higher-ranking person sits down and starts eating. This is true even in Japan. In eastern Asian countries such as Korea and Japan, social behavior is generally conducted according to the order of seniority or organizational or social ranking (obedience to seniority or social ranking seems to be especially strong in Korea and Japan). Although this is not a bad manner in itself, its excess became the trigger of the crash.

Although the flight engineer noticed that the captain had mistaken the VOR for the one installed at Guam International Airport, he could not tell this to the captain directly. Analysis of the flight recorder transcript clarified that the flight engineer made only an indirect attempt to tell the captain that the VOR he had recognized was not the one at Guam International Airport. It may be difficult and strange for western people to believe that the flight engineer could not point out that the captain's decision was completely wrong. But we must bear in mind that even such a cultural difference can potentially and unconsciously become a root cause of a crucial crash. Although the author does not completely deny such an eastern culture, and thinks that it in a sense represents a distinctive property of Korean and Japanese people, such a culture should not rear its head in a way that compromises safety, especially in the practical field of safety management. It is important to do the right thing in the right place irrespective of seniority or ranking. A culture in which anyone may rap you on the knuckles when you make a mistake is essential.

3.2. Avianca Flight 052 Crash (Colombia) [19]

The crash of Avianca Flight 052 in January 1990 was also affected by cultural difference. The direct cause of the crash was soon identified as fuel exhaustion, but the true root cause was hidden in the culture peculiar to Colombia. The fog was so dense that the cockpit crew could not determine their location. The cockpit crew did not tell ATC (Air Traffic Control) that their airplane was running out of fuel and that they were in an extremely urgent and dangerous situation with a high risk of crashing. Without informing ATC of the emergency, the airplane actually ran out of fuel and crashed.

In this case, the cockpit crew should by all means have told ATC plainly that they did not have enough fuel to comply with ATC's instructions, and should have insisted on landing before the fuel ran out. A culture that tolerates ambiguity and blindly obeys authority in a high power distance relationship, such as that between cockpit crews and ATC, even during an emergency [20] [21] , is peculiar to Colombian people. Rationally, the cockpit crew should have clearly told ATC about their emergency and obtained permission to land by all means. However, the Colombian culture distorted such rational behavior. In this case as well, cultural difference intervened to suffocate rational behavior, which makes us recognize the importance of taking cultural differences into account and incorporating this factor into the concept of traditional safety culture.

3.3. Fukushima Daiichi Nuclear Power Plant Disaster [23] [24] [25]

First, the conceivable primary causes of the Fukushima Daiichi disaster at Plant 1 are listed below; how cultural difference penetrated this disaster is discussed afterwards.

3.3.1. Insufficient Design of Multiple Safety System

The design of the multiple-safety systems at Plant 1, such as the power supply, coolant system, and water supply system, did not satisfy the conditions of independence and robustness. The ordinary power supply, the spare power supply, and the emergency power supply were not independent of each other, since they were placed at the same site within the plant. They should have been placed at different sites so that the ordinary power supply could be replaced by the spare and emergency systems whenever it was damaged by some cause such as a tsunami. The spare and emergency supply systems must also be robust, so that they are on standby and operable whenever required.
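To make the independence condition concrete, the short sketch below (a hypothetical illustration added by the editor, not taken from the accident reports; all failure probabilities are assumed values chosen only for illustration) compares the chance that every backup power supply fails when the supplies are truly independent with the chance when a single site-wide event, such as a tsunami flooding a shared location, can disable them all at once.

# Illustrative sketch (not from the original paper): why co-located backups
# defeat redundancy. The probabilities below are hypothetical examples.

def p_all_fail_independent(p_each: float, n_backups: int) -> float:
    """All n independent supplies fail only if each one fails on its own."""
    return p_each ** n_backups

def p_all_fail_common_cause(p_each: float, n_backups: int,
                            p_common_event: float) -> float:
    """A shared hazard (e.g., a flood at one shared site) disables every
    supply at once, in addition to the independent failure mode."""
    p_independent = p_each ** n_backups
    return p_common_event + (1 - p_common_event) * p_independent

if __name__ == "__main__":
    p_each = 1e-3        # hypothetical per-supply failure probability
    p_flood = 1e-4       # hypothetical probability of a site-wide flood
    print(p_all_fail_independent(p_each, 3))             # about 1e-9
    print(p_all_fail_common_cause(p_each, 3, p_flood))   # about 1e-4

Under these assumed numbers, the shared hazard dominates the total risk by several orders of magnitude, which is the sense in which co-location at a single site violates the independence condition discussed above.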

Here, a variety of biases such as overconfidence, optimistic bias, normalcy bias, and confirmation bias must have been at work in the mindset of Tokyo Electric Power Company (TEPCO). Although it is evident that the conditions of independence and robustness must be guaranteed, TEPCO had not recognized the importance of satisfying these conditions and lacked this safety knowledge, owing to the optimistic bias and overconfidence that the plant would not be severely damaged by a tsunami.

The conditions of independence and robustness are satisfied under ordinary operating conditions. Therefore, owing to the normalcy, confirmation, and outcome biases, it became difficult for TEPCO to imagine that these conditions could readily be violated in an emergency such as an earthquake or a tsunami. Confirmation bias is induced by our tendency to filter out information that contradicts our existing views and not to imagine a variety of possible states (that is, not to consider options). When trapped by the outcome bias, we tend to evaluate decisions based on the outcome rather than on the process that produced it. Despite the insufficient design of the multiple-safety systems, TEPCO must have judged the systems to be sufficient based not on the process but on the outcome.

3.3.2. Mechanism of Emergency Diesel Generator

The emergency diesel generators are air-flow-driven (Plants 4, 5, and 6) or water-flow-driven (Plants 1, 2, and 3). The larger the plant number, the more recently the plant was constructed.

In contrast to Plants 1 - 3, Plants 5 and 6 did not lose their cooling function, because of the different mechanism of their emergency diesel generators. The air-flow-driven emergency diesel cooling systems in Plants 5 and 6 can be installed at any site where the effect of a tsunami is small, whereas the emergency diesel cooling systems in Plants 1 - 3 were water-flow-driven and had to be installed along the sea, partly under water. Therefore, these systems were readily damaged by the tsunami.

The difference between the air-flow-driven and water-flow-driven mechanisms has significant meaning for the safety of a plant in the event of a tsunami. However, the Japanese government and TEPCO have said nothing about this, even years after the meltdown at the Fukushima Daiichi nuclear power plant. It is postulated that they wanted to avoid further criticism and argument over the disaster, and feared further accusations of having neglected the safety countermeasure of replacing the water-flow-driven mechanism in Plants 1 - 3 with an air-flow-driven one. Although this fact has not been widely reported and the formal documentation of the accident never mentions this significant difference between the water-flow-driven and air-flow-driven mechanisms, the mechanism of the emergency diesel generator affected whether a plant damaged by the tsunami suffered a meltdown or averted one.

From the viewpoint of enhanced safety, it is desirable that the old water-flow-driven mechanism be replaced by an air-flow-driven one. It is speculated that TEPCO had technically recognized the importance of such a replacement. However, owing to optimistic bias and overconfidence, they must have judged that the plant would not be attacked by a tsunami (that the risk of a tsunami was extremely low) and did not think that the water-flow-driven mechanism needed to be replaced by an air-flow-driven one in preparation for a tsunami. The water-flow-driven mechanism works well enough under ordinary operating conditions. The outcome bias led TEPCO to judge that the status quo (the water-flow-driven mechanism) was sufficient to guarantee safety, without imagining what would happen to that mechanism if an enormous tsunami struck the seashore along the plant. Therefore, owing to the normalcy, outcome, and confirmation biases, it became difficult for TEPCO to introduce an up-to-date emergency diesel generator (the air-flow-driven type) in preparation for an abnormal situation such as an earthquake or tsunami. Such biases potentially induce an imbalance between safety and profit (efficiency) and unconsciously made TEPCO emphasize profit [26] .

3.3.3. Operation Skill of IC (Isolation Condenser)

Unfortunately, no personnel at Plant 1 had experience operating the IC (Isolation Condenser; more exactly, the Reactor Core Isolation Cooling Condenser). Unlike at US nuclear power plants, no pre-operational test of the IC had been carried out for years at any of Japan's 54 nuclear power plants. Therefore, it was impossible for the staff to know whether the IC at Plant 1 was operating or not; they did not even know that a loud roar is heard while the IC is operating. They must have fallen into the trap of the confirmation bias that a lack of experience in operating the IC would not be a barrier to operating the plant safely, because no disaster requiring the IC to prevent a station blackout had ever occurred in Japan. There may be a few cases in which lack of experience with a system does not cause a major problem, if the staff manage the situation by referring to the operation manual. It goes without saying, however, that they were too optimistic about the safe operation of the plant, and that overconfidence, optimistic bias, and normalcy bias lurked in their minds.

Not only TEPCO but also the other Japanese electric power companies were unwilling to acquire IC operation skills, owing to the optimistic and normalcy biases. The optimistic, normalcy, and outcome biases led them to judge that a tough situation such as a station blackout would never happen, even though they had no experience of a station blackout and no IC skills.

3.3.4. Old-Fashioned Nuclear Reactor

Japanese manufacturers do not develop reactors of their own and are completely dependent on US manufacturers. Owing to the optimistic bias stemming from this complete dependence, Japanese manufacturers did not work hard to learn and gather the variety of information needed to secure nuclear power plant safety. At the time of the outbreak of the disaster, the nuclear reactor at Plant 1 (a GE (General Electric) Mark I type) had been operating for nearly 40 years. At the start of nuclear power generation, according to an unspoken rather than formal rule, the Japanese government had tacitly determined that a nuclear reactor in use for more than 40 years should be decommissioned. Many nuclear engineering experts inside and outside Japan consistently warned about the poor earthquake resistance of the Mark I reactor and its small capacity. The reactor's small capacity was one of the reasons that the emission of radiation could not be kept to a minimum.

In spite of such warnings about the disadvantages of the Mark I reactor, Japanese electric power companies and the government have not actively acquired such knowledge, owing to the optimistic, outcome, and normalcy biases reflected in the belief that we have never encountered a crucial disaster and never will, because we use a reliable technology apparently different from the one used at Chernobyl. In fact, TEPCO must have understood that a Mark I reactor with its small capacity is hazardous with respect to preventing a reactor explosion and to the readiness of venting in an emergency, and must have thought it desirable to replace the Mark I reactor with one of larger capacity. The outcome bias further led TEPCO to judge that the Mark I reactor was sufficient to guarantee safety, without imagining emergency events. Therefore, owing to the normalcy, outcome, and confirmation biases, it became difficult for TEPCO to introduce an up-to-date nuclear reactor with larger capacity in preparation for an abnormal situation such as a station blackout or venting. Such biases potentially induce an imbalance between safety and profit and unconsciously made TEPCO pursue profit [26] .

3.3.5. Other Contributing Factor: Cultural Difference

The difference in nuclear power plant safety culture between Japan and the US is another contributing factor in the disaster. In addition to the variety of cognitive biases mentioned above, the following cultural difference-related biases are observable and contributed to the root cause of the Fukushima Daiichi nuclear disaster. The experience of the Three Mile Island nuclear power plant accident was not fully exploited in Japan because of the following cultural properties. Japanese people do not actively learn from the failures of others, although they do not criticize such failures openly and conspicuously. We do not seem to have mastered a safety culture of learning (drawing lessons) from the failures of others. We must learn to profit more from others' experience of failure, as pointed out by Dekker [27] and Syed [28] .

The most important thing here is not to criticize the improper handling of the disaster by TEPCO or the government. It is actually impossible for non-experts in nuclear or safety engineering, such as the top management of TEPCO or government executives including the prime minister, to work out how the situation could have been controlled more appropriately; these people are not experts on nuclear power plants. We should first consider who should appropriately be in charge of the disaster: the top management and high-ranking government executives must delegate full authority to the expert in charge of the accident and stand by to support the experts' activities unconditionally.

The IAEA (International Atomic Energy Agency) guideline states that full decision-making authority in managing such a disaster must be delegated to the emergency director (usually the director of the plant). Not the prime minister or the president of the company, but the engineer with the most detailed knowledge, skill, and information about the plant should be in charge. However, the IAEA recommendation was ignored because of a culture that places importance on ranking within the organization or the government. As already described for the Korean Air crash (Section 3.1), the opinion of higher-ranking personnel tends to be valued more in the decision-making process in eastern countries such as Japan and Korea than in western countries. In eastern countries, decisions cannot be made without hearing the opinion of high-ranking personnel, even when it is understood that they have no expertise. Such a cultural difference influenced the capacity, the time, and the speed with which the emergency was handled and managed. This process took time and made the emergency even more complicated in the Fukushima Daiichi disaster. It is possible that TEPCO would have had more time to deal with the emergency if the director of the nuclear power plant had been responsible for and in charge of all actions after the station blackout. The government should have deferred completely to the direction of the director of the Fukushima Daiichi nuclear power plant.

In summary, along with the cognitive biases described in Sections 3.3.1 - 3.3.4, we cannot help concluding that the cultural factor (blind obedience to organizational or social ranking) caused the emergency response to deviate from the IAEA recommendation and delayed the countermeasures needed to prevent meltdown.

3.4. Union Carbide’s Bhopal Chemical Plant Disaster [10]

The difference in safety culture between developing and advanced countries is further investigated through the Union Carbide Bhopal chemical plant disaster [10] . The plant released deadly methyl isocyanate (MIC) gas in December 1984; at least 4000 people died and over 20,000 were injured. At that time, Union Carbide's profit was declining, especially at this plant. The plant laid off key personnel who were familiar with the situation at the Bhopal plant, decreased the shift size from eleven to five, and reduced the maintenance crew by one half. It further cut maintenance costs, shut down the refrigeration unit to save money, and left the safety flares and washing towers unrepaired. Leaky valves, inaccurate instrumentation, poor training of workers, and inadequate safety devices were also pointed out at this plant. Such potential causes can be observed irrespective of cultural differences, and a similar incident was reported even at Union Carbide's US plant.

The most important point is that it is far less likely in western countries or Japan than in developing countries such as India for a chemical plant to be invited into and constructed in a residential area. In an emergency, the damage from an accident is obviously more serious in residential areas than elsewhere. The catastrophic damage of this disaster was mainly due to the location of the plant. If the plant had been located at a site distant from residential areas, it is postulated that the damage would not have been so extensive. The cultural factors that permit a manufacturer to construct a plant in a residential area in developing countries must be the most significant cause of the severity and huge damage of this disaster.

4. Safety Culture that Takes Cultural Differences into Account

Based on the case studies in Section 3, the significance of cultural difference, as well as cognitive biases, as a trigger of tragic crashes or disasters needs to be summarized in order to propose that cultural differences should be taken into account in the framework of safety culture and safety management.

First of all, the rationale for a safety culture that considers cultural differences can be described using the concept of the population stereotype, as follows. When a large proportion of a given population chooses a particular option, that choice is generally called a population stereotype. A population stereotype also corresponds to an expectation, interpretation, or manner of perceiving, thinking, or behaving that is prominent within the population, and it is affected by cultural difference. A large proportion of people in western countries expect a knob on an electrical appliance to increase the appliance's output when turned clockwise, while they expect a water or gas tap or faucet to decrease the flow when turned clockwise and increase it when turned counterclockwise. The operation of a water or gas tap is thus exactly opposite to that of a knob. If controls do not function in the expected ways, human errors or failures result. In the research fields of human factors and ergonomics, designing man-machine systems that take population stereotypes into account is an important topic.
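As a rough illustration of how such stereotypes could be checked in design practice, the sketch below (a minimal hypothetical example added by the editor; the stereotype table, population labels, and device names are assumptions for illustration, not data from the cited studies) flags a control whose clockwise direction of motion contradicts the expectation of the target population.

# Hypothetical illustration (not from the paper): checking a control design
# against assumed population-stereotype expectations.

# Expected effect of turning a control clockwise, per population and device.
STEREOTYPES = {
    ("western", "electrical_knob"): "increase",
    ("western", "water_tap"): "decrease",
}

def violates_stereotype(population: str, device: str, clockwise_effect: str) -> bool:
    """Return True if the designed clockwise effect contradicts the
    expectation of the target population (a potential source of error)."""
    expected = STEREOTYPES.get((population, device))
    return expected is not None and expected != clockwise_effect

# A tap that increases flow when turned clockwise conflicts with the
# (assumed) western stereotype and would be flagged in a design review.
print(violates_stereotype("western", "water_tap", "increase"))        # True
print(violates_stereotype("western", "electrical_knob", "increase"))  # False

The point of the sketch is only that the stereotype table itself is population-dependent: the same physical design can be compatible for one cultural group and error-inducing for another.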

The concept of the population stereotype was proposed by Loveless [29] ; stereotypes are specific to particular cultural groups because of experience with the display-control relations unique to their culture. In this way, the problem of usability in the framework of man-machine systems cannot be separated from cultural difference. Proctor and Vu [30] examined the universal and culture-specific aspects of display-control compatibility in order to apply the knowledge to user-friendly interface design. Moray [31] also emphasized the importance of considering cultural differences in population stereotypes for the compatible fitting of man and machine. On the basis of these considerations of cultural difference in the framework of man-machine systems, we judged it rational to introduce the concept of cultural difference into safety culture. Thus, it is reasonable to consider cultural difference as one of the contributing factors to safety enhancement and management.

A proposed concept of how cultural differences are related to distorted judgment and cognitive biases, and eventually lead to mistaken behavior, is depicted in Figure 4. The traditional concept of safety culture [32] [33] consists of four layers. The bottom layer corresponds to the valuation of safety, that is, how we find value in safety and evaluate it. The second layer is related to the organizational strategy for assuring safety; the organizational mission, leadership, strategies, norms, and history of safety management activities are the requisites of this layer. The third layer corresponds to the safety climate, where attitudes and opinions for enhancing safety are required. The top layer is the behavioral aspect, that is, the actual activities undertaken to enhance safety. It goes without saying that such cultures cannot be built in a day. It must also be noted that differences in culture among countries or among organizations contribute to the traditional safety culture in Figure 4.

Figure 4. Model of safety culture that takes cross-cultural differences into account.

Taking into account only the cognitive biases in Figure 2 that distort our decision making is not sufficient to stop the undesirable stream in which cognitive biases induce errors or violations of regulations or safety rules and, in the worst case, lead to critical crashes or disasters. Therefore, considering cultural difference (integrating it into the concept of safety culture) is essential for further strengthening preventive countermeasures against crucial crashes, collisions, or disasters.

5. Discussion

The importance of understanding decision making across different cultures has been demonstrated in the proposed model in Figure 4. It has also been discussed how cultural difference led to distorted decision making through four analyses of critical crashes and disasters: the Korean Air crash, the Avianca (Colombia) crash, the Fukushima Daiichi disaster, and Union Carbide's Bhopal chemical plant disaster.

As well as the variety of biases shown in Figure 2, cultural differences in thinking, behavior, or decision making are important for understanding the root causes of critical crashes or disasters. Therefore, a concept of how cultural differences distort judgment, induce cognitive biases, and eventually lead to mistaken behavior has been proposed. Several pieces of evidence have been presented on how these cultural differences lead to biased and irrational decision making and induce serious crashes or disasters. In other words, the potential risk of cultural difference leading to biased cognitive judgment and inducing a serious crash or disaster was successfully demonstrated through analyses of such events. In conclusion, we must recognize that the consideration of cultural difference, as well as cognitive biases, is important in the area of accident prevention and analysis.

It is predictable that cultural differences cause irrational decision making and behavior, and that this can eventually trigger, or raise the risk of, a critical crash or disaster. Therefore, it is desirable to incorporate this factor into the safety culture concept, as in Figure 1 and Figure 4. As demonstrated in the case studies of critical crashes and disasters, cultural difference can certainly be a cause of such events, as in Figure 1. Indeed, we would do well to classify the irrational behavior triggered by cultural difference as one of the cognitive biases depicted in Figure 2.

As mentioned above, it is reasonable to think that a cognitive bias (in this case, groupthink) became a trigger of the NASA Challenger disaster. Referring to the fact that there were an enormous number of shuttle components that NASA deemed to be as risky as the O-ring, and to the concept of the normal accident proposed by Perrow [18] , Gladwell [17] commented that a serious disaster like the Challenger disaster is unavoidable as long as we continue developing large-scale, high-risk systems for the benefit of human beings. The most important thing is that we must keep striving for safety, despite the fact that a complicated and highly risky system inevitably and unexpectedly induces a serious crash or disaster (called a normal accident by Perrow [18] ).

A culture that learns from failures or from the critiques of third parties is necessary for enhancing safety. As pointed out by Heath and Heath [34] , developing an organizational culture that listens to a variety of opinions helps restrain the ego that resists changing the status quo. In a situation where learning from failure is infeasible, as pointed out by Mullainathan and Shafir [35] , a scarcity mindset toward safety improvement and preparedness for low-probability events induces tunneling and a focus on the status quo, and it narrows the cognitive bandwidth available for taking proper countermeasures and enhancing safety (other options or possible solutions cannot be imagined). Eventually, such a situation gets even worse and makes it still more difficult to learn from failures. Such a cultural difference therefore hinders a proper mindset toward safety.

Without learning from failure, a similar failure is repeated endlessly. To learn from failure, an open culture is essential. As suggested in Section 3, a culture ruled by power distance (authority) hinders an appropriate response to an unexpected emergency. If a culture is open and honest about mistakes and errors, a man-machine system such as an airplane cockpit or the control room of a plant can actively learn from failure [28] . Only through such an approach can a safety management system progress. Learning from failure must be hardwired into the man-machine system. Social pressure and the inhibiting effect of authority destroy effective teamwork and communication among crews or members, which makes it even more difficult to build a culture that learns from failure. This implies that taking cultural difference into account, restraining the inhibiting effect of authority represented by power distance, and creating an open culture are effective for enhancing safety and for smooth and proper communication between members within a system.

As seen above, serious crashes or disasters involve cognitive biases (including irrational behavior stemming from cultural difference) among their main causes. Correcting or modifying biases in decision making must therefore be one of the most promising measures for preventing serious crashes or disasters. When the designers, engineers, and managers of modern technologies such as transportation systems, nuclear power plants, and social infrastructure systems do not understand human fallibility (error-prone properties) and the cultural differences related to our irrational mind, we tend to design inappropriate systems that do not take our limitations (irrationality) into account, that is, man-machine incompatible systems; moreover, we cannot master the ability to operate and run a system safely. To realize a man-machine compatible system, an open culture without authority or power distance is one promising measure. Such an open culture will inhibit the scarcity mindset toward safety improvement that induces tunneling, focuses attention on the status quo, and narrows the cognitive bandwidth for taking proper countermeasures, and it will make possible an open mindset toward a variety of options for securing safety.

Future research should build a new theory in the domain of safety that accounts for cultural differences as a form of cognitive bias or as a root cause leading to serious crashes or disasters.

6. Conclusions

The aim of this study was to demonstrate that considering cultural difference, as well as cognitive biases, is important to prevent irrational and biased decision making from occurring in safety management.

Without such consideration, we inevitably distort our decisions and make our errors or mistakes more serious, and these distortions and errors lead to critical crashes or disasters like those analyzed in Section 3. Without a proper understanding of our irrationality together with cultural differences, we will unwillingly repeat crashes or disasters and cannot break out of this vicious circle. Understanding how cognitive biases (including ignorance of cultural difference) distort decision making and lead to critical crashes or disasters is essential in order to avoid such vicious circles, as pointed out by Dekker [27] .

Conflicts of Interest

The author declares no conflicts of interest.

References

[1] Altman, M. (2012) Behavioral Economics for Dummies. John Wiley & Sons Canada, Ltd., Toronto.
[2] Angner, E. (2012) A Course in Behavioral Economics. Palgrave Macmillan, New York.
[3] Bazerman, M.H. and Moore, D.A. (2001) Judgment in Managerial Decision Making, Harvard University Press, Cambridge.
[4] Ariely, D. (2009) Predictably Irrational—The Hidden Forces that Shape Our Decisions. Harper, New York.
[5] Ariely, D. (2010) The Upside of irrationality—The Unexpected Benefits of Defying Logic at Work and at Home. Harper, New York.
[6] Ariely, D. (2012) The (Honest) Truth about Dishonesty—How We Lie to Everyone Especially Ourselves. Harper, New York.
[7] Kahneman, D. (2011) Thinking, Fast and Slow. Penguin Books, London.
[8] Tversky, A. and Kahneman, D. (1974) Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124-1131.
https://doi.org/10.1126/science.185.4157.1124
[9] Kahneman, D. and Tversky, A. (1984) Choices, Values, and Frames. American Psychologist, 39, 341-350.
https://doi.org/10.1037/0003-066X.39.4.341
[10] Reason, J. (1990) Human Error. Cambridge University Press, Cambridge.
https://doi.org/10.1017/CBO9781139062367
[11] Vaughan, D. (1997) The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press, Chicago.
https://doi.org/10.7208/chicago/9780226346960.001.0001
[12] Janis, I.L. (1982) Groupthink: Psychological Studies of Policy Decisions and Fiascoes. Cengage Learning, Stamford.
[13] Murata, A. and Nakamura, T. (2014) Basic Study on Prevention of Human Error How Cognitive Biases Distort Decision Making and Lead to Crucial Accidents. Proceedings of AHFE 2014, Krakow, 21-23 July 2014, 136-141.
[14] Murata, A., Nakamura, T. and Karwowski, W. (2015) Influence of Cognitive Biases in Distorting Decision Making and Leading to Critical Unfavorable Incidents. Safety, 1, 44-58.
[15] Murata, A. (2017) Cultural Influences on Cognitive Biases in Judgment and Decision Making: On the Need for New Theory and Models for Accidents and Safety. In: Cohn, J.V., Schatz, S., Freeman, H. and Combs, D.J.Y., Eds., Modeling Sociocultural Influences on Decision Making-Understanding Conflict, Enabling Stability, CRC Press, Boca Raton, 103-109.
[16] Brafman, O. and Brafman, R. (2008) Anatomy of an Accident. In: Brafman, O. and Brafman, R., Eds., Sway, Crown Business, New York, 9-24.
[17] Gladwell, M. (2009) Blowup. In: Gladwell, M., Ed., What the Dog Saw, Little, Brown and Company, New York, 345-358.
[18] Perrow, C. (1999) Normal Accidents-Living with High-Risk Technologies. Princeton University Press, Princeton.
[19] Gladwell, M. (2008) The Ethnic Theory of Plane Crashes. In: Gladwell, M., Ed., Outliers, Back Bay Books, New York, 206-261.
[20] Helmreich, R.L. and Merritt, A. (2000) Culture in the Cockpit: Do Hofstede’s Dimensions Replicate? Journal of Cross-Cultural Psychology, 31, 283-301.
https://doi.org/10.1177/0022022100031003001
[21] Helmreich, R.L. (1994) Anatomy of a System Accident: The Crash of Avianca Flight 052. International Journal of Aviation Psychology, 4, 265-284.
https://doi.org/10.1207/s15327108ijap0403_4
[22] Sohn, H. (1993) Intercultural Communication in Cognitive Values: Americans and Koreans. Language and Linguistics, 9, 93-136.
[23] Amano, Y. (2015) The Fukushima Daiichi Accident: Report by the Director General. International Atomic Energy Agency (IAEA) Report.
http://www-pub.iaea.org/MTCD/Publications/PDF/Pub1710-ReportByTheDG-Web.pdf
[24] Nakamura, A. and Kikuchi, M. (2011) What We Know, and What We Have Not Learned: Triple Disasters and the Fukushima Nuclear Fiasco in Japan. Public Administration Review, 71, 893-899.
https://doi.org/10.1111/j.1540-6210.2011.02437.x
[25] Winfield, D.J. (2015) Black Swan Accidents Predicting and Preventing the Unpredictable. In: Kumar, V., Ed., Safety and Reliability: Methodology and Applications, Taylor & Francis, London, 127-134.
[26] Murata, A. and Moriwaka, M. (2017) Anomaly in Safety Management: Is It Constantly Possible to Make Safety Compatible with Economy? In: Arezes, P., Ed., Advances in Safety Management and Human Factors (Advances in Intelligent Systems and Computing 604), Springer, London, 45-54.
[27] Dekker, S. (2006) The Field Guide to Understanding Human Error. Ashgate Publishing, Farnham.
[28] Syed, M. (2016) Black Box Thinking: Marginal Gains and the Secrets of High Performance. John Murray Publishers Ltd., London.
[29] Loveless, N.E. (1963) Direction-of-Motion Stereotypes: A Review. Ergonomics, 5, 357-383.
https://doi.org/10.1080/00140136208930601
[30] Proctor, R.W. and Vu, K.P. (2010) Universal and Culture-Specific Effects of Display-Control Compatibility. American Journal of Psychology, 123, 425-435.
https://doi.org/10.5406/amerjpsyc.123.4.0425
[31] Moray, N. (2004) Culture, Context, and Performance. In: Salas, E., Ed., Advances in Human Performance and Cognitive Engineering Research, Vol. 4, 31-59.
[32] Patankar, M.S., Brown, J.P., Sabin, E.J. and Peytom, T.G. (2012) Safety Culture-Building and Sustaining a Cultural Change in Aviation and Healthcare. Ashgate, Farnham.
[33] Antonsen, S. (2009) Safety Culture: Theory, Method and Improvement. Ashgate, Farnham.
[34] Heath, C. and Heath, D. (2016) Decisive—How to Make Better Choices in Life and Work. Crown Business, New York.
[35] Mullainathan, S. and Shafir, E. (2014) Scarcity: The New Science of Having Less and How It Defines Our Lives. Picador, New York.
