The Influence of Cognitive Heuristics and Associated Bias on Rational Decision Making

The following essay analyzes the influence of heuristics, the mental strategies or shortcuts that use readily available information and perceptions to speed decision making, and the influence of the biases that arise from these heuristic methods, biases that move us away from accurate, rational decision making toward non-optimal decisions.
The concept of heuristics was introduced by Simon (1957) in his discussion of "bounded rationality," in which he argued that because of cognitive limitations, humans have little option but to construct simplified models of the world. Simon saw heuristics as adaptive strategies used by humans to cope with their limited information-processing capacity (Shanteau). I will attempt to evaluate specific instances of bias that result from heuristics, describe their effect on the decision-making process, and make recommendations for avoiding such bias in cognitive decision making.
One must understand the process of how decisions or judgments are made, the influence of available time and information, and the relative importance of the decision within a particular situation. Cognitive activity tends to process the available information within a time frame and then attempts to use reason to form an understanding or judgment of a situation or problem. When time and information are limited, or a decision is considered to carry minimal risk, the use of heuristics helps us arrive at quick and typically reasonable decisions, keeping us from getting mired in these frequent day-to-day events.
Unfortunately, the human mind tends to rely on these heuristic methods even when they lend themselves to bias, which in turn negatively influences important decision making and can lead to faulty or non-optimal judgments. Research has identified and defined twelve biases linked to certain cognitive heuristics and explains the basis for them so that they are recognizable to us. Learning to recognize and describe the various biases, as well as our personal susceptibility to them, can help us apply a prescriptive model for decision making that avoids bias, and to evaluate our decision-making processes with diligence.
The key to improved judgment lies in learning to distinguish between appropriate and inappropriate uses of heuristics, to recognize when your judgment is likely to rely on them, and to know how to avoid the resulting biases (Bazerman & Moore). The twelve common heuristic biases alluded to above (see Table 1) are those we frequently over-rely on in our day-to-day decision making. In addition, it is possible, and indeed probable, that more than one of these biases is at play in any given decision-making process at the same time.
The current financial crisis in Europe has its roots in decision making associated with the purchase of mortgage-backed securities and credit default swaps. At the height of this activity, brokerage houses stateside were still purchasing these investment vehicles even as warning signs began to indicate that many banks were under-capitalized and that a slowing economy would put a number of them at risk.
In an effort to keep profits flowing, securities traders targeted foreign nations and their city governments to market these securities as "guaranteed" (triple-A rated) investments, the argument being that the United States real-estate market and the lending institutions that financed mortgages had historically been a very safe, low-risk environment for stable returns. A good number of these countries and their local city governments bought "bundled mortgage securities" even as other investors were turning them down because of the risk.
One can assume that a number of heuristics played a role in the purchase of these investments, and that "regression to the mean," "the confirmation trap," and "overconfidence" likely influenced those responsible for making these investment decisions. An analysis of building rates and new mortgages should have revealed an over-heated building boom that would eventually have to correct back toward the long-run mean, and that in doing so, some percentage of the mortgages within these bundled securities would fall into default, affecting future returns.
As many cities were looking for low-risk investments to grow their retirement accounts and fund future obligations, a cursory glance at the overall history of the American housing market suggested stability and little risk. Had these governments and city managers done due diligence and sought non-confirmatory evidence, they would have seen the "housing bubbles" that formed during low mortgage-interest-rate eras and dissolved when rates went back up or the economy slowed (the short sketch below illustrates the regression effect they overlooked). "Overconfidence" likely played a role as well: most of these officials were elected to their positions, and the power and prestige of office allowed them to feel infallible in their judgments when pressed about their decisions to purchase these new investment securities, which had little or no track record. The resulting failure of these investments, and the ensuing fiscal crisis for these governments and their cities, raises the question of how they arrived here and why their decision making failed so dismally.
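To make the regression-to-the-mean point concrete, the minimal Python sketch below simulates a stable long-run level of housing starts with random year-to-year noise. All of the figures (a 1,000,000-start mean, 150,000-start swings) are hypothetical and chosen only for illustration, not drawn from the market data discussed above; in this simplified model each year is independent, so the regression is complete, whereas real markets regress only partially.

```python
# Hypothetical illustration of regression to the mean: an "over-heated"
# boom year tends to be followed by a year back near the long-run average.
import random

random.seed(1)

LONG_RUN_MEAN = 1_000_000   # hypothetical average annual housing starts
NOISE_SD = 150_000          # hypothetical year-to-year variation

def simulate_year():
    """One year of housing starts: a stable mean plus transient noise."""
    return random.gauss(LONG_RUN_MEAN, NOISE_SD)

# Collect years that looked like a boom (well above the mean) and record
# what the following year looked like.
boom_years, following_years = [], []
for _ in range(100_000):
    this_year, next_year = simulate_year(), simulate_year()
    if this_year > LONG_RUN_MEAN + 2 * NOISE_SD:   # an "over-heated" year
        boom_years.append(this_year)
        following_years.append(next_year)

print(f"average boom year:      {sum(boom_years) / len(boom_years):,.0f}")
print(f"average following year: {sum(following_years) / len(following_years):,.0f}")
# The boom years average well above the long-run mean, while the years
# that follow them average close to the 1,000,000 baseline.
```

A buyer projecting future returns from the boom year alone, rather than from the long-run average, is making exactly the error the bias describes.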
In this situation, these governments first needed to recognize the importance of the decision in front of them and that a rational decision-making process was in order to produce an optimal result. The first step would be to define the problem: funding growing future pension obligations with investments outside their traditional low-return options. The second step would be to identify the criteria associated with the decision: what type of return would be reasonable, at what risk, and what other options existed to fund the obligations.
The third step would be to weigh the criteria: which investments would optimally serve the purpose set forth. The fourth step would be to generate or search for alternatives: was there a way to sell or privatize national assets for funding, or to reduce the obligation by reducing benefits, increasing the retirement age, and so on, with this search continuing until its cost outweighed the value of the additional information. The fifth step would be to rate each alternative on each criterion, so that the resulting solutions could later be evaluated.
Finally, the sixth step would be to compute the optimal decision based on the ratings from the prior step.

In the course of working as a production process and project engineer, I find that my own decision making or judgment has at times been flawed by "ease of recall" when I make a judgment about a machine's projected downtime and its impact on operations "off the top of my head." I tend to recall events and recollections that come to mind most easily, based on their vividness or how recently they occurred.
When estimating repair or project costs, I find myself "anchored" to the original cost of a machine and typically do not adjust that judgment until realizing, after the fact, that the costs will be much higher. In both cases, poor decisions can result from these biases: if I misidentify the actual problem due to "ease of recall," the maintenance or production resources dedicated to restoring the equipment can end up working on the wrong thing, and additional time and production are lost.
Should I set an expectation of cost based on an "anchoring" bias, the decision to move forward on a project could lead to a loss of upper management's confidence if the project's actual cost turns out to be significantly more than the original estimate. In the first case, having identified the "ease of recall" bias, conferring with the maintenance or production supervisor to corroborate my recollection or provide additional information is a simple way to avoid it.
In the case of "anchoring," simply qualifying a response in the moment as "my best guess," or asking for time to research, is much more effective than setting unachievable expectations that lead to a loss of confidence later. At the heart of both of these uses of cognitive heuristics is whether or not they are appropriate for the situation; if they are not, the answer is to establish a plan using the six steps for achieving optimal decision making, as in the sketch that follows.
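As one illustration of how the six steps can be made mechanical, the sketch below scores a handful of alternatives against weighted criteria. The criteria names, weights, alternatives, and 1-10 ratings are all invented placeholders, not figures from the pension case or from my own projects; a real analysis would substitute its own.

```python
# A minimal sketch of steps two through six of the rational model
# described above, using purely hypothetical criteria and ratings.

# Step 2: identify the criteria; Step 3: weigh them (weights sum to 1).
criteria_weights = {
    "expected_return": 0.40,
    "risk": 0.35,            # rated so that a higher score means lower risk
    "liquidity": 0.25,
}

# Step 4: generate alternatives. Step 5: rate each alternative on each
# criterion (here on a simple 1-10 scale).
alternatives = {
    "government_bonds":       {"expected_return": 4, "risk": 9, "liquidity": 8},
    "bundled_mortgage_bonds": {"expected_return": 7, "risk": 3, "liquidity": 4},
    "benefit_reform":         {"expected_return": 6, "risk": 7, "liquidity": 5},
}

def weighted_score(ratings: dict) -> float:
    """Step 6: compute the weighted score for one alternative."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

scores = {name: weighted_score(r) for name, r in alternatives.items()}
best = max(scores, key=scores.get)

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:25s} {score:.2f}")
print(f"optimal choice under these assumptions: {best}")
```

Writing the weights and ratings down explicitly is itself a guard against bias: an anchored or easily recalled figure has to survive comparison against every other alternative before it becomes the decision.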
Table 1. Common biases linked to cognitive heuristics (bias type and description).

Ease of recall: Individuals judge events that are more easily recalled from memory, based on vividness or recency, to be more numerous than events of equal frequency whose instances are less easily recalled.

Retrievability: Individuals are biased in their assessments of the frequency of events based on how their memory structures affect the search process.

Insensitivity to base rates: When assessing the likelihood of events, individuals tend to ignore base rates if any other descriptive information is provided, even if it is irrelevant (a worked example follows this table).

Insensitivity to sample size: When assessing the reliability of sample information, individuals frequently fail to appreciate the role of sample size.

Misconceptions of chance: Individuals expect that a sequence of data generated by a random process will "look" random, even when the sequence is too short for those expectations to be statistically valid.

Regression to the mean: Individuals tend to ignore the fact that extreme events tend to regress to the mean on subsequent trials.

The conjunction fallacy: Individuals falsely judge that conjunctions (two events occurring together) are more probable than a more global set of occurrences of which the conjunction is a subset.

The confirmation trap: Individuals tend to seek confirmatory information for what they believe to be true and neglect the search for disconfirmatory evidence.

Anchoring: Individuals make estimates for values based upon an initial value (derived from past events, random assignment, or whatever information is available) and typically make insufficient adjustments from that anchor when establishing a final value.

Conjunctive- and disjunctive-events bias: Individuals exhibit a bias toward overestimating the probability of conjunctive events and underestimating the probability of disjunctive events.

Overconfidence: Individuals tend to be overconfident of the infallibility of their judgments when answering moderately to extremely difficult questions.

Hindsight and the curse of knowledge: After finding out whether or not an event occurred, individuals tend to overestimate the degree to which they would have predicted the correct outcome.
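To show why ignoring base rates is costly, the following sketch works through Bayes' rule with purely hypothetical numbers for a security-rating screen; the 5% base rate, 90% hit rate, and 10% false-alarm rate are assumptions chosen only to illustrate the arithmetic, not properties of any real rating process.

```python
# Hypothetical base-rate illustration: even an accurate-seeming screen
# flags mostly safe securities when truly risky ones are rare.
base_rate_risky = 0.05      # P(risky): the base rate people tend to ignore
hit_rate = 0.90             # P(flagged | risky)
false_alarm_rate = 0.10     # P(flagged | not risky)

# Bayes' rule: P(risky | flagged)
p_flagged = hit_rate * base_rate_risky + false_alarm_rate * (1 - base_rate_risky)
p_risky_given_flag = hit_rate * base_rate_risky / p_flagged

print(f"P(flagged)         = {p_flagged:.3f}")
print(f"P(risky | flagged) = {p_risky_given_flag:.3f}")
# Fewer than a third of the flagged securities are actually risky,
# because the low 5% base rate dominates the calculation.
```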