DEFINITION: "Heuristics" are practical rules that people use to come to conclusions and make decisions (Tversky and Kahneman, 1974).
For example, consider the following description of a person: "My third cousin Steve is very shy and withdrawn. He is invariably helpful, but has little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail” (Tversky and Kahneman, 1974).
What do you think Steve most likely does for work? A) Farmer B) Salesperson C) Librarian D) Airline Pilot E) Physician
Based on the description, most people would conclude that Steve is most likely to be a librarian.
However, there are over 100 times more salespeople in the United States than librarians (Bureau of Labor Statistics, 2018)! Clearly, Steve is much more likely to be a salesperson than a librarian. By judging Steve's profession from his resemblance to our stereotypes of different professions, we reached a conclusion that was most likely false. Our decision reflected a cognitive bias rather than a reasonable estimate of the probabilities.
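The base-rate argument above can be sketched numerically. The stereotype-fit probabilities below are hypothetical (chosen only for illustration; they are not from the cited paper), but the 100:1 base-rate ratio comes from the text:

```python
# Illustrative base-rate sketch; the "fit" probabilities are hypothetical.
# Even if the description fits a librarian far better, the base rate dominates.

base_rate_librarian = 1         # relative number of librarians
base_rate_salesperson = 100     # ~100x more salespeople (BLS, 2018)

fit_librarian = 0.90            # assumed P(description | librarian)
fit_salesperson = 0.05          # assumed P(description | salesperson)

# Unnormalized posteriors (Bayes' rule, ignoring the shared denominator)
score_librarian = base_rate_librarian * fit_librarian        # 0.9
score_salesperson = base_rate_salesperson * fit_salesperson  # 5.0

p_salesperson = score_salesperson / (score_librarian + score_salesperson)
print(round(p_salesperson, 2))  # -> 0.85: Steve is still probably a salesperson
```

Even granting that the description fits librarians eighteen times better, the sheer number of salespeople makes "salesperson" the better bet.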
Researchers have identified many cognitive biases that affect decision-making. Three important biases are "Representativeness," "Availability," and "Anchoring" (Tversky and Kahneman, 1974).
1) REPRESENTATIVENESS: Basing decisions on the resemblance of things to categories or stereotypes.
People's evaluation of the likely career of "My third cousin Steve" above is an example of the Representativeness Heuristic. People systematically allow stereotypes to bias their decisions.
The Representativeness Heuristic can also affect academic performance. Representativeness is one of the heuristics that students use to answer exam questions (Maeyer and Talanquer, 2010). Students often report that they select answers because they "seem right" instead of citing specific reasons why the answer is correct. Students who rely on heuristics rather than reasoned evaluation are unlikely to perform well on exams that demand such evaluation.
Representativeness can also affect other aspects of academics such as Student Evaluations of Teaching (SETs). SETs may better reflect how representative professors seem of experienced teachers than course effectiveness itself (Langbein, 1994).
2) AVAILABILITY: Assessing probability based on ease of recall.
For example, people often assess risk based on easily remembered events rather than on the actual probabilities involved (Tversky and Kahneman, 1974). Consider travel: many people who routinely drive cars take out extra insurance when they fly. Airplanes do occasionally crash, and airplane crashes receive wide news coverage. However, driving in cars (in the U.S.) is far more dangerous than flying in commercial aircraft. In 2015, for example, car travel averaged 1.13 fatalities per 100 million miles driven, whereas U.S. commercial airlines had zero fatalities (NHTSA, 2015).
Nonetheless, many people use the availability heuristic to assign risk to events based on memorable events, instead of taking known probabilities into consideration.
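The per-mile figures above can be turned into a rough expected-risk calculation. The trip length below is a hypothetical example, not from the source:

```python
# Rough per-mile risk sketch using the 2015 figures cited above.
car_fatalities_per_100m_miles = 1.13   # NHTSA, 2015
trip_miles = 2_500                     # hypothetical cross-country road trip

# Expected fatalities for the car trip (a tiny but nonzero number)
expected_car_fatalities = car_fatalities_per_100m_miles * trip_miles / 100_000_000

air_fatalities_2015 = 0  # zero U.S. commercial-aviation fatalities that year

print(f"{expected_car_fatalities:.2e}")  # on the order of 1e-5
```

The point is not the exact number but the comparison: the driving risk is small yet strictly positive, while the cited 2015 aviation figure is zero, the opposite of what vivid crash coverage suggests.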
3) ANCHORING: Estimates being unduly influenced by an initial value or by how information is first presented.
For example, in less than 5 seconds, please give an estimate of each of the following products:
(A) 12 × 11 × 10 × 9 × 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1
(B) 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8 × 9 × 10 × 11 × 12
Many would guess that (A) is larger than (B). However, on closer inspection we notice that both (A) and (B) have the same product: 479,001,600. Because of "anchoring," many people are influenced by the fact that the first numbers of (A) are larger than the first numbers of (B), and therefore estimate that the product (A) is larger.
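The equality of the two products is easy to verify directly; this short check just confirms the arithmetic:

```python
import math

# Both orderings multiply the same twelve integers, so the products must be equal.
a = math.prod(range(12, 0, -1))  # (A) 12 x 11 x ... x 1
b = math.prod(range(1, 13))      # (B) 1 x 2 x ... x 12
print(a, b)  # -> 479001600 479001600
```

Multiplication is commutative, so only the order of presentation differs, yet the larger leading numbers in (A) anchor people toward a larger estimate.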
"Anchoring" also causes people to systematically overestimate the probability of conjunctive events (events that must all happen together or in succession) and systematically underestimate the probability of disjunctive events (events where at least one of several independent possibilities happens; Tversky and Kahneman, 1974).
For example, consider a project that involves a number of steps, each with a certain probability of success. Because of "anchoring" on the success of early steps, people are inclined to think that subsequent steps will also succeed, and they systematically overestimate the overall probability that the project succeeds. In fact, the project succeeds only if every step does, so the individual success probabilities multiply and the overall probability can be surprisingly low. Conversely, consider a complex system such as a nuclear reactor with many parts, where failure of any single part can cause the entire reactor to fail. Because each part has a low probability of failure, people tend to underestimate the probability that the whole system fails. In reality, the more parts a system has, the higher the probability that at least one part fails and brings the system down.
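A minimal sketch of the conjunctive/disjunctive asymmetry, using hypothetical step counts and probabilities (not from the source), assuming independence:

```python
# Conjunctive: the project succeeds only if EVERY step succeeds.
p_step_success = 0.95   # hypothetical: each of 10 steps succeeds 95% of the time
n_steps = 10
p_project_success = p_step_success ** n_steps  # ~0.60, lower than intuition suggests

# Disjunctive: the reactor fails if ANY one of many parts fails.
p_part_failure = 0.001  # hypothetical: each part fails 0.1% of the time
n_parts = 500
p_reactor_failure = 1 - (1 - p_part_failure) ** n_parts  # ~0.39, higher than intuition suggests

print(round(p_project_success, 2), round(p_reactor_failure, 2))  # -> 0.6 0.39
```

A 95%-per-step project feels like a sure thing but succeeds only about 60% of the time, while a reactor of 500 highly reliable parts fails with a probability near 40%: exactly the two directions of misjudgment described above.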
Heuristics and other cognitive biases can influence decision-making and lead to unreasonable conclusions. Deliberate, explicit reasoning (for example, checking base rates and computing probabilities rather than relying on first impressions) can help prevent cognitive biases from affecting our judgments.