Yes, we freely acknowledge that we’re hopelessly fallible, illogical beings. Just ask the legendary Starfleet first officer, Mr. Spock. Of course, that’s not what we see in the mirror. Study after study confirms our universal cognitive dysfunctions. For convenience, vanity, emotion, and an endless array of other reasons, we’re so artful in self-deception that our biases have been categorized and subcategorized. A simple Wikipedia search yields over 100 types of human biases (or fallacies). Enjoy a seven-course feast of the tastiest of our collective, delicious self-delusions:
Future projection – Sure, we understand attitudes change over time, just not our own. A common human bias is assuming that one’s current thinking will remain constant well into the future. It’s why we should never grocery shop when hungry, sign up for a gym membership to fulfill a New Year’s resolution, or base any major decision on a momentary emotion. Multiple studies reveal that our attitudes and commitments are far more malleable than we realize. Find more info on future projection here.
Social projection – To us, the right way to view an event or issue may seem obvious. However, it’s likely less obvious to others. Social projection is our tendency to overestimate how widely our personal beliefs are shared by others, both broadly and within our social circles. The family-dinner and water-cooler norms of avoiding religion and politics tend to perpetuate this bias. Though, perhaps a few social media posts from friends and associates have come at you out of left field. Explore social projection further here.
Anchoring – Closely related to “framing,” this bias refers to our tendency to rely too heavily on a single piece of information or source, usually the first information acquired on a subject. In our age of info overload, anchoring can serve expedience, but it can also skew judgments by narrowing our perspective. Perhaps the most common example is shopping: shoe prices in store windows set our reference points for the prices inside. A pair of black pumps on display priced at $1,000 will make any similar pumps priced at $500 seem like a bargain. And darn if we don’t eat it up with a spoon. Learn more about anchoring here.
Clustering illusion – Popularly called the “gambler’s fallacy,” clustering is the most familiar form of apophenia: the tendency to think that past occurrences affect future odds. The classic example is a coin toss. The odds of heads or tails on each toss are the same regardless of the results of all prior tosses. Casinos are well aware of this bias, enticing more action at roulette tables with electronic boards displaying a running history of previous spins. Closely related is the “hot-hand fallacy,” the notion that someone on a winning run at a game of chance, like shooting craps, is more likely to keep beating the house. It’s another bias that does not displease casino owners. Find out more about clustering here.
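The coin-toss claim is easy to check for yourself. Here is a minimal simulation sketch (the streak length of five and the sample size are arbitrary choices): it tosses a fair coin a million times and measures how often heads follows a run of five straight heads. If past tosses mattered, that rate would drift away from 50%.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate a million fair coin tosses (True = heads).
tosses = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect every toss that immediately follows five heads in a row.
after_streak = []
streak = 0
for t in tosses:
    if streak >= 5:          # the previous five tosses were all heads
        after_streak.append(t)
    streak = streak + 1 if t else 0

rate = sum(after_streak) / len(after_streak)
print(f"P(heads after five heads in a row) ≈ {rate:.3f}")  # ≈ 0.5
```

Despite the impressive-looking streaks, the conditional rate lands right around one half, which is exactly what independence predicts.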
Availability heuristic – Another bias of expedience is favoring the most readily available information. The availability heuristic takes several identifiable forms; the most common is “recency bias,” the tendency to weight the most recent data most heavily. Another prime example is “anthropomorphism,” the tendency to interpret the behavior of animals, machines, and inanimate objects in terms of human emotions and motives. Certainly, basing decisions on the information that’s easiest to summon can yield desired outcomes. Adding wider perspective will only improve those odds. Read on about the availability heuristic here.
First-instinct fallacy – Perhaps every test-prep course ever given teaches never to change a multiple-choice answer unless you’re absolutely sure you got it wrong. However, research covered in Psychology Today shows the opposite to be true. Marilyn vos Savant, once billed as the world’s smartest person, famously defended the counterintuitive answer to the Monty Hall problem, in which the “Let’s Make a Deal” host offers three doors, one hiding a car and two hiding goats. You choose door 1. Before the reveal, he opens door 3, revealing a goat. Then he asks if you want to switch doors. To the chagrin of mathematicians across the globe and incredulous readers of her Parade column, repeated trial simulations proved her assertion: switching to door 2 wins the car two-thirds of the time, versus one-third when sticking with door 1. Read a NY Times article on the heated affair here.
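Those trial simulations are simple enough to reproduce at home. The sketch below (trial count and door indexing are arbitrary choices) plays 100,000 rounds each way, with the contestant always starting on door 1, and compares the win rates for staying versus switching.

```python
import random

def monty_hall_trial(switch):
    """Play one round; return True if the contestant wins the car."""
    car = random.randrange(3)   # car hidden behind a random door
    choice = 0                  # contestant always picks door 1 (index 0)
    # Host opens a door that is neither the contestant's pick nor the car.
    opened = next(d for d in range(3) if d != choice and d != car)
    if switch:
        # Switch to the one remaining unopened door.
        choice = next(d for d in range(3) if d != choice and d != opened)
    return choice == car

trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials))
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials))
print(f"Stay:   {stay_wins / trials:.3f}")    # ≈ 0.333
print(f"Switch: {switch_wins / trials:.3f}")  # ≈ 0.667
```

The intuition behind the result: switching wins exactly when your first pick was a goat, which happens two times out of three.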
Hindsight delusion – Our “knew-it-all-along” syndrome takes a couple of forms. First identified through research in the 1970s, this bias refers to the tendency to overestimate, after the fact, one’s ability to have predicted an event’s outcome. An initial study compared participants’ before-the-fact predictions of the likely outcomes of President Nixon’s trips to China and the USSR with their later recollections of those predictions, which diverged greatly. An associated form of hindsight delusion is conferring significance on an otherwise innocuous or unrelated occurrence prior to a momentous event, such as a motorist who narrowly avoids a bridge collapse crediting their good fortune to having worn a lucky green shirt. Investigate hindsight delusion further here.
So yes, our thinking is skewed in ways we don’t realize or acknowledge. Our cognitive biases can even creep into decision logic and machine learning models. InRule delivers no-code intelligence automation solutions with embedded bias detection. Non-technical users can access the historical records of machine learning models to trace bias back to its source. Find out about bias detection and the full suite of Intelligence Automation tools firsthand. Visit here to request a 30-day trial.