24 Common Cognitive Biases: A Visual List of the Psychological Systems Errors That Keep Us From Thinking Rationally

There’s been a lot of talk about the Dunning-Kruger effect, the cognitive bias that makes people wildly overconfident, unable to know how ignorant they are because they don’t have the basic skills to grasp what competence means. Once popularized, the effect became weaponized. People made armchair diagnoses, gloated and pointed at the obliviously stupid. But if those finger-pointers could take the beam out of their own eye, they might see four fingers pointing back at them, or whatever folk wisdom to this effect you care to mash up.

What we now call cognitive biases have been known by many other names over the course of millennia. Perhaps never have the many varieties of self-deception been so specific. Wikipedia lists 185 cognitive biases, 185 different ways of being irrational and deluded. Surely, it’s possible that every single time we—maybe accurately—point out someone else’s delusions, we’re hoarding a collection of our own. According to much of the research by psychologists and behavioral economists, this may be inevitable and almost impossible to remedy.

Want to better understand your own cognitive biases and maybe try to move beyond them if you can? See a list of 24 common cognitive biases in an infographic poster at yourbias.is, the site of the nonprofit School of Thought. (The two gentlemen popping up behind brainy Jehovah in the poster, notes Visual Capitalist, "happen to represent Daniel Kahneman and Amos Tversky, two of the leading social scientists known for their contributions to this field. Not only did they pioneer work around cognitive biases starting in the late 1960s, but their partnership also resulted in a Nobel Prize in Economics in 2002.")

Granted, a Wikipedia list is a crowd-sourced creation with lots of redundancy and quite a few “dubious or trivial” entries, writes Ben Yagoda at The Atlantic. “The IKEA effect, for instance, is defined as ‘the tendency for people to place a disproportionately high value on objects they partially assembled themselves.’” Much of the value I’ve personally placed on IKEA furniture has to do with never wanting to assemble IKEA furniture again. “But a solid group of 100 or so biases has been repeatedly shown to exist, and can make a hash of our lives.”

These are the tricks of the mind that keep gamblers gambling, even when they’re losing everything. They include not only the “gambler’s fallacy” but confirmation bias and the fallacy of sunk cost, the tendency to pursue a bad outcome because you’ve already made a significant investment and you don’t want it to have been for nothing. It may seem ironic that the study of cognitive biases developed primarily in the field of economics, the only social science, perhaps, that still assumes humans are autonomous individuals who freely make rational choices.

But then, economists must constantly contend with the counter-evidence—rationality is not a thing most humans do well. (Evolutionarily speaking, this may have been no great disadvantage until we got our hands on weapons of mass destruction and the tools of climate collapse.) When we act rationally in some areas, we tend to fool ourselves in others. Is it possible to overcome bias? That depends on what we mean. Political and personal prejudices—against ethnicities, nationalities, genders, and sexualities—are usually buttressed by the systems errors known as cognitive biases, but they are not caused by them. They are learned ideas that can be unlearned.

What researchers and academics mean when they talk about bias does not relate to specific content of beliefs, but rather to the ways in which our minds warp logic to serve some psychological or emotional need or to help regulate and stabilize our perceptions in a manageable way. “Some of these biases are related to memory,” writes Kendra Cherry at Very Well Mind, others “might be related to problems with attention. Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them.”

We’re constantly missing what’s right in front of us, in other words, because we’re trying to pay attention to other people too. It’s exhausting, which might be why we need eight hours or so of sleep each night if we want our brains to function half decently. Go to yourbias.is for this list of 24 common cognitive biases, also available on a nifty poster you can order and hang on the wall. You'll also find there an illustrated collection of logical fallacies and a set of “critical thinking cards” featuring both kinds of reasoning errors. Once you've identified and defeated all your own cognitive biases—all 24, or 100, or 185 or so—then you'll be ready to set out and fix everyone else's.

via Visual Capitalist

Related Content:

Research Finds That Intellectual Humility Can Make Us Better Thinkers & People; Good Thing There’s a Free Course on Intellectual Humility

Why Incompetent People Think They’re Amazing: An Animated Lesson from David Dunning (of the Famous “Dunning-Kruger Effect”)

The Power of Empathy: A Quick Animated Lesson That Can Make You a Better Person

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Does the Rorschach Inkblot Test Work?: An Animated Primer

A frightening monster?

Two friendly bears?

Say what!?

As anybody with half a brain and the gift of sight knows, the black and red inkblot below resembles nothing so much as a pair of gnomes, gavotting so hard their knees bleed.

...or perhaps it’s open to interpretation.

Back in 2013, when Open Culture celebrated psychologist Hermann Rorschach’s birthday by posting the ten blots that form the basis of his famous personality test, readers reported seeing all sorts of things in Card 2:

A uterus

Lungs

Kissing puppies

A painted face

Little calves

Tinkerbell checking her butt out in the mirror

Two ouija board enthusiasts, summoning demons

Angels

And yes, high-fiving bears

As Rorschach biographer Damion Searls explains in an animated TED-Ed lesson on how the Rorschach test can help us understand the patterns of our perceptions, our answers depend on how we as individuals register and transform sensory input.

Rorschach chose the blots that garnered the most nuanced responses, and developed a classification system to help analyze the resulting data, but for much of the test’s history, this code was a closely guarded professional secret.

And when Rorschach died, a year after publishing the images, others began administering the test in service of their own speculative goals—anthropologists, potential employers, researchers trying to figure out what made Nazis tick, comedians…

The range of interpretative approaches earned the test a reputation as pseudo-science, but a 2013 review of Rorschach’s voluminous research went a long way toward restoring its credibility.

Whether or not you believe there’s something to it, it’s still fun to consider the things we bring to the table when examining these cards.

Do we see the image as fixed or something more akin to a freeze frame?

What part of the image do we focus on?

Our records show that Open Culture readers overwhelmingly focus on the hands, at least as far as Card 2 goes, which is to say the portion of the blot that appears to be high-fiving itself.

Never mind that the high five, as a gesture, is rumored to have come into existence sometime in the late 1970s. (Rorschach died in 1922.) That’s what the majority of Open Culture readers saw six years ago, though there was some variety of perception as to who was slapping that skin:

young elephants

despondent humans

monks

lawn gnomes

Disney dwarves

redheaded women in Japanese attire

chimpanzees with traffic cones on their heads

(In full disclosure, it's mostly bears.)

Maybe it's time for a do-over?

Readers, what do you see now?

Image 1: Bat, butterfly, moth

Image 2: Two humans

Image 3: Two humans

Image 4: Animal hide, skin, rug

Image 5: Bat, butterfly, moth

Image 6: Animal hide, skin, rug

Image 7: Human heads or faces

Image 8: Animal; not cat or dog

Image 9: Human

Image 10: Crab, lobster, spider

View Searls’ full TED-Ed lesson here.

Related Content:

Hermann Rorschach’s Original Rorschach Test: What Do You See? (1921)

The Psychological & Neurological Disorders Experienced by Characters in Alice in Wonderland: A Neuroscience Reading of Lewis Carroll’s Classic Tale

Introduction to Psychology: A Free Course from Yale University

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in New York City for the next installment of her book-based variety show, Necromancers of the Public Domain, this April. Follow her @AyunHalliday.

A Brief Animated Introduction to Noam Chomsky’s Linguistic Theory, Narrated by The X-Files‘ Gillian Anderson

How is it that children just entering toddlerhood pick up the structure of their respective languages with ease? They are not formally taught to use speech; they have limited cognitive abilities and a “poverty of stimulus,” given their highly circumscribed environments. And yet, they learn the function and order of subjects, verbs, and objects, and learn to recognize improper usage. Children might make routine mistakes, but they understand and can be understood from a very early age, and for the most part without very much difficulty. How?

These are the questions that confronted Noam Chomsky in the early years of his career in linguistics. His answers produced the theory of Universal Grammar in the 1960s, and for decades it has been the dominant theory in the field, initiating what is often referred to as the “Chomskyan Era,” a phrase the man himself dislikes but which nonetheless sums up the kinds of issues that have been at stake in linguistics for over fifty years.

Questions about language acquisition have always been the subject of intense philosophical speculation. They were folded into general theories of epistemology, like Plato’s theory of forms or John Locke’s so-called “blank slate” hypothesis. Variations on these positions surface in different forms throughout Western intellectual history. Descartes picks up Plato’s dualism, arguing that humans speak and animals don’t because of the existence of an immortal “rational soul.” Behaviorist B.F. Skinner suggests that operant conditioning writes language onto a totally impressionable mind. (“Give me a child,” said Skinner, “and I will shape him into anything.”)

Chomsky “gave a twist” to this age-old debate over the existence of innate ideas, as Gillian Anderson tells us in the animated video above from BBC 4’s History of Ideas series. Chomsky’s theory is biolinguistic: it situates language acquisition in the structures of the brain. Not being himself a neurobiologist, he talks of those theoretical structures, responsible for reproducing accurate syntax, as a metaphorical “language acquisition device” (LAD), a hardwired faculty that separates the human brain from that of a dog or cat.

Chomsky’s theory has little to do with the content of language, but rather with its structure, which he says is universally encoded in our neural architecture. Children, he writes, “develop language because they’re pre-programmed to do this.” Syntax is prior to and independent of specific meaning, a point he demonstrated with the poetic sentence “Colorless green ideas sleep furiously.” Every English speaker can recognize the sentence as grammatical, even very small children, though it refers to no real objects and would never occur in conversation.

Conversely, we recognize “Furiously sleep ideas green colorless” as ungrammatical, though it means no more nor less than the first sentence. Cross-linguistic variations in word order only underline his point, since in every case children quickly learn to use whichever version they’re presented with, at roughly the same developmental age and in the same way. The existence of a theoretical Language Acquisition Device solves the chicken-and-egg problem of how children, with no understanding of and only very limited exposure to language, can learn to speak just by listening.

Chomsky’s theory was revolutionary in large part because it was testable, and researchers at the professor’s longtime employer, MIT, recently published evidence of a “language universal” they discovered in a comparative study of 37 languages. It's compelling research that just might anticipate the discovery of a physical Language Acquisition Device, or its neurobiological equivalent, in every human brain.

Related Content:

The Ideas of Noam Chomsky: An Introduction to His Theories on Language & Knowledge (1977)

Noam Chomsky Defines What It Means to Be a Truly Educated Person

5 Animations Introduce the Media Theory of Noam Chomsky, Roland Barthes, Marshall McLuhan, Edward Said & Stuart Hall

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

An Interactive Map of the 2,000+ Sounds Humans Use to Communicate Without Words: Grunts, Sobs, Sighs, Laughs & More

When did language begin? The question is not an easy one to answer. There are no records of the event. “Languages don’t leave fossils,” notes the Linguistic Society of America, “and fossil skulls only tell us the overall shape and size of hominid brains, not what the brains could do.” The scant evidence from evolutionary biology does not tell us when early humans first began to use language, only that they could do so 100,000 or so years ago.

However, the question also depends on what we mean by language. Before the linguistic technologies of grammar and syntax, hominids, like other mammals today and a good number of non-mammals too, had a wordless language that communicated more directly, and more honestly, than any of the thousands of ways to string syllables into sentences.

That language still exists, of course, and those who understand it know when someone is afraid, relieved, frustrated, angry, confused, surprised, embarrassed, or awed, no matter what that someone says. It is a language of feeling—of sighs, grunts, rumbles, moans, whistles, sniffs, laughs, sobs, and so forth. Researchers call them “vocal bursts” and as any long-suffering married couple can tell you, they communicate a whole range of specific feelings.

“Emotional expressions,” says UC Berkeley psychology graduate student Alan Cowen, “color our social interactions with spirited declarations of our inner feelings that are difficult to fake, and that our friends, co-workers and loved ones rely on to decipher our true commitments.” Cowen and his colleagues devised a study to test the range of emotion vocal bursts can carry.

The researchers asked 56 people, reports Discover magazine, “some professional actors and some not, to react to different emotional scenarios” in recordings. Next, they played the recordings for over 1,000 people, who rated “the vocalizations based on the emotions and tone (positive or negative) they thought the clips conveyed.”

The researchers found that “vocal bursts convey at least 24 distinct kinds of emotions.” They plotted those feelings on a colorful interactive map, publicly available online. "The team says it could be useful in helping robotic devices better pin down human emotions,” Discover writes. “It could also be handy in clinical settings, helping patients who struggle with emotional processing.” The study only recorded vocalizations from English speakers, and "the results would undoubtedly vary if people from other countries or who spoke other languages were surveyed.”

But this limitation does not undermine another implication of the study: that human language consists of far more than just words, and that vocal bursts, which we likely share with a wide swath of the animal kingdom, are not only, perhaps, an original language but also one that continues to communicate the things we can’t or won’t say to each other. Read the study here and see the interactive vocal burst map here.

via MetaFilter

Related Content:

Where Did the English Language Come From?: An Animated Introduction

Why We Say “OK”: The History of the Most Widely Spoken Word in the World

The History of the English Language in Ten Animated Minutes

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

The Strange Dancing Plague of 1518: When Hundreds of People in France Could Not Stop Dancing for Months

If you find yourself thinking you aren’t a victim of fashion, maybe take another look. Yes, we can consciously train ourselves to resist trends through force of habit. We can declare our preferences and stand on principle. But we aren't consciously aware of what's happening in the hidden turnings of our brains. Maybe what we call the unconscious has more control over us than we would like to think.

Inexplicable episodes of mass obsession and compulsion serve as disquieting examples. Mass panics and delusions tend to occur, argues author John Waller, “in people who are under extreme psychological distress, and who believe in the possibility of spirit possession. All of these conditions were satisfied in Strasbourg in 1518,” the year the Dancing Plague came to the town in Alsace—an involuntary communal dance festival with deadly outcomes.

The event began with one person, as you’ll learn in the almost jaunty animated BBC video below, a woman known as Frau Troffea. One day she began dancing in the street. People came out of their houses and gawked, laughed, and clapped. Then she didn’t stop. She “continued to dance, without resting, morning, afternoon, and night for six whole days.” Then her neighbors joined in. Within a month, 400 people were “dancing relentlessly without music or song.”

We might expect that town leaders in this late-Medieval period would have declared it a mass possession event and commenced with exorcisms or witch burnings. Instead, it was said to be a natural phenomenon. Drawing on humoral theory, “local physicians blamed it on ‘hot blood,’” History.com’s Evan Andrews writes. They “suggested the afflicted simply gyrate the fever away. A stage was constructed and professional dancers were brought in. The town even hired a band to provide backing music.”

Soon, however, bloody and exhausted, people began dying from strokes and heart attacks. The dancing went on for months. It was not a fad. No one was enjoying themselves. On the contrary, Waller writes, “contemporaries were certain that the afflicted did not want to dance and the dancers themselves, when they could, expressed their misery and need for help.” This contradicts suggestions they were willing members of a cult, and paints an even darker picture of the event.

Certain psychonauts might see in the 1518 Dancing Plague a shared unconscious, working something out while dragging the poor Strasbourgians along behind it. Other, more or less plausible explanations have included ergotism, or poisoning “from a psychotropic mould that grows on stalks of rye." However, Waller points out, ergot “typically cuts off blood supply to the extremities making coordinated movement very difficult.”

He suggests the dancing mania came about through the meeting of two prior conditions: “The city’s poor were suffering from severe famine and disease,” and many people in the region believed they could obtain good health by dancing before a statue of Saint Vitus. They also believed, he writes, that “St. Vitus… had the power to take over their minds and inflict a terrible, compulsive dance. Once these highly vulnerable people began to anticipate the St. Vitus curse they increased the likelihood that they’d enter the trance state.”

The mystery cannot be definitively solved, but it does seem that what Waller calls “fervent supernaturalism” played a key role, as it has in many mass hysterias, including “ten such contagions which had broken out along the Rhine and Moselle rivers since 1374,” as the Public Domain Review notes. Further up, see a 1642 engraving based on a 1564 drawing by Pieter Brueghel of another dancing epidemic which occurred that year in Molenbeek. The 17th-century German engraving above of a dancing epidemic in a churchyard features a man holding a severed arm.

We see mass panics and delusions around the world, for reasons that are rarely clear to scholars, psychiatrists, historians, anthropologists, and physicians during or after the fact. What is medically known as Saint Vitus dance, or Sydenham’s Chorea, has recognized physical causes like rheumatic fever and occurs in a specific subset of the population. The historical Saint Vitus Dance, or Dancing Plague, however, affected people indiscriminately and seems to have been a phenomenon of mass suggestion, like many other social-psychological events around the world.

Episodes of epidemic manias related to outmoded supernatural beliefs can seem especially bizarre, but the mass psychology of 21st century western culture includes many episodes of social contagion and compulsion no less strange, and perhaps no less widespread or deadly, especially during times of extreme stress.

via Public Domain Review

Related Content:

Oliver Sacks Explains the Biology of Hallucinations: “We See with the Eyes, But with the Brain as Well”

Behold the Mysterious Voynich Manuscript: The 15th-Century Text That Linguists & Code-Breakers Can’t Understand

A Free Yale Course on Medieval History: 700 Years in 22 Lectures

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Music Can Awaken Patients with Alzheimer’s and Dementia

In the late 1950s, pioneering free jazz bandleader Sun Ra played a gig at a Chicago mental hospital, booked there by his manager Alton Abraham, who had an interest in alternative medicine. The experiment in musical therapy worked wonders. One patient who had not moved or spoken in years reportedly got up, walked over to the piano, and yelled out, “you call that music!”

The anecdote illustrates just one experience among untold millions in which a person suffering from a debilitating neurological condition responds positively, even miraculously, it seems, to music.

As famed neurologist and writer Oliver Sacks puts it in his book Musicophilia, “musical perception, musical sensibility, musical emotion and musical memory can survive long after other forms of memory have disappeared.”

This medical fact makes musical therapy an ideal intervention for patients suffering from Alzheimer’s disease and dementia. In the short video above, Sacks describes his visits to patients in various old age homes. “Some of them are confused, some are agitated, some are lethargic, some have almost lost language,” he says, “but all of them, without exception, respond to music.”

We can see just such a response in the clip at the top, in which the barely responsive Henry Dryer, a 92-year-old nursing home resident with dementia, transforms when he hears music. “The philosopher Kant called music ‘the quickening art,’ and Henry’s being quickened,” says Sacks of the dramatic change, “he’s being brought to life.” Suddenly lucid and happy, Henry looks up and says, “I’m crazy about music. Beautiful sounds.”

The clip comes from a documentary called Alive Inside, winner of a 2014 Sundance Audience Award (see the trailer above), a film that shows us several musical “quickenings” like Henry’s. “Before Dryer started using his iPod,” notes The Week, “he could only answer yes-or-no questions—and sometimes he sat silently and still for hours at a time.” Now, he sings, carries on conversations and can “even recall things from years ago.”

Sacks comments that “music imprints itself on the brain deeper than any other human experience,” evoking emotions in ways that nothing else can. A 2010 Boston University study showed that Alzheimer’s patients “learned more lyrics when they were set to music rather than just spoken.” Likewise, researchers at the University of Utah found music to be “an alternative route for communicating with patients.”

As Dr. Norman Foster, senior author of the Utah study, says, “language and visual memory pathways are damaged early as the disease progresses, but personalized music programs can activate the brain, especially for patients who are losing contact with their environment.” See the effects for yourself in this extraordinary film, and learn more about Sacks' adventures with music and the brain in the 2007 discussion of Musicophilia, just above.

Related Content:

Sun Ra Plays a Music Therapy Gig at a Mental Hospital; Inspires Patient to Talk for the First Time in Years

Discover the Retirement Home for Elderly Musicians Created by Giuseppe Verdi: Created in 1899, It Still Lives On Today

The French Village Designed to Promote the Well-Being of Alzheimer’s Patients: A Visual Introduction to the Pioneering Experiment

In Touching Video, People with Alzheimer’s Tell Us Which Memories They Never Want to Forget

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Psilocybin Could Soon Be a Legal Treatment for Depression: Johns Hopkins Professor, Roland Griffiths, Explains How Psilocybin Can Relieve Suffering

Much of the recent scientific research into psychedelics has picked up where researchers left off in the mid-20th century, before LSD, psilocybin, and other psychoactive drugs became countercultural means of consciousness expansion, and then banned, illegal substances the government sought to control. Scientists from several fields studied psychedelics as treatments for addiction, depression, and anxiety, and as aids in end-of-life care. These applications were conceived and tested several decades ago.

Now, thanks to some serious investment from high-profile institutions like Johns Hopkins University, and thanks to changing government attitudes toward psychoactive drugs, it may be possible for psilocybin, the active ingredient in “magic mushrooms,” to get legal approval for therapy in a clinical setting by 2021. “For the first time in U.S. history,” Shelby Hartman reports at Rolling Stone, “a psychedelic drug is on the fast track to getting approved for treating depression by the federal government.”

As Michael Pollan has detailed in his latest book, How to Change Your Mind, the possibilities for psilocybin and other such drugs are vast. “But before the Food and Drug Administration can be petitioned to reclassify it,” Brittany Shoot notes at Fortune, the drug “first has to clear phase III clinical trials. The entire process is expected to take about five years.” In the TEDMED video above, you can see Roland R. Griffiths, Professor of Psychiatry and Behavioral Sciences at Johns Hopkins, discuss the ways in which psilocybin, “under supported conditions, can occasion mystical-type experiences associated with enduring positive changes in attitudes and behavior.”

The implications of this research span the fields of ethics and medicine, psychology and religion, and it’s fitting that Dr. Griffiths leads off with a statement about the compatibility of spirituality and science, supported by a quote from Einstein, who said “the most beautiful and profound emotion we can experience is the sensation of the mystical. It’s the source of all true science.” But the work Griffiths and others have been engaged in is primarily practical in nature—though it does not at all exclude the mystical—like finding effective means to treat depression in cancer patients, for example.

“Sixteen million Americans suffer from depression and approximately one-third of them are treatment resistant,” Hartman writes. “Depression is also an epidemic worldwide, affecting 300 million people around the world.” Psychotropic drugs like psilocybin, LSD, and MDMA (which is not classified as a psychedelic), have been shown for a long time to work for many people suffering from severe mental illness and addictions.

Although such drugs present some potential for abuse, they are not highly addictive, especially relative to the flood of opioids on the legal market that are currently devastating whole communities as people use them to self-medicate. It seems that what has most prevented psychedelics from being researched and prescribed has as much or more to do with long-standing prejudice and fear as it does with a genuine concern for public health. (And that’s not even to mention the financial interests who exert tremendous pressure on drug policy.)

But now, Hartman writes, “it appears [researchers] have come too far to go back—and the federal government is finally recognizing it, too.” Find out why this research matters in Dr. Griffiths' talk, Pollan’s book, the Multidisciplinary Association for Psychedelic Studies, and some of the posts we’ve linked to below.

Related Content:

How to Use Psychedelic Drugs to Improve Mental Health: Michael Pollan’s New Book, How to Change Your Mind, Makes the Case

New LSD Research Provides the First Images of the Brain on Acid, and Hints at Its Potential to Promote Creativity

Artist Draws 9 Portraits While on LSD: Inside the 1950s Experiments to Turn LSD into a “Creativity Pill”

When Aldous Huxley, Dying of Cancer, Left This World Tripping on LSD, Experiencing “the Most Serene, the Most Beautiful Death” (1963)

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness
