The Psychological & Neurological Disorders Experienced by Characters in Alice in Wonderland: A Neuroscience Reading of Lewis Carroll’s Classic Tale

Most reputable doctors tend to refrain from diagnosing people they’ve never met or examined. Unfortunately, this circumspection doesn't obtain as often among lay folk. When we lob uninformed diagnoses at other people, we may do those with genuine mental health issues a serious disservice. But what about fictional characters? Can we ascribe mental illnesses to the surreal menagerie, say, in Lewis Carroll’s Alice’s Adventures in Wonderland? It’s almost impossible not to, given the overt themes of madness in the story.

Carroll himself, it seems, drew many of his depictions directly from the treatment of mental disorders in 19th century England, many of which were linked to “extremely poor working conditions,” notes Franziska Kohlt at The Conversation. During the industrial revolution, “populations in so-called ‘pauper lunatic asylums’ for the working class skyrocketed.” Carroll’s uncle, Robert Wilfred Skeffington Lutwidge, happened to be an officer of the Lunacy Commission, which supervised such institutions, and his work offers “stunning insights into the madness in Alice.”

Yet we should be careful. Like the supposed drug references in Alice, some of the lay diagnoses now applied to Alice’s characters may be a little far-fetched. Do we really see diagnosable PTSD or Tourette’s? Anxiety Disorder and Narcissistic Personality Disorder? These conditions hadn’t been categorized in Carroll’s day, though their symptoms are nothing new. And yet, experts have long looked to his nonsense fable for its depictions of abnormal psychology. One British psychiatrist didn’t just diagnose Alice; he named a condition after her.

In 1955, Dr. John Todd coined the term Alice in Wonderland Syndrome (AIWS) to describe a rare condition in which—write researchers in the Journal of Pediatric Neurosciences—“the sizes of body parts or sizes of external objects are perceived incorrectly.” Among other illnesses, Alice in Wonderland Syndrome may be linked to migraines, which Carroll himself reportedly suffered.

We might justifiably assume the Mad Hatter has mercury poisoning, but what other disorders might the text plausibly present? Holly Barker, doctoral candidate in clinical neuroscience at King’s College London, has used her scholarly expertise to identify and describe in detail two other conditions she thinks are evident in Alice.

Depersonalization:

“At several points in the story,” writes Barker, “Alice questions her own identity and feels ‘different’ in some way from when she first awoke.” Seeing in these descriptions the symptoms of Depersonalization Disorder (DPD), Barker describes the condition and its location in the brain.

This disorder encompasses a wide range of symptoms, including feelings of not belonging in one’s own body, a lack of ownership of thoughts and memories, that movements are initiated without conscious intention and a numbing of emotions. Patients often comment that they feel as though they are not really there in the present moment, likening the experience to dreaming or watching a movie. These symptoms occur in the absence of psychosis, and patients are usually aware of the absurdity of their situation. DPD is often a feature of migraine or epileptic auras and is sometimes experienced momentarily by healthy individuals, in response to stress, tiredness or drug use.

Also highly associated with childhood abuse and trauma, the condition “acts as a sort of defense mechanism, allowing an individual to become disconnected from adverse life events.” Perhaps there is PTSD in Carroll’s text after all, since an estimated 51% of DPD patients also meet those criteria.

Prosopagnosia:

This condition is characterized by “the selective inability to recognize faces.” Though it can be hereditary, prosopagnosia can also result from stroke or head trauma. Fittingly, the character supposedly affected by it is none other than Humpty-Dumpty, who tells Alice “I shouldn’t know you again if we did meet.”

“Your face is the same as everybody else has – the two eyes, so-” (marking their places in the air with his thumb) “nose in the middle, mouth under. It’s always the same. Now if you had two eyes on the same side of the nose, for instance – or the mouth at the top – that would be some help.”

This “precise description” of prosopagnosia shows how individuals with the condition rely on particularly “discriminating features to tell people apart," since they are unable to distinguish family members and close friends from total strangers.

Scholars know that Carroll’s text contains several abstract and seemingly absurd mathematical concepts, such as imaginary numbers and projective geometry. The informed work of researchers like Kohlt and Barker shows that Alice’s Adventures in Wonderland might also present a complex 19th century understanding of mental illness and neurological disorders, conveyed in a superficially silly way, but possibly informed by serious research and observation. Read Barker’s article in full here to learn more about the conditions she diagnoses.

Related Content:

Lewis Carroll’s Photographs of Alice Liddell, the Inspiration for Alice in Wonderland

See The Original Alice In Wonderland Manuscript, Handwritten & Illustrated By Lewis Carroll (1864)

See Salvador Dali’s Illustrations for the 1969 Edition of Alice’s Adventures in Wonderland

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness.

What Is Procrastination & How Can We Solve It? An Introduction by One of the World’s Leading Procrastination Experts

I don’t know about you, but my tendency to procrastinate feels like a character flaw. And yet, no amount of moralizing with myself makes any difference. Feeling bad, in fact, only makes things worse. Perhaps that’s because—as Tim Pychyl, Associate Professor in Psychology at Carleton University, argues—procrastination is not a moral failing so much as a coping mechanism for painful feelings, a psychological avoidance of tasks we fear for some reason: because we fear rejection or failure, or even the burdens of success.

Pychyl should know. He's made studying procrastination the basis of his career and runs the 20-year-old Procrastination Research Group. Procrastination is a “puzzle,” he theorizes (the title of one of his books is Solving the Procrastination Puzzle: A Concise Guide to Strategies for Change). Solving it involves understanding how its pieces work, including our beliefs about how it operates. Pychyl's lecture above addresses graduate students charged with helping undergraduates who procrastinate, but its lessons apply to all of us. In his first slide, Pychyl outlines four typical beliefs about procrastination:

It’s me

It’s the task

It’s the way I think

It’s my lack of willpower

Pychyl wants to debunk these notions, but he also argues that procrastination is “something we seem to understand very well” in popular parlance. One of his slides shows a typical “successories”-type poster that reads, “Procrastination: hard work often pays off after time, but laziness always pays off now.” While Pychyl doesn’t use judgmental language like “laziness,” he does acknowledge that procrastination results from ideas about short- versus long-term gain. We want to feel good, right now, a drive common to everyone.

The next poster reads “if the job’s worth doing, it will still be worth doing tomorrow.” The notion of the “future self” plays a role—the you of tomorrow who still has to face the work your present self puts off. “What are we doing to 'future self?'” Pychyl asks. “If we can just bring future self into clearer vision, lots of times the procrastination may go away.” This has been demonstrated in research studies, Ana Swanson notes at The Washington Post, in which people made better decisions after viewing digitally-aged photographs of themselves. But in general, we tend not to have much consideration for "future self."

A final successories slide reads, “Procrastination: by not doing what you should be doing, you could be having this much fun.” This is one of the most pervasive forms of self-delusion. We may convince ourselves that putting difficult things off for tomorrow means more fun today. But the amount of guilt we feel ensures a different experience. “Guilt is a paralyzing emotion,” Pychyl says. When we put off an important task, we feel terrible. And often, instead of enjoying life, we create more work for ourselves that makes us feel purposeful, like cooking or cleaning. This “task management” game temporarily relieves guilt, but it does not address the central problem. We simply “manage our emotions by managing our tasks.”

The word procrastination comes directly from classical Latin and translates to “put forward” that which “belongs to tomorrow.” This sounds benign, given that many a task does indeed belong to tomorrow. But prudent planning is one thing, procrastination is another. When we put off what we can or should accomplish today, we invoke tomorrow as “a mystical land where 98% of all human productivity, motivation, and achievement are stored.” The distinction between planning or unavoidable delay and procrastination is important. When delays are either intentional or the consequence of unpredictable life events, we need not consider them a problem. “All procrastination is delay, but not all delay is procrastination.”

So, to sum up Pychyl’s research on our attitudes about procrastination: “we think we’re having more fun, but we’re not”; “we think we’re not affecting future self, but we are”; and “it’s all about giving in to feel good,” which—see point number one—doesn’t actually work that well.

While we might minimize procrastination as a minor issue, its personal costs tell us otherwise, including severe impacts on “performance, well-being, health, relationships, regrets & bereavement.” Procrastinators get sick more often, report higher rates of depression, and suffer the somatic and psychological effects of elevated stress. Procrastination doesn’t only affect our personal well-being and integrity; it also has an ethical dimension, affecting those around us who suffer “second-hand,” either because of the time we take away from them when we rush off to finish things last-minute, or because the stress we put ourselves under negatively affects the health of our relationships.

But procrastination begins first and foremost with our relationship to ourselves. Again, we put things off not because we are morally deficient, or “lazy,” but because our emotional brains are trying to cope. We feel some significant degree of fear or anxiety about the task at hand. The guilt and shame that comes with not accomplishing the task compounds the problem, and leads to further procrastination. “The behavior,” writes Swanson, turns into “a vicious, self-defeating cycle.”

How do we get out of the self-made loop of procrastination? Just as in the failure of the “Just say No” campaign, simply shaking ourselves by the metaphorical shoulders and telling ourselves to get to work isn’t enough. We have to deal with the emotions that set things in motion, and in this case, that means going easy on ourselves. “Research suggests that one of the most effective things that procrastinators can do is to forgive themselves for procrastinating,” Swanson reports.

Once we reduce the guilt, we can weaken the proclivity to procrastinate. Then, paradoxically, we need to ignore our emotions. “Most of us seem to tacitly believe,” Pychyl says, “that our emotional state has to match the task at hand.” For writers and artists, this belief has a lofty pedigree in romantic ideas about inspiration and muses. Irrelevant, the procrastination expert says. When approaching something difficult, “I have to recognize that I’m rarely going to feel like it, and it doesn’t matter if I don’t feel like it.” Feelings of motivation and creative inspiration often strike us in the midst of a task, not before. Breaking down daunting activities into smaller tasks, and approaching these one at a time, gives us a practical roadmap for conquering procrastination. For more insights and research findings, watch Pychyl’s full lecture, and listen to him discuss his research on the Healthy Family podcast just above.

Related Content:

The Neuroscience & Psychology of Procrastination, and How to Overcome It

How Information Overload Robs Us of Our Creativity: What the Scientific Research Shows

Why You Do Your Best Thinking In The Shower: Creativity & the “Incubation Period”

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How the Japanese Practice of “Forest Bathing”—Or Just Hanging Out in the Woods—Can Lower Stress Levels and Fight Disease

When the U.S. media began reporting on the phenomenon of “forest bathing” as a therapy for mental and physical health, the online commentariat—as it will—mocked the concept relentlessly as yet another pretentious, bourgeois repackaging of something thoroughly mundane. Didn’t we once just call it “going outside”?

Well, yes, if all “forest bathing” means is “going outside,” then it does sound like a grandiose and unnecessary phrase. The term, however, is not an American marketing invention but a translation of the Japanese shinrin-yoku. “Coined by the Japanese Ministry of Agriculture, Forestry and Fisheries in 1982,” writes Meeri Kim at The Washington Post, “the word literally translates to ‘taking in the forest atmosphere’ or ‘forest bathing’ and refers to the process of soaking up the sights, smells and sounds of a natural setting to promote physiological and psychological health.”

So what? We already have the examples of thousands of years of Buddhist monks (and Thich Nhat Hanh), of Henry David Thoreau, and the saints of the Sierra Club. But the oldest and most useful ideas and practices can get carelessly discarded in the frantic pursuit of innovation at all costs. The pushing of hi-tech outdoor gear, wearable activity trackers, and health apps that ask us to log every movement can make going outside feel like a daunting, expensive chore or a competitive event.

Forest bathing involves none of those things. “Just be with the trees,” as Ephrat Livni describes the practice, “no hiking, no counting steps on a Fitbit. You can sit or meander, but the point is to relax rather than accomplish anything.” You don't have to hug the trees if you don't want to, but at least sit under one for a spell. Even if you don't attain enlightenment, you very well may reduce stress and boost immune function, according to several Japanese studies conducted between 2004 and 2012.

The Japanese government spent around four million dollars on studies conducted with hundreds of people "bathing" on 48 designated therapy trails. In his work, Qing Li, associate professor at Nippon Medical School in Tokyo, found “significant increases in NK [natural killer] cell activity in the week after a forest visit… positive effects lasted a month following each weekend in the woods.” Natural killer cells fight viruses and cancers, and are apparently stimulated by the oils that trees themselves secrete to ward off germs and pests. See the professor explain in the video above (he translates shinrin-yoku as taking a "forest shower," and also claims to have bottled some of the effects).

Additionally, experiments conducted by Japan’s Chiba University found that forest bathing lowered heart rate and blood pressure and brought down levels of cortisol, the stress hormone that can wreak havoc on every system when large amounts circulate through the body. Then there are the less tangible psychological benefits of taking in the trees. Subjects in one study “showed significantly reduced hostility and depression scores” after a walk in the woods. These findings underscore that spending time in the forest is a medical intervention as well as an aesthetic and spiritual one, something scientists have long observed but haven’t been able to quantify.

In their review of a book called Your Brain on Nature, Mother Earth News quotes Franklin Hough, first chief of the U.S. Division of Forestry, who remarked in a 19th century medical journal that forests have “a cheerful and tranquilizing influence which they exert upon the mind, more especially when worn down by mental labor.” Hough’s hypothesis has been confirmed, and despite what might sound to English speakers like a slightly ridiculous name, forest bathing is serious therapy, especially for the ever-increasing number of urbanites and those who spend their days in strip malls, office complexes, and other overbuilt environments.

What is a guided forest bathing experience like? You can listen to NPR's Alison Aubrey describe one above. She quotes Amos Clifford, founder of the Association of Nature & Forest Therapy, the certifying organization, as saying that a guide "helps you be here, not there," sort of like a meditation instructor. Clifford has been pushing health care providers to "incorporate forest therapy as a stress-reduction strategy" in the U.S., and there's no question that more stress reduction tools are sorely needed.

But, you may wonder, do you have to call it “forest bathing,” or pay for a certified guide, join a group, and buy some fancy outerwear to get the benefits of hanging out with trees? I say, consider the words of John Muir, the indefatigable 19th century naturalist, "father of the National Park System," and founding saint of the Sierra Club: “In the eternal youth of Nature you may renew your own. Go quietly, alone; no harm will befall you.” The quote may underestimate the amount of risk or overstate the benefits, but you get the idea. Muir was not one to get tangled up in semantics or overly detailed analysis. Nonetheless, his work inspired Americans to step in and preserve so much of the country's forest in the 19th and 20th centuries. Maybe the preventative medicine of "forest bathing" can help do the same in the 21st.

Related Content:

How Walking Fosters Creativity: Stanford Researchers Confirm What Philosophers and Writers Have Always Known

How Mindfulness Makes Us Happier & Better Able to Meet Life’s Challenges: Two Animated Primers Explain

This Is Your Brain on Exercise: Why Physical Exercise (Not Mental Games) Might Be the Best Way to Keep Your Mind Sharp

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Depression & Melancholy: Animated Videos Explain the Crucial Difference Between Everyday Sadness and Clinical Depression

“Depression,” the TED-Ed video above informs us, “is the leading cause of disability in the world.” This may be a hard fact to swallow, the product, we might think, of pharmaceutical advertising. We all feel down from time to time, we think. “Then circumstances change, and those sad feelings disappear.” Isn’t it like this for everyone? It is not. “Clinical depression is different. It’s a medical disorder, and it won’t go away just because you want it to.”

Depression lingers for at least two consecutive weeks, and can become so debilitating that sufferers cannot work or play. It interferes with important relationships and “can have a lot of different symptoms: a low mood, loss of interest in things you’d normally enjoy, changes in appetite, feeling worthless or excessively guilty,” restlessness and insomnia, or extreme lethargy, poor concentration, and possible thoughts of suicide. But surely we can hear a paid promotional voice when the narrator states, “If you have at least 5 of those symptoms, according to psychiatric guidelines, you qualify for a diagnosis of depression.”

What we don’t typically hear about in pharmaceutical ads are the measurable physiological changes depression writes in the brain, including decreased brain matter in the frontal lobe and atrophy of the hippocampus. These effects are measurable in humans and rats, in study after study after study. But while most of us know the names of a neurotransmitter or two these days, not even neuroscientists fully understand the biology of depression. They do know that some combination of medication, therapy, and, in extreme cases, electroconvulsive treatment can allow people to experience life more fully.

People in treatment will still feel “down” on occasion, just like everyone does. But depression, the explainer wants us to understand, should never be compared to ordinary sadness. Its effects on behavior and brain health are too wide-ranging, pervasive, persistent, and detrimental. These effects can be invisible, which adds to an unfortunate social stigma that dissuades people from seeking treatment. The more we talk about depression openly, rather than treating it as a shameful secret, the more likely people at risk will be to seek help.

Just as depression cannot be alleviated by trivializing or ignoring it, the condition does not respond to being romanticized. While, indeed, many a famous painter, poet, actor, etc. has suffered from clinical depression—and made it a part of their art—their examples should not suggest to us that artists shouldn’t get treatment. Sadness is never trivial.

Unlike physical pain, sadness can be difficult to trace to a direct cause. As the short video above demonstrates, the assumption that sadness is caused by external events arose relatively recently. The humoral system of the ancient Greeks treated all sadness as a biological phenomenon. Greek physicians believed it was an expression of black bile, or “melaina chole,” from which we derive the word “melancholy.” It seems we’ve come full circle, in a way. Ancient humoral theorists recommended nutrition, medical treatment, and physical exercise as treatments for melancholia, just as doctors do today for depression.

But melancholy is a much broader term, not a scientific designation; it is a collection of ideas about sadness that span thousands of years. Nearly all of those ideas include some sense that sadness is an essential experience. “If you’ve never felt melancholy,” the narrator says, “you’ve missed out on part of what it means to be human.” Thinkers have described melancholia as a precursor to, or inevitable result of, acquiring wisdom. One key example, Robert Burton’s 1621 text The Anatomy of Melancholy, "the apogee of Renaissance scholarship," set the tone for discussions of melancholy for the next few centuries.

The scientific/philosophical/literary text argues, “he that increaseth wisdom, increaseth sorrow,” a sentiment the Romantic poets turned on its head. Before them came John Milton, whose 1645 poem Il Penseroso addresses melancholy as “thou Goddes, sage and holy… Sober, stedfast, and demure.” The deity Melancholy oversees the contemplative life and reveals essential truths through “Gorgeous Tragedy.”

One of the poem’s loftiest themes showed the way forward for the Romantics: “The poet who seeks to attain the highest level of creative expression must embrace the divine,” write Milton scholars Katherine Lynch and Thomas H. Luxon, "which can only be accomplished by following the path set out in Il Penseroso.” The divine, in this case, takes the form of sadness personified. Yet this poem cannot be read in isolation: its companion, L’Allegro, praises Mirth, and of sadness says, “Hence loathed Melancholy / Of Cerberus, and blackest midnight born, / In Stygian Cave forlorn / ‘Mongst horrid shapes, and shrieks, and sights unholy.”

Rather than contradict each other, these two characterizations speak to the ambivalent attitudes, and vastly different experiences, humans have about sadness. Fleeting bouts of melancholy can be sweet, touching, and beautiful, inspiring art, music, and poetry. Sadness can force us to reckon with life’s unpleasantness rather than deny or avoid it. On the other hand, in its most extreme, chronically intractable forms, such as what we now call clinical depression, sadness can destroy our capacity to act, to appreciate beauty, and to learn important lessons, marking the critical difference between a universal existential condition and a physical disease that is, thankfully, treatable.

Related Content:

Stanford’s Robert Sapolsky Demystifies Depression, Which, Like Diabetes, Is Rooted in Biology

How Baking, Cooking & Other Daily Activities Help Promote Happiness and Alleviate Depression and Anxiety

A Unified Theory of Mental Illness: How Everything from Addiction to Depression Can Be Explained by the Concept of “Capture”

Stephen Fry on Coping with Depression: It’s Raining, But the Sun Will Come Out Again

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Why Incompetent People Think They’re Amazing: An Animated Lesson from David Dunning (of the Famous “Dunning-Kruger Effect”)

The business world has long had special jargon for the Kafkaesque incompetence bedeviling the ranks of upper management. There is “the Peter principle,” first described in a satirical book of the same name in 1968. More recently, we have the positive notion of “failing upward.” The concept has inspired a mantra, “fail harder, fail faster,” as well as popular books like The Gift of Failure. Famed research professor, author, and TED talker Brené Brown has called TED “the failure conference," and indeed, a “FailCon” does exist, “in over a dozen cities on 6 continents around the globe.”

The candor about this most unavoidable of human phenomena may prove a boon to public health, lowering levels of hypertension by a significant margin. But is there a danger in praising failure too fervently? (Samuel Beckett’s quote on the matter, beloved by many a 21st century thought leader, proves decidedly more ambiguous in context.) Might it present an even greater opportunity for people to “rise to their level of incompetence”? Given the prevalence of the “Dunning-Kruger Effect,” a cognitive bias explained by John Cleese in a previous post, we may not be well-placed to know whether our efforts constitute success or failure, or whether we actually have the skills we think we do.

First described by social psychologists David Dunning (University of Michigan) and Justin Kruger (N.Y.U.) in 1999, the effect “suggests that we’re not very good at evaluating ourselves accurately.” So says the narrator of the TED-Ed lesson above, scripted by Dunning and offering a sober reminder of the human propensity for self-delusion. “We frequently overestimate our own abilities,” resulting in widespread “illusory superiority” that makes “incompetent people think they’re amazing.” The effect greatly intensifies at the lower end of the scale; it is often “those with the least ability who are most likely to overrate their skills to the greatest extent.” Or as Cleese plainly puts it, some people “are so stupid, they have no idea how stupid they are.”

Combine this with the converse effect—the tendency of skilled individuals to underrate themselves—and we have the preconditions for an epidemic of mismatched skill sets and positions. But while imposter syndrome can produce tragic personal results and deprive the world of talent, the Dunning-Kruger effect’s worst consequences affect us all. People “measurably poor at logical reasoning, grammar, financial knowledge, math, emotional intelligence, running medical lab tests, and chess all tend to rate their expertise almost as favorably as actual experts do.” When such people get promoted up the chain, they can unwittingly do a great deal of harm.

While arrogant self-importance plays its role in fostering delusions of expertise, Dunning and Kruger found that most of us are subject to the effect in some area of our lives simply because we lack the skills to understand how bad we are at certain things. We don't know the rules well enough to successfully, creatively break them. Until we have some basic understanding of what constitutes competence in a particular endeavor, we cannot even understand that we’ve failed.

Real experts, on the other hand, tend to assume their skills are ordinary and unremarkable. “The result is that people, whether they’re inept or highly skilled, are often caught in a bubble of inaccurate self-perception." How can we get out? The answers won’t surprise you. Listen to constructive feedback and never stop learning, behavior that can require a good deal of vulnerability and humility.

Related Content:

John Cleese on How “Stupid People Have No Idea How Stupid They Are” (a.k.a. the Dunning-Kruger Effect)

Research Finds That Intellectual Humility Can Make Us Better Thinkers & People; Good Thing There’s a Free Course on Intellectual Humility

The Power of Empathy: A Quick Animated Lesson That Can Make You a Better Person

Free Online Psychology & Neuroscience Courses

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Scientology Works: A Primer Based on a Reading of Paul Thomas Anderson’s Film, The Master

Paul Thomas Anderson's The Master focuses, with almost unbearable intensity, on two characters: Joaquin Phoenix's impulsive ex-sailor Freddie Quell, and Philip Seymour Hoffman's Lancaster Dodd, "the founder and magnetic core of the Cause — a cluster of folk who believe, among other things, that our souls, which predate the foundation of the Earth, are no more than temporary residents of our frail bodily housing," writes The New Yorker's Anthony Lane in his review of the film. "Any relation to persons living, dead, or Scientological is, of course, entirely coincidental."

Before The Master came out, rumors swirled that the film would mount a scathing critique of the Church of Scientology; now we know that it accomplishes something, par for the course for Anderson, much more fascinating and artistically idiosyncratic.

Few of its gloriously 65-millimeter-shot scenes seem to have much to say, at least directly, about Scientology or any other system of thought. But perhaps the most memorable, in which Dodd, having discovered Freddie stowed away aboard his chartered yacht, offers him a session of "informal processing," does indeed have much to do with the faith founded by L. Ron Hubbard — at least if you believe the analysis of Evan Puschak, better known as the Nerdwriter, who argues that the scene "bears an unmistakable reference to a vital activity within Scientology called auditing."

Just as Dodd does to Freddie, "the auditor in Scientology asks questions of the 'preclear' with the goal of ridding him of 'engrams,' the term for traumatic memory stored in what's called the 'reactive mind.'" By thus "helping the preclear relive the experience that caused the trauma," the auditor accomplishes a goal that, in a clip Puschak includes in the essay, Hubbard lays out himself: to "show a fellow that he's mocking up his own mind, therefore his own difficulties; that he is not completely adrift in, and swamped by, a body." Scientological or not, such notions do intrigue the desperate, drifting Freddie, and although the story of his and Dodd's entwinement, as told by Anderson, still divides critical opinion, we can say this for sure: it beats Battlefield Earth.

Related Content:

When William S. Burroughs Joined Scientology (and His 1971 Book Denouncing It)

The Career of Paul Thomas Anderson: A 5-Part Video Essay on the Auteur of Boogie Nights, Punch-Drunk Love, The Master, and More

Space Jazz, a Sonic Sci-Fi Opera by L. Ron Hubbard, Featuring Chick Corea (1983)

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Research Finds That Intellectual Humility Can Make Us Better Thinkers & People; Good Thing There’s a Free Course on Intellectual Humility

We may have grown used to hearing about the importance of critical thinking, and stowed away knowledge of logical fallacies and cognitive biases in our argumentative toolkit. But were we to return to the philosophical sources of informal logic, we would find that we have grasped only some of the principles of reason. The others involve questions of what we might call virtue or character—what for the Greeks fell into the categories of ethos and pathos. The principle of charity, for example, in which we give our opponents a fair hearing and respond to the best version of their arguments as we understand them. And the principle, exemplified by Plato’s Socrates, of intellectual humility. Or as one punk band put it in their Socratic tribute: “All I know is that I don’t know. All I know is that I don’t know nothing.”

Intellectual humility is not, contrary to most popular appearances, reflexively according equal weight to “both sides” of every argument or assuming that everyone’s opinion is equally valid. These are forms of mental laziness and ethical abdication. It is, however, believing in our own fallibility and opening ourselves up to hearing arguments without immediately forming a judgment about them or the people who make them. We do not abandon our reason and values, we strengthen them, argues Mark Leary, by “not being afraid of being wrong.” Leary, professor of psychology and neuroscience at Duke University, is the lead author of a new study on intellectual humility that found “essentially no difference between liberals and conservatives or between religious and nonreligious people” when it comes to intellectual humility.

The study challenges many ideas that can prevent dialogue. “There are stereotypes about conservatives and religiously conservative people being less intellectually humble about their beliefs," says Leary. But he and his colleagues “didn’t find a shred of evidence to support that.” This doesn’t necessarily mean that such people have high degrees of intellectual humility, only that all of us, perhaps equally, possess fairly low levels of the trait. I’ll be the first to admit that it is not an easy one to develop, especially when we’re on the defensive for some seemingly good reasons—and when we live in a culture that encourages us to make decisions and take actions on the strength of an image, some minimal text, and a few buttons that lead us right to our bank accounts. (To quote Operation Ivy again, “We get told to decide. Just like as if I’m not gonna change my mind.”)

But in the Duke study, reports Alison Jones at Duke Today, “those who displayed intellectual humility did a better job of evaluating the quality of evidence.” They took their time to make careful considerations. And they were generally more charitable and “less likely to judge a writer’s character based on his or her views.” By contrast, “intellectually arrogant” people gave writers with whom they disagreed “low scores in morality, honesty, competence, and warmth.” As a former teacher of rhetoric, I wonder whether the researchers accounted for the quality and persuasiveness of the writing itself. Nonetheless, this observation underscores the problem of conflating an author’s work with his or her character. Moral judgment can inhibit intellectual curiosity and open-mindedness. Intellectually arrogant people often resort to insults and personal attacks over thoughtful analysis.

The enormous number of assumptions we bring to almost every conversation with people who differ from us can blind us to our own faults and to other people’s strengths. But intellectual humility is not genetically determined—it is a skill that can be learned, Leary believes. Big Think recommends a free MOOC from the University of Edinburgh on intellectual humility (see an introduction to the concept at the top and a series of lectures here). “Faced with difficult questions,” explains course lecturer Dr. Ian Church, “people often tend to dismiss and marginalize dissent…. The world needs more people who are sensitive to reasons both for and against their beliefs, and are willing to consider the possibility that their political, religious and moral beliefs might be mistaken.” The course offers three different levels of engagement, from casual to quite involved, and three separate class sections at Coursera: Theory, Practice, and Science.

It’s likely that many of us need some serious preparation before we’re willing to listen to those who hold certain views. And perhaps certain views don't actually deserve a hearing. But in most cases, if we can let our guard down, set aside feelings of hostility, and become willing to learn something even from those with whom we disagree, we might be able to do what so many psychologists continue to recommend. As Cindy Lamothe writes at New York Magazine’s Science of Us blog, “we have to be willing to expose ourselves to opposing perspectives in the first place—which means that, as daunting as it may seem, listening to friends and family with radically different views can be beneficial to our long-term intellectual progress.” The holidays are soon upon us. Let the healing—or at least the charitable tolerance if you can manage it—begin.

via Big Think

Related Content:

Stephen Fry Identifies the Cognitive Biases That Make Trump Tick       

32 Animated Videos by Wireless Philosophy Teach You the Essentials of Critical Thinking

Why We Need to Teach Kids Philosophy & Safeguard Society from Authoritarian Control

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness
