Depression & Melancholy: Animated Videos Explain the Crucial Difference Between Everyday Sadness and Clinical Depression

“Depression,” the TED-Ed video above informs us, “is the leading cause of disability in the world.” This may be a hard claim to swallow, the product, we might suspect, of pharmaceutical advertising. We all feel down from time to time, we think. “Then circumstances change, and those sad feelings disappear.” Isn’t it like this for everyone? It is not. “Clinical depression is different. It’s a medical disorder, and it won’t go away just because you want it to.”

Depression lingers for at least two consecutive weeks, and can become so debilitating that sufferers cannot work or play. It interferes with important relationships and “can have a lot of different symptoms: a low mood, loss of interest in things you’d normally enjoy, changes in appetite, feeling worthless or excessively guilty,” restlessness and insomnia, or extreme lethargy, poor concentration, and possible thoughts of suicide. But surely we can hear a paid promotional voice when the narrator states, “If you have at least 5 of those symptoms, according to psychiatric guidelines, you qualify for a diagnosis of depression.”

What we don’t typically hear about in pharmaceutical ads are the measurable physiological changes depression writes in the brain, including decreased brain matter in the frontal lobe and atrophy of the hippocampus. These effects are measurable in humans and rats, in study after study. But while most of us know the names of a neurotransmitter or two these days, not even neuroscientists fully understand the biology of depression. They do know that some combination of medication, therapy, and, in extreme cases, electroconvulsive therapy, can allow people to more fully experience life.

People in treatment will still feel “down” on occasion, just like everyone does. But depression, the explainer wants us to understand, should never be equated with ordinary sadness. Its effects on behavior and brain health are too wide-ranging, pervasive, persistent, and detrimental. These effects can be invisible, which adds to an unfortunate social stigma that dissuades people from seeking treatment. The more we talk about depression openly, rather than treating it as a shameful secret, the more likely people at risk will be to seek help.

Just as depression cannot be alleviated by trivializing or ignoring it, the condition does not respond to being romanticized. While, indeed, many a famous painter, poet, and actor has suffered from clinical depression, and made it a part of their art, their examples should not convince us that artists are better off forgoing treatment. Sadness is never trivial.

Unlike physical pain, sadness can be difficult to trace to a direct cause. As the short video above demonstrates, the assumption that sadness is caused by external events arose relatively recently. The humoral system of the ancient Greeks treated all sadness as a biological phenomenon. Greek physicians believed it was an expression of black bile, or “melaina chole,” from which we derive the word "melancholy." It seems we’ve come full circle, in a way. Ancient humoral theorists recommended nutrition, medical treatment, and physical exercise as treatments for melancholia, just as doctors do today for depression.

But melancholy is a much broader term, not a scientific designation; it is a collection of ideas about sadness that span thousands of years. Nearly all of those ideas include some sense that sadness is an essential experience. “If you’ve never felt melancholy,” the narrator says, “you’ve missed out on part of what it means to be human.” Thinkers have described melancholia as a precursor to, or inevitable result of, acquiring wisdom. One key example, Robert Burton’s 1621 text The Anatomy of Melancholy, "the apogee of Renaissance scholarship," set the tone for discussions of melancholy for the next few centuries.

The scientific/philosophical/literary text argues, “he that increaseth wisdom, increaseth sorrow,” a sentiment the Romantic poets turned on its head. Before them came John Milton, whose 1645 poem Il Penseroso addresses melancholy as “thou Goddes, sage and holy… Sober, stedfast, and demure.” The deity Melancholy oversees the contemplative life and reveals essential truths through “Gorgeous Tragedy.”

One of the poem’s loftiest themes showed the way forward for the Romantics: “The poet who seeks to attain the highest level of creative expression must embrace the divine,” write Milton scholars Katherine Lynch and Thomas H. Luxon, "which can only be accomplished by following the path set out in Il Penseroso.” The divine, in this case, takes the form of sadness personified. Yet this poem cannot be read in isolation: its companion, L’Allegro, praises Mirth, and of sadness says, “Hence loathed Melancholy / Of Cerberus, and blackest midnight born, / In Stygian Cave forlorn / ‘Mongst horrid shapes, and shrieks, and sights unholy.”

Rather than contradict each other, these two characterizations speak to the ambivalent attitudes, and vastly different experiences, humans have about sadness. Fleeting bouts of melancholy can be sweet, touching, and beautiful, inspiring art, music, and poetry. Sadness can force us to reckon with life’s unpleasantness rather than deny or avoid it. On the other hand, in its most extreme, chronically intractable forms, such as what we now call clinical depression, sadness can destroy our capacity to act, to appreciate beauty, and to learn important lessons. Herein lies the critical difference between a universal existential condition and a thankfully treatable physical disease.

Related Content:

Stanford’s Robert Sapolsky Demystifies Depression, Which, Like Diabetes, Is Rooted in Biology

How Baking, Cooking & Other Daily Activities Help Promote Happiness and Alleviate Depression and Anxiety

A Unified Theory of Mental Illness: How Everything from Addiction to Depression Can Be Explained by the Concept of “Capture”

Stephen Fry on Coping with Depression: It’s Raining, But the Sun Will Come Out Again

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Why Incompetent People Think They’re Amazing: An Animated Lesson from David Dunning (of the Famous “Dunning-Kruger Effect”)

The business world has long had special jargon for the Kafkaesque incompetence bedeviling the ranks of upper management. There is “the Peter principle,” first described in a satirical book of the same name in 1969. More recently, we have the positive notion of “failing upward.” The concept has inspired a mantra, “fail harder, fail faster,” as well as popular books like The Gift of Failure. Famed research professor, author, and TED talker Brené Brown has called TED “the failure conference," and indeed, a “FailCon” does exist, “in over a dozen cities on 6 continents around the globe.”

The candor about this most unavoidable of human phenomena may prove a boon to public health, perhaps even lowering levels of hypertension. But is there a danger in praising failure too fervently? (Samuel Beckett’s quote on the matter, beloved by many a 21st century thought leader, proves decidedly more ambiguous in context.) Might it present an even greater opportunity for people to “rise to their level of incompetence”? Given the prevalence of the “Dunning-Kruger Effect,” a cognitive bias explained by John Cleese in a previous post, we may not be well-placed to know whether our efforts constitute success or failure, or whether we actually have the skills we think we do.

First described by social psychologists David Dunning (University of Michigan) and Justin Kruger (N.Y.U.) in 1999, the effect “suggests that we’re not very good at evaluating ourselves accurately.” So says the narrator of the TED-Ed lesson above, scripted by Dunning and offering a sober reminder of the human propensity for self-delusion. “We frequently overestimate our own abilities,” resulting in widespread “illusory superiority” that makes “incompetent people think they’re amazing.” The effect greatly intensifies at the lower end of the scale; it is often “those with the least ability who are most likely to overrate their skills to the greatest extent.” Or as Cleese plainly puts it, some people “are so stupid, they have no idea how stupid they are.”

Combine this with the converse effect—the tendency of skilled individuals to underrate themselves—and we have the preconditions for an epidemic of mismatched skill sets and positions. But while imposter syndrome can produce tragic personal results and deprive the world of talent, the Dunning-Kruger effect’s worst consequences affect us all. People “measurably poor at logical reasoning, grammar, financial knowledge, math, emotional intelligence, running medical lab tests, and chess all tend to rate their expertise almost as favorably as actual experts do.” When such people get promoted up the chain, they can unwittingly do a great deal of harm.

While arrogant self-importance plays its role in fostering delusions of expertise, Dunning and Kruger found that most of us are subject to the effect in some area of our lives simply because we lack the skills to understand how bad we are at certain things. We don't know the rules well enough to successfully, creatively break them. Until we have some basic understanding of what constitutes competence in a particular endeavor, we cannot even understand that we’ve failed.

Real experts, on the other hand, tend to assume their skills are ordinary and unremarkable. “The result is that people, whether they’re inept or highly skilled, are often caught in a bubble of inaccurate self-perception." How can we get out? The answers won’t surprise you. Listen to constructive feedback and never stop learning, behavior that can require a good deal of vulnerability and humility.

Related Content:

John Cleese on How “Stupid People Have No Idea How Stupid They Are” (a.k.a. the Dunning-Kruger Effect)

Research Finds That Intellectual Humility Can Make Us Better Thinkers & People; Good Thing There’s a Free Course on Intellectual Humility

The Power of Empathy: A Quick Animated Lesson That Can Make You a Better Person

Free Online Psychology & Neuroscience Courses

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Scientology Works: A Primer Based on a Reading of Paul Thomas Anderson’s Film, The Master

Paul Thomas Anderson's The Master focuses, with almost unbearable intensity, on two characters: Joaquin Phoenix's impulsive ex-sailor Freddie Quell, and Philip Seymour Hoffman's Lancaster Dodd, "the founder and magnetic core of the Cause — a cluster of folk who believe, among other things, that our souls, which predate the foundation of the Earth, are no more than temporary residents of our frail bodily housing," writes The New Yorker's Anthony Lane in his review of the film. "Any relation to persons living, dead, or Scientological is, of course, entirely coincidental."

Before The Master came out, rumors built up that the film would mount a scathing critique of the Church of Scientology; now we know that it accomplishes something, par for the course for Anderson, much more fascinating and artistically idiosyncratic.

Few of its gloriously 65-millimeter-shot scenes seem to have much to say, at least directly, about Scientology or any other system of thought. But perhaps the most memorable, in which Dodd, having discovered Freddie stowed away aboard his chartered yacht, offers him a session of "informal processing," does indeed have much to do with the faith founded by L. Ron Hubbard — at least if you believe the analysis of Evan Puschak, better known as the Nerdwriter, who argues that the scene "bears an unmistakable reference to a vital activity within Scientology called auditing."

Just as Dodd does to Freddie, "the auditor in Scientology asks questions of the 'preclear' with the goal of ridding him of 'engrams,' the term for traumatic memory stored in what's called the 'reactive mind.'" By thus "helping the preclear relive the experience that caused the trauma," the auditor accomplishes a goal that, in a clip Puschak includes in the essay, Hubbard lays out himself: to "show a fellow that he's mocking up his own mind, therefore his own difficulties; that he is not completely adrift in, and swamped by, a body." Scientological or not, such notions do intrigue the desperate, drifting Freddie, and although the story of his and Dodd's entwinement, as told by Anderson, still divides critical opinion, we can say this for sure: it beats Battlefield Earth.

Related Content:

When William S. Burroughs Joined Scientology (and His 1971 Book Denouncing It)

The Career of Paul Thomas Anderson: A 5-Part Video Essay on the Auteur of Boogie Nights, Punch-Drunk Love, The Master, and More

Space Jazz, a Sonic Sci-Fi Opera by L. Ron Hubbard, Featuring Chick Corea (1983)

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Research Finds That Intellectual Humility Can Make Us Better Thinkers & People; Good Thing There’s a Free Course on Intellectual Humility

We may have grown used to hearing about the importance of critical thinking, and stowed away knowledge of logical fallacies and cognitive biases in our argumentative toolkit. But were we to return to the philosophical sources of informal logic, we would find that we had grasped only some of the principles of reason. The others involve questions of what we might call virtue or character—what for the Greeks fell into the categories of ethos and pathos. Take the principle of charity, by which we give our opponents a fair hearing and respond to the best version of their arguments as we understand them. Or the principle, exemplified by Plato’s Socrates, of intellectual humility. As one punk band put it in their Socratic tribute: “All I know is that I don’t know. All I know is that I don’t know nothing.”

Intellectual humility is not, contrary to most popular appearances, reflexively according equal weight to “both sides” of every argument or assuming that everyone’s opinion is equally valid. These are forms of mental laziness and ethical abdication. It is, rather, believing in our own fallibility and opening ourselves up to hearing arguments without immediately forming a judgment about them or the people who make them. We do not abandon our reason and values, we strengthen them, argues Mark Leary, by “not being afraid of being wrong.” Leary, professor of psychology and neuroscience at Duke University, is the lead author of a new study on intellectual humility that found “essentially no difference between liberals and conservatives or between religious and nonreligious people” when it comes to intellectual humility.

The study challenges many ideas that can prevent dialogue. “There are stereotypes about conservatives and religiously conservative people being less intellectually humble about their beliefs," says Leary. But he and his colleagues “didn’t find a shred of evidence to support that.” This doesn’t necessarily mean that such people have high degrees of intellectual humility, only that all of us, perhaps equally, possess fairly low levels of the trait. I’ll be the first to admit that it is not an easy one to develop, especially when we’re on the defensive for some seemingly good reasons—and when we live in a culture that encourages us to make decisions and take actions on the strength of an image, some minimal text, and a few buttons that lead us right to our bank accounts. (To quote Operation Ivy again, “We get told to decide. Just like as if I’m not gonna change my mind.”)

But in the Duke study, reports Alison Jones at Duke Today, “those who displayed intellectual humility did a better job of evaluating the quality of evidence.” They took their time to make careful considerations. And they were generally more charitable and “less likely to judge a writer’s character based on his or her views.” By contrast, “intellectually arrogant” people gave writers with whom they disagreed “low scores in morality, honesty, competence, and warmth.” As a former teacher of rhetoric, I wonder whether the researchers accounted for the quality and persuasiveness of the writing itself. Nonetheless, this observation underscores the problem of conflating an author’s work with his or her character. Moral judgment can inhibit intellectual curiosity and open-mindedness. Intellectually arrogant people often resort to insults and personal attacks over thoughtful analysis.

The enormous number of assumptions we bring to almost every conversation with people who differ from us can blind us to our own faults and to other people’s strengths. But intellectual humility is not genetically determined—it is a skill that can be learned, Leary believes. Big Think recommends a free MOOC from the University of Edinburgh on intellectual humility (see an introduction to the concept at the top and a series of lectures here). “Faced with difficult questions,” explains course lecturer Dr. Ian Church, “people often tend to dismiss and marginalize dissent…. The world needs more people who are sensitive to reasons both for and against their beliefs, and are willing to consider the possibility that their political, religious and moral beliefs might be mistaken.” The course offers three different levels of engagement, from casual to quite involved, and three separate class sections at Coursera: Theory, Practice, and Science.

It’s likely that many of us need some serious preparation before we’re willing to listen to those who hold certain views. And perhaps certain views don't actually deserve a hearing. But in most cases, if we can let our guard down, set aside feelings of hostility, and become willing to learn something even from those with whom we disagree, we might be able to do what so many psychologists continue to recommend. As Cindy Lamothe writes at New York Magazine’s Science of Us blog, “we have to be willing to expose ourselves to opposing perspectives in the first place—which means that, as daunting as it may seem, listening to friends and family with radically different views can be beneficial to our long-term intellectual progress.” The holidays are soon upon us. Let the healing—or at least the charitable tolerance if you can manage it—begin.

via Big Think

Related Content:

Stephen Fry Identifies the Cognitive Biases That Make Trump Tick       

32 Animated Videos by Wireless Philosophy Teach You the Essentials of Critical Thinking

Why We Need to Teach Kids Philosophy & Safeguard Society from Authoritarian Control

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Carl Jung Psychoanalyzes Hitler: “He’s the Unconscious of 78 Million Germans.” “Without the German People He’d Be Nothing” (1938)

Were you to google “Carl Jung and Nazism”—and I’m not suggesting that you do—you would find yourself hip-deep in the charges that Jung was an anti-Semite and a Nazi sympathizer. Many sites condemn or exonerate him; many others celebrate him as a blood and soil Aryan hero. It can be nauseatingly difficult at times to tell these accounts apart. What to make of this controversy? What is the evidence brought against the famed Swiss psychiatrist and onetime close friend, student, and colleague of Sigmund Freud?

Truth be told, it does not look good for Jung. Unlike Nietzsche, whose work was deliberately bastardized by Nazis, beginning with his own sister, Jung need not be taken out of context to be read as anti-Semitic. There is no irony at work in his 1934 paper “The State of Psychotherapy Today,” in which he marvels at National Socialism as a “formidable phenomenon,” and writes, “the ‘Aryan’ unconscious has a higher potential than the Jewish.” This is only one of the least objectionable of such statements, as historian Andrew Samuels demonstrates.

One Jungian defender admits in an essay collection called Lingering Shadows that Jung had been “unconsciously infected by Nazi ideas.” In response, psychologist John Conger asks, “Why not then say that he was unconsciously infected by anti-Semitic ideas as well?”—well before the Nazis came to power. He had expressed such thoughts as far back as 1918. Like the philosopher Martin Heidegger, Jung was accused of trading on his professional associations during the 30s to maintain his status, and turning on his Jewish colleagues while they were purged.

Yet his biographer Deirdre Bair claims Jung’s name was used to endorse persecution without his consent. Jung was incensed, “not least,” Mark Vernon writes at The Guardian, “because he was actually fighting to keep German psychotherapy open to Jewish individuals.” Bair also reveals that Jung was “involved in two plots to oust Hitler, essentially by having a leading physician declare the Führer mad. Both came to nothing.” And unlike Heidegger, Jung strongly denounced anti-Semitic views during the war. He “protected Jewish analysts,” writes Conger, “and helped refugees.” He also worked for the OSS, precursor to the CIA, during the war.

His recruiter Allen Dulles wrote of Jung’s “deep antipathy to what Nazism and Fascism stood for.” Dulles also cryptically remarked, “Nobody will probably ever know how much Prof. Jung contributed to the allied cause during the war.” These contradictions in Jung’s words, character, and actions are puzzling, to say the least. I would not presume to draw any hard and fast conclusions from them. They do, however, serve as the necessary context for Jung’s observations of Adolf Hitler. Nazis of today who praise Jung most often do so for his supposed characterization of Hitler as “Wotan,” or Odin, a comparison that thrills neo-pagans who, like the Germans did, use ancient European belief systems as clothes hangers for modern racist nationalism.

In his 1936 essay, “Wotan,” Jung describes the old god as a force all its own, a “personification of psychic forces” that moved through the German people “towards the end of the Weimar Republic”—through the “thousands of unemployed,” who by 1933 “marched in their hundreds of thousands.” Wotan, Jung writes, “is the god of storm and frenzy, the unleasher of passions and the lust of battle; moreover he is a superlative magician and artist in illusion who is versed in all secrets of an occult nature.” In personifying the “German psyche” as a furious god, Jung goes so far as to write, “We who stand outside judge the Germans far too much as if they were responsible agents, but perhaps it would be nearer the truth to regard them also as victims.”

“One hopes,” writes Per Brask, “evidently against hope, that Jung did not intend” his statements “as an argument of redemption for the Germans.” Whatever his intentions, his mystical racialization of the unconscious in “Wotan” accorded perfectly well with the theories of Alfred Rosenberg, “Hitler’s chief ideologist.” Like everything about Jung, the situation is complicated. In a 1938 interview, published by Omnibook Magazine in 1942, Jung repeated many of these disturbing ideas, comparing the German worship of Hitler to the Jewish desire for a Messiah, a “characteristic of people with an inferiority complex.” He describes Hitler’s power as a form of “magic.” But that power only exists, he says, because “Hitler listens and obeys….”

His Voice is nothing other than his own unconscious, into which the German people have projected their own selves; that is, the unconscious of seventy-eight million Germans. That is what makes him powerful. Without the German people he would be nothing.

Jung’s observations are bombastic, but they are not flattering. The people may be possessed, but it is their will, he says, that the Nazi leader enacts, not his own. "The true leader," says Jung, "is always led." He goes on to paint an even darker picture, having closely observed Hitler and Mussolini together in Berlin:

In comparison with Mussolini, Hitler made upon me the impression of a sort of scaffolding of wood covered with cloth, an automaton with a mask, like a robot or a mask of a robot. During the whole performance he never laughed; it was as though he were in a bad humor, sulking. He showed no human sign.

His expression was that of an inhumanly single-minded purposiveness, with no sense of humor. He seemed as if he might be a double of a real person, and that Hitler the man might perhaps be hiding inside like an appendix, and deliberately so hiding in order not to disturb the mechanism.

With Hitler you do not feel that you are with a man. You are with a medicine man, a form of spiritual vessel, a demi-deity, or even better, a myth. With Hitler you are scared. You know you would never be able to talk to that man; because there is nobody there. He is not a man, but a collective. He is not an individual, but a whole nation. I take it to be literally true that he has no personal friend. How can you talk intimately with a nation?

Read the full interview here. Jung goes on to further discuss the German resurgence of the cult of Wotan, the “parallel between the Biblical triad… and the Third Reich,” and other peculiarly Jungian formulations. Of Jung’s analysis, interviewer H.R. Knickerbocker concludes, “this psychiatric explanation of the Nazi names and symbols may sound to a layman fantastic, but can anything be as fantastic as the bare facts about the Nazi Party and its Fuehrer? Be sure there is much more to be explained in them than can be explained by merely calling them gangsters.”

Related Content:

Carl Jung Explains Why His Famous Friendship with Sigmund Freud Fell Apart in Rare 1959 Audio

Carl Jung Explains His Groundbreaking Theories About Psychology in a Rare Interview (1957)

Carl Jung: Tarot Cards Provide Doorways to the Unconscious, and Maybe a Way to Predict the Future

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

The Power of Introverts: Author Susan Cain Explains Why We Need to Appreciate the Talents & Abilities of the Quiet Ones

Ours is a loud culture of nonstop personal sharing, endless chatter, and 24-hour news, opinion, and entertainment. Even those people who prefer reading alone to the overstimulating carnival of social media feel pressured to participate. How else can you keep up with your family—whose Facebook posts you’d rather die than have to read? How else to build a profile for employers—whom you desperately hope won’t check your Twitter feed?

For the introvert, maintaining an always-on façade can be profoundly enervating—and the problem goes far beyond the personal, argues author Susan Cain, reaching into every area of our lives.

“If you take a group of people and put them into a meeting,” says Cain in the short RSA video above, “the opinions of the loudest person, or the most charismatic person, or the most assertive person—those are the opinions that the group tends to follow.” This despite the fact that research shows “zero correlation” between being the loudest voice in the room and having the best ideas. Don’t we know this all too well.

Cain is the author of Quiet: The Power of Introverts in a World That Can’t Stop Talking, a book about leadership for introverts, the group least likely to welcome the social demands leadership entails. And yet, she argues, we nonetheless need introverts as leaders. “We’re living in a society now that is so overly extroverted,” she says. Cain identifies the phenomenon as a symptom of corporate capitalism overcoming predominantly agricultural ways of life. Aside from the significant question of whether we can change the culture without changing the economy, Cain makes a timely and compelling argument for a society that values different personality types equally.

But can there be a “world where it’s yin and yang” between introverts and extroverts? That depends, perhaps, on how much credence we lend these well-worn Jungian categories, or whether we think of them as existing in binary opposition rather than on a spectrum, a circle, a hexagram, or whatever. Cain is not a psychologist but a former corporate lawyer who at least seems to believe the balancing act between extroverted and introverted can be achieved in the corporate world. She has given talks on “Networking for Introverts,” addressed the engineers at Google, and taken to the TED stage, the thought leader arena that accommodates all kinds of personalities, for better or worse.

Cain's TED talk above may be one of the better ones. Opening with a moving and funny personal narrative, she walks us through the barrage of messages introverts receive condemning their desire for quietude as somehow perverse and selfish. Naturally solitary people are taught to think of their introversion as "a second-class personality trait," Cain writes in her book, "somewhere between a disappointment and a pathology." Introverts must swim against the tide to be themselves. “Our most important institutions," she says above, "our schools and our workplaces, they are designed mostly for extroverts, and for extroverts' need for stimulation.”

The bias is deep, reaching into the classrooms of young children, who are now forced to do most of their work by committee. But when introverts give in to the social pressure that forces them into awkward extroverted roles, the loss affects everyone. “At the risk of sounding grandiose,” Cain says, “when it comes to creativity and to leadership, we need introverts doing what they do best.” Paradoxically, that can look like introverts taking the helm, but out of a genuine sense of duty rather than a desire for the spotlight.

Introverted leaders are more likely to share power and give others space to express ideas, Cain argues. Gandhi, Eleanor Roosevelt, and Rosa Parks exemplify such introverted leadership, and a quieter, more balanced and thoughtful culture would produce more leaders like them. Maybe this is a proposition anyone can endorse, whether they prefer Friday nights with hot tea and a novel or in the crush and bustle of the crowds.

Related Content:

Carl Jung Explains His Groundbreaking Theories About Psychology in a Rare Interview (1957)

The Neuroscience & Psychology of Procrastination, and How to Overcome It

Daily Meditation Boosts & Revitalizes the Brain and Reduces Stress, Harvard Study Finds

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Buddhism & Neuroscience Can Help You Change How Your Mind Works: A New Course by Bestselling Author Robert Wright

Buddhist thought and culture have long found a comfortable home among hippies, beatniks, New Age believers, artists, occultists, and mystics. Recently, many of Buddhism’s tenets and practices have become widely popular among very different demographics of scientists, skeptics, and atheist communities. It may seem odd that an increasingly secularizing West would widely embrace an ancient Eastern religion. But even the Dalai Lama has pointed out that Buddhism’s essential doctrines align uncannily with the findings of modern science.

The Pali Canon, the earliest collection of Buddhist texts, contains much that agrees with the scientific method. In the Kalama Sutta, for example, we find instructions for how to shape views and beliefs that accord with the methods espoused by the Royal Society many hundreds of years later.

Robert Wright—bestselling author and visiting professor of religion and psychology at Princeton and Penn—goes even further, showing in his book Why Buddhism Is True how Buddhist insights into impermanence, delusion, ignorance, and unhappiness align with contemporary findings of neuroscience and evolutionary biology.

Wright is now making his argument for the compatibility of Buddhism and science in a new MOOC from Coursera called “Buddhism and Modern Psychology.” You can watch the trailer for the course, which starts this week, just above.

The core of Buddhism is generally contained in the so-called “Four Noble Truths,” and Wright explains in his lecture above how these teachings sum up the problem we all face, beginning with the first truth of dukkha. Often translated as “suffering,” the word might better be thought of as meaning “unsatisfactoriness,” as Wright illustrates with a reference to the Rolling Stones. Jagger's “can't get no satisfaction,” he says, captures “a lot of the spirit of what is called the First Noble Truth,” which, along with the Second, constitutes “the Buddha’s diagnosis of the human predicament.” Not only can we not get what we want, but even when we do, it hardly ever makes us happy for very long.

Rather than impute our misery to the displeasure of the gods, the Buddha, Wright tells Lion’s Roar, “says the reason we suffer, the reason we’re not enduringly satisfied, is that we don’t see the world clearly. That’s also the reason we sometimes fall short of moral goodness and treat other human beings badly.” Desperate to hold on to what we think will satisfy us, we become consumed by craving, as the Second Noble Truth explains, constantly clinging to pleasure and fleeing from pain. Just above, Wright explains how these two claims compare with the theories of evolutionary psychology. His course also explores how meditation releases us from craving and breaks the vicious cycle of desire and aversion.

Overall, the issues Wright addresses are laid out in his course description:

Are neuroscientists starting to understand how meditation “works”? Would such an understanding validate meditation—or might physical explanations of meditation undermine the spiritual significance attributed to it? And how are some of the basic Buddhist claims about the human mind holding up? We’ll pay special attention to some highly counterintuitive doctrines: that the self doesn’t exist, and that much of perceived reality is in some sense illusory. Do these claims, radical as they sound, make a certain kind of sense in light of modern psychology? And what are the implications of all this for how we should live our lives? Can meditation make us not just happier, but better people?

As to the last question, Wright is not alone among scientifically-minded people in answering with a resounding yes. Rather than relying on the beneficence of a supernatural savior, Buddhism offers a course of treatment—the “Noble Eightfold Path”—to combat our disposition toward illusory thinking. We are shaped by evolution, Wright says, to deceive ourselves. The Buddhist practices of meditation and mindfulness, and the ethics of compassion and nonharming, are “in some sense, a rebellion against natural selection.”

You can see more of Wright’s lectures on YouTube. His free course, Buddhism and Modern Psychology, gets started this week, and you can sign up now.

Related Content:

How Mindfulness Makes Us Happier & Better Able to Meet Life’s Challenges: Two Animated Primers Explain

Daily Meditation Boosts & Revitalizes the Brain and Reduces Stress, Harvard Study Finds

Philosopher Sam Harris Leads You Through a 26-Minute Guided Meditation

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness
