What Is Freedom? Watch Four Philosophy Animations on Freedom & Free Will Narrated by Harry Shearer

Growing up in America, I heard nearly every behavior, no matter how unpleasant, justified with the same phrase: "It's a free country." In her recent book Notes on a Foreign Country, the Istanbul-based American reporter Suzy Hansen remembers singing "God Bless the USA" on the school bus during the first Iraq war: "And I’m proud to be an American / Where at least I know I’m free." That "at least," she adds, is funny: "We were free – at the very least we were that. Everyone else was a chump, because they didn’t even have that obvious thing. Whatever it meant, it was the thing that we had, and no one else did. It was our God-given gift, our superpower."

But how many of us can explain what freedom is? These videos from BBC Radio 4 and the Open University's animated History of Ideas series approach that question from four different angles. "Freedom is good, but security is better," says narrator Harry Shearer, summing up the view of seventeenth-century philosopher Thomas Hobbes, who imagined life without government, laws, or society as "solitary, poor, nasty, brutish, and short." The solution, he proposed, came in the form of a social contract "to put a strong leader, a sovereign or perhaps a government, over them to keep the peace" — an escape from "the war of all against all."




But that escape comes hand in hand with the unpalatable prospect of living under "a frighteningly powerful state." The nineteenth-century philosopher John Stuart Mill, who wrote a great deal about the state's proper limitations, based his concept of freedom on something called the "harm principle," which holds that "the state, my neighbors, and everyone else should let me get on with my life, as long as I don't harm anyone in the process." As "the seedbed of genius" and "the basis of enduring happiness for ordinary people," this individual freedom needs protection, especially when it comes to speech: "Merely causing offense, he thinks, is no grounds for intervention, because, in his view, that is not a harm."

That proposition remains debated more heatedly now, in the 21st century, than Mill probably could have imagined. But then as now, and as in any time of human history, we live in more or less the same world, "a world festering with moral evil, a world of wars, torture, rape, murder, and other acts of meaningless violence," not to mention "natural evil" like disease, famine, floods, and earthquakes. This gives rise to perhaps the oldest problem in the philosophical book, the problem of evil: "How could a good god allow anyone to do such horrific things?" Some have taken the fact that the wars, murders, floods, and earthquakes continue as evidence that no such god exists.

But had that god created "human beings that always did the right thing, never harmed anyone else, never went astray," we'd all have ended up "automata, preprogrammed robots." Better, in this view, "to have free will with the genuine risk that some people will end up evil than to live in a world without choice." Even so, the mere mention of free will, a concept no more easily defined than that of freedom itself, opens up a whole other can of worms, especially in light of research like neuroscientist Benjamin Libet's.

Libet, who "wired up subjects to an EEG machine, measuring brain activity via electrodes on our scalps," found that brain activity initiating a movement actually happened before the subjects thought they'd decided to make that movement. Does that disprove free will? Does evil disprove the existence of a good god? Does offense cause the same kind of harm as physical violence? Should we give up more security for freedom, or more freedom for security? These questions remain unanswered, and quite possibly unanswerable, but that doesn't make considering the very nature of freedom any less necessary as human societies — those in "free countries" and otherwise — find their way forward.

Related Content:

How Can I Know Right From Wrong? Watch Philosophy Animations on Ethics Narrated by Harry Shearer

47 Animated Videos Explain the History of Ideas: From Aristotle to Sartre

An Animated Aldous Huxley Identifies the Dystopian Threats to Our Freedom (1958)

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on the book The Stateless City: a Walk through 21st-Century Los Angeles, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.

A New Animation Explains How Caffeine Keeps Us Awake

Let’s preface this by recalling that Honoré de Balzac drank up to 50 cups of coffee a day and lived to the ripe old age of … 51.

Of course, he produced dozens of novels, plays, and short stories before taking his leave. Perhaps his caffeine habit had a little something to do with that?

Pharmacist Hanan Qasim’s TED-Ed primer on how caffeine keeps us awake top-loads the positive effects of the world’s most commonly used psychoactive substance. Global consumption is equivalent to the weight of 14 Eiffel Towers, measured in drops of coffee, soda, chocolate, energy drinks, decaf…and that’s just humans. Insects get theirs from nectar, though with them, a little goes a very long, potentially deadly way.




Caffeine’s structural resemblance to the neurotransmitter adenosine is what gives it that special oomph. Adenosine causes sleepiness by plugging into neural receptors in the brain, causing them to fire more sluggishly. Caffeine takes advantage of their similar molecular structures to slip into these receptors, effectively stealing adenosine’s parking space.

With a bioavailability of 99%, this interloper arrives ready to party.

On the plus side, caffeine is both a mental and physical pick-me-up.

In appropriate doses, it can keep your mind from wandering during a late-night study session.

It lifts the body’s metabolic rate and boosts performance during exercise—an effect that’s easily counteracted by getting the bulk of your caffeine from chocolate or sweetened soda, or by dumping another Eiffel Tower’s worth of sugar into your coffee.

There’s even some evidence that moderate consumption may reduce the likelihood of such diseases as Parkinson’s, Alzheimer’s, and cancer.

What to do when that caffeine effect starts wearing off?

Gulp down more!

As with many drugs, prolonged usage diminishes the sought-after effects, causing its devotees (or addicts, if you like) to seek out higher doses, negative side effects be damned. Nervous jitters, incontinence, birth defects, raised heart rate and blood pressure… it’s a compelling case for sticking with water.

Animator Draško Ivezić (a 3-latte-a-day man, according to his studio’s website) does a hilarious job of personifying both caffeine and the humans in its thrall, particularly an egg-shaped new father.

Go to TED-Ed to learn more, or test your grasp of caffeine with a quiz.

Related Content:

Wake Up & Smell the Coffee: The New All-in-One Coffee-Maker/Alarm Clock is Finally Here!

Physics & Caffeine: Stop Motion Film Uses a Cup of Coffee to Explain Key Concepts in Physics

This is Coffee!: A 1961 Tribute to Our Favorite Stimulant

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Follow her @AyunHalliday.

Reality Is Nothing But a Hallucination: A Mind-Bending Crash Course on the Neuroscience of Consciousness

If you've been accused of living in "a world of your own," get ready for some validation. As cognitive scientist Anil Seth argues in "Your Brain Hallucinates Your Conscious Reality," the TED Talk above, everyone lives in a world of their own — at least if by "everyone" you mean "every brain," by "world" you mean "entire reality," and by "of their own" you mean "that it has created for itself." With all the signals it receives from our senses and all the prior experiences it has organized into expectations, each of our brains constructs a coherent image of reality — a "multisensory, panoramic, 3D, fully immersive inner movie" — for us to perceive.

"Perception has to be a process of 'informed guesswork,'" says the TED Blog's accompanying notes, "in which sensory signals are combined with prior expectations about the way the world is, to form the brain’s best guess of the causes of these signals."




Seth uses optical illusions and classic experiments to underscore the point that “we don’t just passively perceive the world; we actively generate it. The world we experience comes as much from the inside-out as the outside-in," in a process hardly different from that which we casually call hallucination. Indeed, in a way, we're always hallucinating. “It’s just that when we agree about our hallucinations, that’s what we call ‘reality.’” And as for what, exactly, constitutes the "we," our brains do a good deal of work to construct that too.

Seventeen minutes only allows Seth to go so far down the rabbit hole of the neuroscience of consciousness, but he'll galvanize the curiosity of anyone with even a mild interest in this mind-bending subject. He leaves us with a few implications of his and others' research to consider: first, "just as we can misperceive the world, we can misperceive ourselves"; second, "what it means to be me cannot be reduced to — or uploaded to — a software program running on an advanced robot, however sophisticated"; third, "our individual inner universe is just one way of being conscious, and even human consciousness generally is a tiny region in a vast space of possible consciousnesses." As we've learned, in a sense, from every TED Talk, no matter how busy a brain may be constructing both reality and the self, it can always come up with a few big takeaways for the audience.

Related Content:

Free Online Psychology & Neuroscience Courses

John Searle Makes A Forceful Case for Studying Consciousness, Where Everything Else Begins

Robert Sapolsky Explains the Biological Basis of Religiosity, and What It Shares in Common with OCD, Schizophrenia & Epilepsy

Stanford’s Robert Sapolsky Demystifies Depression, Which, Like Diabetes, Is Rooted in Biology

Alan Watts On Why Our Minds And Technology Can’t Grasp Reality

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on the book The Stateless City: a Walk through 21st-Century Los Angeles, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.

How Information Overload Robs Us of Our Creativity: What the Scientific Research Shows

Flickr Commons photo by J Stimp

Everyone used to read Samuel Johnson. Now it seems hardly anyone does. That’s a shame. Johnson understood the human mind, its sadly amusing frailties and its double-blind alleys. He understood the nature of that mysterious act we casually refer to as “creativity." It is not the kind of thing one lucks into or masters after a seminar or lecture series. It requires discipline and a mind free of distraction. “My dear friend,” said Johnson in 1783, according to his biographer and secretary Boswell, “clear your mind of cant.”

There’s no missing apostrophe in his advice. Inspiring as it may sound, Johnson did not mean to say “you can do it!” He meant “cant,” an old word for cheap deception, bias, hypocrisy, insincere expression. “It is a mode of talking in Society,” he conceded, “but don’t think foolishly.” Johnson’s injunction resonated through a couple of centuries, became garbled into a banal affirmation, and was lost in a graveyard of image macros. Let us endeavor to retrieve it, and ruminate on its wisdom.




We may even do so with our favorite modern brief in hand, the scientific study. There are many we could turn to. For example, notes Derek Beres, in a 2014 book neuroscientist Daniel Levitin brought his research to bear in arguing that “information overload keeps us mired in noise.... This saps us of not only willpower (of which we have a limited store) but creativity as well.” "We sure think we're accomplishing a lot," Levitin told Susan Page on The Diane Rehm Show in 2015, "but that's an illusion... as a neuroscientist, I can tell you one thing the brain is very good at is self-delusion."

Johnson’s age had its own version of information overload, as did that of another curmudgeonly voice from the past, T.S. Eliot, who wondered, “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?” The question leaves Eliot’s readers asking whether what we take for knowledge or information really is any such thing. Maybe they’re just as often forms of needless busyness, distraction, and overthinking. Stanford researcher Emma Seppälä suggests as much in her work on “the science of happiness.” At Quartz, she writes,

We need to find ways to give our brains a break.... At work, we’re intensely analyzing problems, organizing data, writing—all activities that require focus. During downtime, we immerse ourselves in our phones while standing in line at the store or lose ourselves in Netflix after hours.

Seppälä exhorts us to relax and let go of the constant need for stimulation, to take long walks without the phone, get out of our comfort zones, make time for fun and games, and generally build in time for leisure. How does this work? Let's look at some additional research. Bar-Ilan University’s Moshe Bar and Shira Baror undertook a study to measure the effects of distraction, or what they call “mental load,” the “stray thoughts” and “obsessive ruminations” that clutter the mind with information and loose ends. Our “capacity for original and creative thinking,” Bar writes at The New York Times, “is markedly stymied” by a busy mind. "The cluttered mind," writes Jessica Stillman, "is a creativity killer."

In a paper published in Psychological Science, Bar and Baror describe how “conditions of high load” foster unoriginal thinking. Participants in their experiment were asked to remember strings of arbitrary numbers, then to play word association games. “Participants with seven digits to recall resorted to the most statistically common responses (e.g., white/black)," writes Bar, "whereas participants with two digits gave less typical, more varied pairings (e.g., white/cloud).” Our brains have limited resources. When constrained and overwhelmed with thoughts, they pursue well-trod paths of least resistance, trying to efficiently bring order to chaos.

“Imagination," on the other hand, wrote Dr. Johnson elsewhere, “a licentious and vagrant faculty, unsusceptible of limitations and impatient of restraint, has always endeavored to baffle the logician, to perplex the confines of distinction, and burst the enclosures of regularity.” Bar describes the contrast between the imaginative mind and the information processing mind as “a tension in our brains between exploration and exploitation.” Gorging on information makes our brains “’exploit’ what we already know," or think we know, "leaning on our expectation, trusting the comfort of a predictable environment.” When our minds are “unloaded,” on the other hand, such as can occur during a hike or a long, relaxing shower, we can shed fixed patterns of thinking, and explore creative insights that might otherwise get buried or discarded.

As Drake Baer succinctly puts it at New York Magazine’s Science of Us, “When you have nothing to think about, you can do your best thinking.” Getting to that state in a climate of perpetual, unsleeping distraction, opinion, and alarm requires another kind of discipline: the discipline to unplug, wander off, and clear your mind.

For another angle on this, you might want to check out Cal Newport's 2016 book, Deep Work: Rules for Focused Success in a Distracted World.

Related Content:

The Neuroscience & Psychology of Procrastination, and How to Overcome It

Why You Do Your Best Thinking In The Shower: Creativity & the “Incubation Period”

How Walking Fosters Creativity: Stanford Researchers Confirm What Philosophers and Writers Have Always Known

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Evelyn Glennie (a Musician Who Happens to Be Deaf) Shows How We Can Listen to Music with Our Entire Bodies

Composer and percussionist Dame Evelyn Glennie, above, feels music profoundly. For her, there is no question that listening should be a whole body experience:

Hearing is basically a specialized form of touch. Sound is simply vibrating air which the ear picks up and converts to electrical signals, which are then interpreted by the brain. The sense of hearing is not the only sense that can do this, touch can do this too. If you are standing by the road and a large truck goes by, do you hear or feel the vibration? The answer is both. With very low frequency vibration the ear starts becoming inefficient and the rest of the body’s sense of touch starts to take over. For some reason we tend to make a distinction between hearing a sound and feeling a vibration, in reality they are the same thing. It is interesting to note that in the Italian language this distinction does not exist. The verb ‘sentire’ means to hear and the same verb in the reflexive form ‘sentirsi’ means to feel.

It’s a philosophy born of necessity—her hearing began to deteriorate when she was 8, and by the age of 12, she was profoundly deaf. Music lessons at that time included touching the wall of the practice room to feel the vibrations as her teacher played.

While she acknowledges that her disability is a publicity hook, it’s not her preferred lede, a conundrum she explores in her "Hearing Essay." Rather than be celebrated as a deaf musician, she’d like to be known as the musician who is teaching the world to listen.

In her TED Talk, How To Truly Listen, she differentiates between the ability to translate notations on a musical score and the subtler, more soulful skill of interpretation. This involves connecting to the instrument with every part of her physical being. Others may listen with ears alone. Dame Evelyn encourages everyone to listen with fingers, arms, stomach, heart, cheekbones… a phenomenon many teenagers experience organically, no matter what their earbuds are plugging.

And while the vibrations may be subtler, her philosophy could cause us to listen more attentively to both our loved ones and our adversaries, by staying attuned to visual and emotional pitches, as well as slight variations in volume and tone.

Related Content:

How Did Beethoven Compose His 9th Symphony After He Went Completely Deaf?

Hear a 20 Hour Playlist Featuring Recordings by Electronic Music Pioneer Pauline Oliveros (RIP)

How Ingenious Sign Language Interpreters Are Bringing Music to Life for the Deaf: Visualizing the Sound of Rhythm, Harmony & Melody

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine. She is appearing onstage in New York City this June as one of the clowns in Paul David Young’s Faust 3. Follow her @AyunHalliday.

Artists May Have Different Brains (More Grey Matter) Than the Rest of Us, According to a Recent Scientific Study

Photo courtesy of the Laboratory of Neuro Imaging at UCLA.

Sometimes—as in the case of neuroscience—scientists and researchers seem to be saying several contradictory things at once. Yes, opposing claims can both be true, given different context and levels of description. But which is it, neuroscientists? Do we have “neuroplasticity”—the ability to change our brains, and therefore our behavior? Or are we “hard-wired” to be a certain way by innate structures?

The debate long predates the field of neuroscience. It figured prominently in the work, for example, of John Locke and other early modern theorists of cognition—which is why Locke is best known as the theorist of tabula rasa. In “Some Thoughts Concerning Education,” Locke mostly denies that we are able to change much at all in adulthood.




Personality, he reasoned, is determined not by biology, but in the “cradle” by “little, almost insensible impressions on our tender infancies.” Such imprints “have very important and lasting consequences.” Sorry, parents. Not only did your kid get wait-listed for that elite preschool, but their future will also be determined by millions of sights and sounds that happened around them before they could walk.

It’s an extreme, and unscientific, contention, fascinating as it may be from a cultural standpoint. Now we have psychedelic-looking brain scans popping up in our news feeds all the time, promising to reveal the true origins of consciousness and personality. But the conclusions drawn from such research are tentative and often highly contested.

So what does science say about the eternally mysterious act of artistic creation? The abilities of artists have long seemed to us godlike, drawn from supernatural sources, or channeled from other dimensions. Many neuroscientists, you may not be surprised to hear, believe that such abilities reside in the brain. Moreover, some think that artists’ brains are superior to those of mediocre ability.

Or at least that artists’ brains have more gray and white matter than “right-brained” thinkers in the areas of “visual perception, spatial navigation and fine motor skills.” So writes Katherine Brooks in a Huffington Post summary of “Drawing on the right side of the brain: A voxel-based morphometry analysis of observational drawing.” The 2014 study, published in NeuroImage, involved a very small sampling of graduate students, 21 of whom were artists, 23 of whom were not. All 44 students were asked to complete drawing tasks, which were then scored and compared to images of their brains taken by a method called “voxel-based morphometry.”

“The people who are better at drawing really seem to have more developed structures in regions of the brain that control for fine motor performance and what we call procedural memory,” the study’s lead author, Rebecca Chamberlain of Belgium’s KU Leuven University, told the BBC. (Hear her segment on BBC Radio 4’s Inside Science here.) Does this mean, as Artnet News claims in their quick take, that “artists’ brains are more fully developed?”

It’s a juicy headline, but the findings of this limited study, while “intriguing,” are “far from conclusive.” Nonetheless, it marks an important first step. “No studies” thus far, Chamberlain says, “have assessed the structural differences associated with representational skills in visual arts.” Would a dozen such studies resolve questions about causality: nature or nurture? As usual, the truth probably lies somewhere in-between.

At Smithsonian, Randy Rieland quotes several critics of the neuroscience of art, which has previously focused on what happens in the brain when we look at a Van Gogh or read Jane Austen. The problem with such studies, writes Philip Ball at Nature, is that they can lead to “creating criteria of right or wrong, either in the art itself or in individual reactions to it.” But such criteria may already be predetermined by culturally-conditioned responses to art.

The science is fascinating and may lead to numerous discoveries. It does not, as the Creators Project writes hyperbolically, suggest that "artists actually are different creatures from everyone else on the planet." As University of California philosophy professor Alva Noë states succinctly, one problem with making sweeping generalizations about brains that view or create art is that “there can be nothing like a settled, once-and-for-all account of what art is.”

Emerging fields of “neuroaesthetics” and “neurohumanities” may muddy the waters between quantitative and qualitative distinctions, and may not really answer questions about where art comes from and what it does to us. But then again, given enough time, they just might.

via The Creators Project

Related Content:

This Is Your Brain on Jane Austen: The Neuroscience of Reading Great Literature

The Neuroscience of Drumming: Researchers Discover the Secrets of Drumming & The Human Brain

The Neuroscience & Psychology of Procrastination, and How to Overcome It

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Sigmund Freud, Father of Psychoanalysis, Introduced in a Monty Python-Style Animation

Pity the hedgehog. The freezing temperatures of winter compel it to cozy up to others of its kind, but the prickly spines covering its body prevent it from sustaining the easy, ongoing intimacy it so craves.

It's a hell of a metaphor for human relationships, compliments of 19th-century philosopher Arthur Schopenhauer. It certainly spoke to Sigmund Freud, who devoted his life to trying to figure out why so many of us resort to petty behaviors, spurn those we love, and sabotage ourselves at every turn.




Popular representations would have us believe that the father of psychoanalysis was a detached sort of know-it-all, emotionally superior to the basket cases sniveling on his couch. Not so. As he noted in 1897:

I have been through some kind of neurotic experience, curious states… twilight thoughts, veiled doubts… The chief patient I am preoccupied with is myself… my little hysteria… the analysis is more difficult than any other. Something from the deepest depths of my own neurosis sets itself against any advance in understanding neuroses…

We feel ya, doc, and so does The School of Life, the London-based organization for developing emotional intelligence co-founded by philosophical essayist Alain de Botton:

… consulting a psychotherapist should be as accessible and as normal as developing your career, getting help for a physical problem, or going to the gym to get healthy. Just as we take care of our bodies and physical health, a vital element of self-care is devoting focused time and energy to exploring and understanding our thoughts and feelings.

The school puts your money where its mouth is by retaining a roster of licensed psychotherapists who can be booked for in-person or Skype sessions.

It's not for everyone. There are those who are determined to pursue the path to contentment and self-knowledge solo, impervious to Freud’s belief that “No one who disdains the key will ever be able to unlock the door.”

The therapy-averse can still learn something from the video above. Narrator de Botton charms his way through an easily digested overview of Freud’s personal and professional life, and the resulting tenets of psychoanalysis.

And filmmaker Mad Adam ensures that this brief trip through the infant phases (oral, anal, phallic) will be a jolly one, replete with droll, mostly vintage images.

Release more monsters of the id with the School of Life’s psychotherapy playlist.

Related Content:

20,000 Letters, Manuscripts & Artifacts From Sigmund Freud Get Digitized and Made Available Online

Download Sigmund Freud’s Great Works as Free eBooks & Free Audio Books: A Digital Celebration on His 160th Birthday

What is Love? BBC Philosophy Animations Feature Sartre, Freud, Aristophanes, Dawkins & More

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Follow her @AyunHalliday.
