Neurons as Art: See Beautiful Anatomy Drawings by the Father of Neuroscience, Santiago Ramón y Cajal

Art depends on popular judgments about the universe, and is nourished by the limited expanse of sentiment. . . . In contrast, science was barely touched upon by the ancients, and is as free from the inconsistencies of fashion as it is from the fickle standards of taste. . . . And let me stress that this conquest of ideas is not subject to fluctuations of opinion, to the silence of envy, or to the caprices of fashion that today repudiate and detest what yesterday was praised as sublime.

- Santiago Ramón y Cajal

The above drawing is the sort of sublime rendering that attracts throngs of visitors to the world’s great modern art museums, but that’s not the sort of renown the artist, Nobel Prize-winning father of modern neuroscience Santiago Ramón y Cajal (1852–1934), actively sought.

Or rather, he might have back before his father, a professor of anatomy, coerced his wild young son into transferring from a provincial art academy to the medical school where he himself was employed.

After a stint as an army medical officer, the artist-turned-anatomist concentrated on inflammation, cholera, and epithelial cells before zeroing in on his true muse—the central nervous system.

At the time, reticular theory, which held that everything in the nervous system was part of a single continuous network, prevailed.

Ramón y Cajal was able to disprove this widely held belief by using Golgi stains to demonstrate the existence of individual nerve cells—neurons—that, while not physically connected, communicated with each other through a system of axons, dendrites, and synapses.

He called upon both his artistic and medical training in documenting what he observed through his microscope. His meticulous freehand drawings are far more accurate than anything that could be produced with the photomicrography equipment available at the time.

His precision was such that his illustrations continue to be published in medical textbooks. Further research has confirmed many of his suppositions.

As art critic Roberta Smith writes in The New York Times, the drawings are “fairly hard-nosed fact if you know your science”:

If you don’t, they are deep pools of suggestive motifs into which the imagination can dive. Their lines, forms and various textures of stippling, dashes and faint pencil circles would be the envy of any modern artist. That they connect with Surrealist drawing, biomorphic abstraction and exquisite doodling is only the half of it.

The drawings’ pragmatic titles certainly take on a poetic quality when one considers the context of their creation:

Axon of Purkinje neurons in the cerebellum of a drowned man

The hippocampus of a man three hours after death

Glial cells of the cerebral cortex of a child

His specimens were not limited to the human world:

Retina of lizard

The olfactory bulb of the dog

In his book Advice for a Young Investigator, Ramón y Cajal took a holistic view of the relationship between science and the arts:

The investigator ought to possess an artistic temperament that impels him to search for and admire the number, beauty, and harmony of things; and—in the struggle for life that ideas create in our minds—a sound critical judgment that is able to reject the rash impulses of daydreams in favor of those thoughts most faithfully embracing objective reality.

Explore more of Ramón y Cajal’s cellular drawings in Beautiful Brain: The Drawings of Santiago Ramón y Cajal, the companion book to a recent traveling exhibition of his work. Or immerse yourself at the neural level by ordering a reproduction on a beach towel, yoga mat, cell phone case, shower curtain, or other necessity on Science Source.

Related Content:

Ernst Haeckel’s Sublime Drawings of Flora and Fauna: The Beautiful Scientific Drawings That Influenced Europe’s Art Nouveau Movement (1889)

Leonardo da Vinci’s Visionary Notebooks Now Online: Browse 570 Digitized Pages

Two Million Wondrous Nature Illustrations Put Online by The Biodiversity Heritage Library

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in New York City April 15 for the next installment of her book-based variety show, Necromancers of the Public Domain. Follow her @AyunHalliday.

Behold an Anatomically Correct Replica of the Human Brain, Knitted by a Psychiatrist

Our brains dictate our every move.

They’re the ones who spur us to study hard, so we can make something of ourselves, in order to better our communities.

They name our babies, choose our clothes, decide what we’re hungry for.

They make and break laws, organize protests, fritter away hours on social media, and give us the green light to binge watch a bunch of dumb shows when we could be reading War and Peace.

They also plant the seeds for Fitzcarraldo-like creative endeavors that take over our lives and generate little to no income.

We may describe such endeavors as a labor of love, into which we’ve poured our entire heart and soul, but think for a second.

Who’s really responsible here?

The heart, that muscular fist-sized Valentine, content to just pump-pump-pump its way through life, lub-dub, lub-dub, from cradle to grave?

Or the brain, a crafty Iago of an organ, possessor of billions of neurons, complex, contradictory, a mystery we’re far from unraveling?

Psychiatrist Dr. Karen Norberg’s brain has steered her to study such heavy-duty subjects as the daycare effect, the rise in youth suicide, and the risk of prescribing selective serotonin reuptake inhibitors as a treatment for depression.

On a lighter note, it also told her to devote nine months to knitting an anatomically correct replica of the human brain.

(Twelve, if you count three months of research before casting on.)

How did her brain convince her to embark on this madcap assignment?

Easy. It arranged for her to be in the middle of a more prosaic knitting project, then goosed her into noticing how the ruffles of that project resembled the wrinkles of the cerebral cortex.

Coincidence?

Not likely. Especially when one of the cerebral cortex's most important duties is decision making.

As she explained in an interview with The Telegraph, brain development is not unlike the growth of a knitted piece:

You can see very naturally how the 'rippling' effect of the cerebral cortex emerges from properties that probably have to do with nerve cell growth. In the case of knitting, the effect is created by increasing the number of stitches in each row.

Dr. Norberg—who, yes, has on occasion referred to her project as a labor of love—told Scientific American that such a massive crafty undertaking appealed to her sense of humor because “it seemed so ridiculous and would be an enormously complicated, absurdly ambitious thing to do.”

That’s the point at which many people’s brains would give them permission to stop, but Dr. Norberg and her brain persisted, pushing past the hypothetical, creating colorful individual structures that were eventually sewn into two cuddly hemispheres that can be joined with a zipper.

(She also let slip that her brain—by which she means the knitted one, though the observation certainly holds true for the one in her head—is female, due to its robust corpus callosum, the “tough body” whose millions of fibers promote communication and connection.)

via The Telegraph

Related Content:

A Massive, Knitted Tapestry of the Galaxy: Software Engineer Hacks a Knitting Machine & Creates a Star Map Featuring 88 Constellations

Jazz Musician Plays Acoustic Guitar While Undergoing Brain Surgery, Helping Doctors Monitor Their Progress

How Meditation Can Change Your Brain: The Neuroscience of Buddhist Practice

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Join her in New York City for the next installment of her book-based variety show, Necromancers of the Public Domain, this April. Follow her @AyunHalliday.

Why We Dance: An Animated Video Explains the Science Behind Why We Bust a Move

Has any culture, apart from that of the tiny Utah town in Footloose, done entirely without dancing? It would at first seem that any human need the rhythmic shaking of one's limbs to organized sound fulfills must reside pretty low on the overall priority scale, but anthropology tells us that various human societies started dancing before they got into most every other activity that fills their time today. "Why is this ostensibly frivolous act so fundamental to being human?" asks the Aeon video above. "The answer, it seems, is in our need for social cohesion — that vital glue that keeps societies from breaking apart despite interpersonal differences."

Directed and animated by Rosanna Wan and Andrew Khosravani, the four-minute explainer frames our deep, culture-transcending need to "bust a move" in terms of the work of late 19th- and early 20th-century French sociologist Émile Durkheim and more recent research performed by Bronwyn Tarr, an Oxford evolutionary biologist who also happens to be a dancer herself.

Durkheim posited the phenomenon of "collective effervescence," or "a sort of electricity," or "that exhilaration, almost euphoria, that overtakes groups of people united by a common purpose, pursuing an intensely involving activity together." When you feel it, you feel "a flow, a sense that your self is melding with the group as a whole." And has any practice generated as much collective effervescence throughout human history as dance?

Modern science has shed a bit of light on why: Tarr has found that "we humans have a natural tendency to synchronize our movements with other humans," thanks to a region in the brain which helps us make the same movements we see others making. "When we mimic our partner's movements, and they're mimicking ours, similar neural networks in both brains open up a rush of neurohormones, all of which make us feel good." Listening to music "can create such a euphoric delight that it appears to activate opioid receptors in the brain," making it even harder to resist getting up and dancing. "They said he'd never win," Footloose's tagline said of the movie's big-city teen intent on getting the town dancing again, but "he knew he had to" — an assurance that turns out to have had a basis in neurology.

Related Content:

Animated Introductions to Three Sociologists: Durkheim, Weber & Adorno

The Strange Dancing Plague of 1518: When Hundreds of People in France Could Not Stop Dancing for Months

The Addams Family Dance to The Ramones’ “Blitzkrieg Bop”

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

How Does the Rorschach Inkblot Test Work?: An Animated Primer

A frightening monster?

Two friendly bears?

Say what!?

As anybody with half a brain and the gift of sight knows, the black and red inkblot below resembles nothing so much as a pair of gnomes, gavotting so hard their knees bleed.

...or perhaps it’s open to interpretation.

Back in 2013, when Open Culture celebrated psychologist Hermann Rorschach’s birthday by posting the ten blots that form the basis of his famous personality test, readers reported seeing all sorts of things in Card 2:

A uterus

Lungs

Kissing puppies

A painted face

Little calves

Tinkerbell checking her butt out in the mirror

Two ouija board enthusiasts, summoning demons

Angels

And yes, high-fiving bears

As Rorschach biographer Damion Searls explains in an animated TED-Ed lesson on how the Rorschach Test can help us understand the patterns of our perceptions, our answers depend on how we as individuals register and transform sensory input.

Rorschach chose the blots that garnered the most nuanced responses, and developed a classification system to help analyze the resulting data, but for much of the test’s history, this code was a closely guarded professional secret.

And when Rorschach died, a year after publishing the images, others began administering the test in service of their own speculative goals—anthropologists, potential employers, researchers trying to figure out what made Nazis tick, comedians…

The range of interpretative approaches earned the test a reputation as pseudoscience, but a 2013 review of Rorschach’s voluminous research went a long way toward restoring its credibility.

Whether or not you believe there’s something to it, it’s still fun to consider the things we bring to the table when examining these cards.

Do we see the image as fixed or something more akin to a freeze frame?

What part of the image do we focus on?

Our records show that Open Culture readers overwhelmingly focus on the hands, at least as far as Card 2 goes, which is to say the portion of the blot that appears to be high-fiving itself.

Never mind that the high five, as a gesture, is rumored to have come into existence sometime in the late 1970s. (Rorschach died in 1922.) That’s what the majority of Open Culture readers saw six years ago, though there was some variety of perception as to who was slapping that skin:

young elephants

despondent humans

monks

lawn gnomes

Disney dwarves

redheaded women in Japanese attire

chimpanzees with traffic cones on their heads

(In full disclosure, it's mostly bears.)

Maybe it's time for a do-over?

Readers, what do you see now?

Image 1: Bat, butterfly, moth

Image 2: Two humans

Image 3: Two humans

Image 4: Animal hide, skin, rug

Image 5: Bat, butterfly, moth

Image 6: Animal hide, skin, rug

Image 7: Human heads or faces

Image 8: Animal; not cat or dog

Image 9: Human

Image 10: Crab, lobster, spider

View Searls’ full TED-Ed lesson here.

Related Content:

Hermann Rorschach’s Original Rorschach Test: What Do You See? (1921)

The Psychological & Neurological Disorders Experienced by Characters in Alice in Wonderland: A Neuroscience Reading of Lewis Carroll’s Classic Tale

Introduction to Psychology: A Free Course from Yale University


The Cringe-Inducing Humor of The Office Explained with Philosophical Theories of Mind

"I'm a friend first and a boss second," says David Brent, middle manager at the Slough branch of paper company Wernham-Hogg. "Probably an entertainer third." Those of us who've watched the original British run of The Office — and especially those of us who still watch it regularly — will remember that and many other of Brent's pitiable declarations besides. As portrayed by the show's co-creator Ricky Gervais, Brent constitutes both The Office's comedic and emotional core, at once a fully realized character and someone we've all known in real life. His distinctive combination of social incompetence and an aggressive desperation to be liked provokes in us not just laughter but a more complex set of emotions as well, resulting in one expression above all others: the cringe.

"In David Brent, we have a character so invested in the performance of himself that he's blocked his own access to others' feelings." So goes the analysis of Evan Puschak, a.k.a. the Nerdwriter, in his video interpreting the humor of The Office through philosophical theories of mind.

The elaborate friend-boss-entertainer song-and-dance Brent constantly puts on for his co-workers so occupies him that he lacks the ability or even the inclination to have any sense of what they're thinking. "The irony is that Brent can't see that a weak theory of mind always makes for a weak self-performance. You can't brute force your preferred personality onto another's consciousness: it takes two to build an identity."

Central though Brent is to The Office, we laugh not just at what he says and does, but how the other characters (which Puschak places across a spectrum of ability to understand the minds of others) react — or fail to react — to what he says and does, how he reacts to their reactions, and so on. Mastery of the comedic effects of all this has kept the original Office effective more than fifteen years later, though its effect may not be entirely pleasurable: "A lot of people say that cringe humor like this is hard to watch," says Puschak, "but in the same way that under our confidence, in theory of mind, lies an anxiety, I think that under our cringing there's actually a deep feeling of relief." When Brent and others fail to connect, their "body language speaks in a way that is totally transparent: in that moment the embarrassment is not only palpable, it's palpably honest." And it reminds us that — if we're being honest — none of us are exactly mind-readers ourselves.

You can get the complete British run of The Office on Amazon here.

Related Content:

Ricky Gervais Presents “Learn Guitar with David Brent”

The Philosophy of Bill Murray: The Intellectual Foundations of His Comedic Persona

A Romp Through the Philosophy of Mind: A Free Online Course from Oxford


Sleep or Die: Neuroscientist Matthew Walker Explains How Sleep Can Restore or Imperil Our Health

Wouldn’t it be nice if we could fix the work/life thing by chucking out the difference? At home, you're in the office; at the office, you're at home—always on and never off, sleep optional. Two to four hours per 24-hour cycle should be enough, right? Wrong. We need proper sleep like we need good food, low stress, engaging pursuits, etc.—to thrive and live a long and happy life. If you wait until you’re dead to sleep, you’ll be dead sooner than you think. “Short sleep predicts a shorter life,” explains sleep researcher Matthew Walker in the RSA animation Sleep or Die, above. "Sleep," he says, "is a non-negotiable biological necessity."

The National Sleep Foundation recommends that adults sleep an average of eight hours a night. That number may vary from person to person, but fewer than six can be highly detrimental. Walker is something of a “sleep evangelist,” notes Berkeley News. Ask him about “the downside of pulling an all-nighter, and he'll rattle off a list of ill effects that range from memory loss and a compromised immune system to junk food cravings and wild mood swings.” The neuroscientist tells Terry Gross on Fresh Air, “Every disease that is killing us in developed nations has causal and significant links to a lack of sleep.”

Walker has a lot more to say about sleep in the interview below, including tips for getting there, whether you can make up for lost sleep (you can’t), and why you shouldn’t yank teenagers out of bed on the weekends. Why should we listen to him? Well, he isn’t just any sleep scientist. “To be specific,” writes Rachel Cooke at The Guardian, “he is the director of the Center for Human Sleep Science at the University of California, a research institute whose goal—possibly unachievable—is to understand everything about sleep’s impact on us, from birth to death, in sickness and health.”


The benefits of sound sleep include enhanced creativity and concentration, lower blood pressure, better mood regulation, and higher immunity and fertility. Lack of sleep, however, is "increasing our risk of cancer, heart attack and Alzheimer’s," notes Cooke. Indeed, "after just one night of only four or five hours’ sleep," Walker tells The Guardian, "your natural killer cells—the ones that attack the cancer cells that appear in your body every day—drop by 70%." Sleep deprivation has such serious outcomes that "the World Health Organisation has classed any form of night-time shift work as a probable carcinogen."

Sleep holds many mysteries, but one thing scientists like Walker seem to know: poor sleep leaves us more in sickness than in health. And we are in the midst of a “catastrophic sleep-loss epidemic.” “No one would look at an infant baby asleep, and say ‘What a lazy baby!’” Walker observes. Yet adults have “stigmatized sleep with the label of laziness. We want to seem busy, and one way we express that is by proclaiming how little sleep we’re getting.” It’s a way to broadcast that we aren’t falling behind or missing out. But our bodies’ natural cycles and rhythms don’t speed up along with technology and global markets.

“As bedrooms everywhere glow from the screens of round-the-clock technology consumption,” Berkeley News writes, millions of people suffer physical, emotional, cognitive, and psychological stresses. Or, put more positively, “a growing body of scientific work” shows that “a solid seven to nine hours of sleep a night serves functions beyond our wildest imaginations.” Learn more about not only what’s gone wrong with sleep, but how to start addressing the problem in Walker’s book Why We Sleep: Unlocking the Power of Sleep and Dreams.

Related Content:

Bertrand Russell’s Advice For How (Not) to Grow Old: “Make Your Interests Gradually Wider and More Impersonal”

Brian Eno Lists the Benefits of Singing: A Long Life, Increased Intelligence, and a Sound Civilization

10 Longevity Tips from Dr. Shigeaki Hinohara, Japan’s 105-Year-Old Longevity Expert

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness.

A Brief Animated Introduction to Noam Chomsky’s Linguistic Theory, Narrated by The X-Files‘ Gillian Anderson

How is it that children just entering toddlerhood pick up the structure of their respective languages with ease? They are not formally taught to use speech; they have limited cognitive abilities and a “poverty of stimulus,” given their highly circumscribed environments. And yet, they learn the function and order of subjects, verbs, and objects, and learn to recognize improper usage. Children might make routine mistakes, but they understand and can be understood from a very early age, and for the most part without very much difficulty. How?

These are the questions that confronted Noam Chomsky in the early years of his career in linguistics. His answers produced a theory of Universal Grammar in the 1960s, and for decades it has been the field's dominant theory, initiating what is often referred to as the “Chomskyan Era,” a phrase the man himself dislikes but which nonetheless sums up the kinds of issues that have been at stake in linguistics for over fifty years.

Questions about language acquisition have always been the subject of intense philosophical speculation. They were folded into general theories of epistemology, like Plato’s theory of forms or John Locke’s so-called “blank slate” hypothesis. Variations on these positions surface in different forms throughout Western intellectual history. Descartes picks up Plato’s dualism, arguing that humans speak and animals don’t because of the existence of an immortal “rational soul.” Behaviorist B.F. Skinner suggests that operant conditioning writes language onto a totally impressionable mind. (“Give me a child,” said Skinner, “and I will shape him into anything.”)

Chomsky “gave a twist” to this age-old debate over the existence of innate ideas, as Gillian Anderson tells us in the animated video above from BBC 4’s History of Ideas series. Chomsky’s theory is biolinguistic: it situates language acquisition in the structures of the brain. Not being himself a neurobiologist, he talks of those theoretical structures, responsible for reproducing accurate syntax, as a metaphorical “language acquisition device” (LAD), a hardwired faculty that separates the human brain from that of a dog or cat.

Chomsky’s theory has little to do with the content of language, but rather with its structure, which he says is universally encoded in our neural architecture. Children, he writes, “develop language because they’re pre-programmed to do this.” Syntax is prior to and independent of specific meaning, a point he demonstrated with the poetic sentence “Colorless green ideas sleep furiously.” Every English speaker can recognize the sentence as grammatical, even very small children, though it refers to no real objects and would never occur in conversation.

Conversely, we recognize “Furiously sleep ideas green colorless” as ungrammatical, though it means no more nor less than the first sentence. The regional variations on word order only underline his point since, in every case, children quickly understand how to use the version they’re presented with at roughly the same developmental age and in the same way. The existence of a theoretical Language Acquisition Device solves the chicken-and-egg problem of how children with no understanding of, and only very limited exposure to, language can learn to speak just by listening.

Chomsky’s theory was revolutionary in large part because it was testable, and researchers at the professor’s longtime employer, MIT, recently published evidence of a “language universal” they discovered in a comparative study of 37 languages. It's compelling research that just might anticipate the discovery of a physical Language Acquisition Device, or its neurobiological equivalent, in every human brain.

Related Content:

The Ideas of Noam Chomsky: An Introduction to His Theories on Language & Knowledge (1977)

Noam Chomsky Defines What It Means to Be a Truly Educated Person

5 Animations Introduce the Media Theory of Noam Chomsky, Roland Barthes, Marshall McLuhan, Edward Said & Stuart Hall

