Alice in Wonderland Syndrome: The Real Perceptual Disorder That May Have Shaped Lewis Carroll’s Creative World

Alice’s Adventures in Wonderland isn’t just a beloved children’s story: it’s also a neuropsychological syndrome. Or rather, the words “Alice in Wonderland,” as Lewis Carroll’s book is commonly known, have also become attached to a condition that, though not harmful in itself, causes distortions in the sufferer’s perception of reality. Other names include dysmetropsia or Todd’s syndrome, the latter of which pays tribute to the consultant psychiatrist John Todd, who defined the disorder in 1955. He described his patients as seeing some objects as much larger than they really were and other objects as much smaller, resulting in challenges not entirely unlike those faced by Alice when put by Carroll through her growing-and-shrinking paces.

Todd also suggested that Carroll had written from experience, drawing inspiration from the hallucinations he experienced when afflicted with what he called “bilious headache.” The transformations Alice feels herself undergoing after she drinks from the “DRINK ME” bottle and eats the “EAT ME” cake are now known, in the neuropsychological literature, as macropsia and micropsia.

“I was in the kitchen talking to my wife,” writes novelist Craig Russell of one of his own bouts of the latter. “I was hugely animated and full of energy, having just put three days’ worth of writing on the page in one morning and was bursting with ideas for new books. Then, quite calmly, I explained to my wife that half her face had disappeared. As I looked around me, bits of the world were missing too.”

Though “many have speculated that Lewis Carroll took some kind of mind-altering drug and based the Alice books on his hallucinatory experiences,” writes Russell, “the truth is that he too suffered from the condition, but in a more severe and protracted way,” combined with ocular migraine. Russell also notes that the sci-fi visionary Philip K. Dick, though “never diagnosed as suffering from migrainous aura or temporal lobe epilepsy,” left behind a body of work that has given rise to “a growing belief that the experiences he described were attributable to the latter, particularly.” Suitably, classic Alice in Wonderland syndrome “tends to be much more common in childhood” and disappear in maturity. One sufferer documented in the scientific literature is just six years old, younger even than Carroll’s eternal little girl — presumably, an eternal seer of reality in her own way.

Related Content:

A Beautiful 1870 Visualization of the Hallucinations That Come Before a Migraine

Behold Lewis Carroll’s Original Handwritten & Illustrated Manuscript for Alice’s Adventures in Wonderland (1864)

Lewis Carroll’s Photographs of Alice Liddell, the Inspiration for Alice in Wonderland

Ralph Steadman’s Warped Illustrations of Alice’s Adventures in Wonderland on the Story’s 150th Anniversary

Alice’s Adventures in Wonderland, Illustrated by Salvador Dalí in 1969, Finally Gets Reissued

Curious Alice — The 1971 Anti-Drug Movie Based on Alice in Wonderland That Made Drugs Look Like Fun

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

Why Do We Dream?: An Animated Lesson

Why do we dream? It’s a question science still can’t answer, says the TED-Ed lesson above by Amy Adkins. Many neuroscientists currently make sense of dreaming as a way for the brain to consolidate memory at night. “This may include reorganizing and recoding memories in relation to emotional drives,” writes computational neuroscientist Paul King, “as well as transferring memories between brain regions.” You might imagine a defragging hard drive, the sorting and filing process happening while a computer sleeps.

But the brain is not a computer. Important questions remain. Why do dreams have such a powerful hold on us, not only individually, but — as a recent project collecting COVID dreams explores — collectively? Are dreams no more than gibberish, the mental detritus of the day, or do they convey important messages to our conscious minds? Several millennia before Freud’s The Interpretation of Dreams, “Mesopotamian kings recorded and interpreted their dreams on wax tablets.” A thousand years later, Egyptians catalogued one hundred of the most common dreams and their meanings in a dream book.

The ancients were convinced their dreams carried messages from beyond their consciousness. Many modern theorists, beginning with Freud, have seen dreams as purely self-referential and neurotic. “We dream,” the lesson notes, “to fulfill our wishes.” Instead of messages from the gods, dreams are symbolic communication from unconscious repressed drives. Or, “we dream to remember,” as some contemporary neuroscientists claim, or “we dream to forget,” as a neurobiological theory called “reverse learning” argued in 1983. Dreams are exercises for the brain, rehearsals, nighttime problem solving … the lesson touches briefly on each of these theories in turn.

But whatever answers science provides will hardly satisfy human curiosity about the content of our dreams. For this, perhaps, we should look elsewhere. We might turn, for example, to the Museum of Dreams, “a hub for exploring the social and political significance of dream-life.” Philosophical and scientific theories of dreaming are all speculative. “Rather than seek a definitive explanation, the Museum’s goal is to explore the generative and performative nature of dream-life — all the remarkable ways people have put their dreams to work.” Before we share and, yes, interpret our dreams with others, they remain, in Toni Morrison’s words, “unspeakable things unspoken.”

Related Content:

Do Our Dreams Predict the Future? Vladimir Nabokov Spent Three Months Testing That Theory in 1964

Do Octopi Dream? An Astonishing Nature Documentary Suggests They Do

Watch Dreams That Money Can Buy, a Surrealist Film by Man Ray, Marcel Duchamp, Alexander Calder, Fernand Léger & Hans Richter

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Alan Alda: 3 Ways to Express Your Thoughts So That Everyone Will Understand You

In need of someone to perform surgery in a combat zone, you probably wouldn’t choose Alan Alda, no matter how many times you’ve seen him do it on television. This sounds obvious to those of us who believe that actors don’t know how to do anything at all. But a performer like Alda doesn’t become a cultural icon by accident: his particular skill set has enabled him not just to communicate with millions at a time through film and television, but also to navigate his offscreen personal life with a certain adeptness. In the Big Think video above, he reveals three of his own long-relied-upon strategies to “express your thoughts so that everyone will understand you.”

“I don’t really like tips,” Alda declares. Standard public-speaking advice holds that you should “vary the pace of your speech, vary the volume,” for example, but while sound in themselves, those strategies executed mechanically get to be “kind of boring.” Rather than operating according to a fixed playbook, as Alda sees it, your variations in pace and volume — or your gestures, movements around the stage, and everything else — should occur organically, as a product of “how you’re talking and relating” to your audience. A skilled speaker doesn’t follow rules per se, but gauges and responds dynamically to the listener’s understanding even as he speaks.

But if pressed, Alda can provide three tips “that I do kind of follow.” These he calls “the three rules of three”: first, “I try only to say three important things when I talk to people”; second, “If I have a difficult thing to understand, if there’s something I think is not going to be easy to get, I try to say it in three different ways”; third, “I try to say it three times through the talk.” He gets deeper into his personal theories of communication in the second video below, beginning with a slightly contrarian defense of jargon: “When people in the same profession have a word that stands for five pages of written knowledge, why say five pages of stuff when you can say one word?” The trouble comes when words get so specialized that they hinder communication between people of different professions.

At its worst, jargon becomes a tool of dominance: “I’m smart; I talk like this,” its users imply, “You can’t really talk like this, so you’re not as smart as me.” But when we actively simplify our language to communicate to the broadest possible audience, we can discover “what are the concepts that really matter” beneath the jargon. All the better if we can tell a dramatic story to illustrate our point, as Alda does at the end of the video. It involves a medical student conveying a patient’s diagnosis more effectively than his supervisor, all thanks to his experience with the kind of “mirroring” exercises familiar to every student of acting. A doctor who can communicate is always preferable to one who can’t; even a real-life Hawkeye, after all, needs to make himself understood once in a while.

Related Content:

Alan Alda Uses Improv to Teach Scientists How to Communicate Their Ideas

What Is a Flame?: The First Prize-Winner at Alan Alda’s Science Video Competition

How to Speak: Watch the Lecture on Effective Communication That Became an MIT Tradition for Over 40 Years

Charles & Ray Eames’ A Communications Primer Explains the Key to Clear Communication in the Modern Age (1953)

Erich Fromm’s Six Rules of Listening: Learn the Keys to Understanding Other People from the Famed Psychologist

How to Get Over the Anxiety of Public Speaking?: Watch the Stanford Video, “Think Fast, Talk Smart,” Viewed Already 15 Million Times

Make Body Language Your Superpower: A 15-Minute Primer on Body Language & Public Speaking from Stanford Business School

A few years ago, the idea of “power poses” — that is, physical stances that increase the dynamism of one’s personality — gained a great many adherents in a very short time, but not long thereafter emerged doubts as to its scientific soundness. Nevertheless, while standing with your hands on your hips may not change who you are, we can fairly claim that such a thing as body language does exist. And in that language, certain bodily arrangements communicate better messages than others: according to the presenters of the talk above, keeping your hands power-poseishly on your hips is actually a textbook bad public-speaking position, down there with shoving them in your pockets or clasping them before you in the dreaded “fig leaf.”

Now viewed well over 5.5 million times, “Make Body Language Your Superpower” was originally delivered as the final project of a team of graduate students at Stanford’s Graduate School of Business. That same institution gave us lecturer Matt Abrahams’ talk “Think Fast, Talk Smart,” which, with its 23 million views and counting, suggests its campus possesses a veritable fount of public-speaking wisdom.

Working as a team, these students keep it short and simple, accompanying their talk with takeaway-announcing PowerPoint slides (“1. Posture breeds success, 2. Gestures strengthen our message, 3. The audience’s body matters too”) and even a video clip that vividly illustrates what not to do: in this case, with a fidgety, rotation-heavy turn on stage by Armageddon and Transformers auteur Michael Bay.

Though we can’t hear what Bay is saying, we couldn’t be blamed for assuming it’s not the truth. That owes not so much to the Hollywood penchant for dissimulation and hyperbole as it does to his particular stances, gestures, and perambulations, all of a kind that primes our subconscious to expect lies. “We all want to avoid our own Michael Bay moments when we communicate,” says one of the presenters, but even when we take pains to tell the truth, the whole truth, and nothing but the truth, the defensive postures into which many of us instinctively retreat can undercut our efforts. “Decoding Deceptive Body Language,” the talk just above, can help us learn both to identify the impression of dishonesty and to avoid giving it ourselves. Not that it’s always easy: as the example of Bill Clinton underscores in both these presentations, even master communicators have their slip-ups.

Related Content:

How to Get Over the Anxiety of Public Speaking?: Watch the Stanford Video, “Think Fast, Talk Smart,” Viewed Already 15 Million Times

How to Speak: Watch the Lecture on Effective Communication That Became an MIT Tradition for Over 40 Years

Can You Spot Liars Through Their Body Language? A Former FBI Agent Breaks Down the Clues in Non-Verbal Communication

How to Spot Bullshit: A Primer by Princeton Philosopher Harry Frankfurt

How to Sound Smart in a TED Talk: A Funny Primer by Saturday Night Live‘s Will Stephen

Carl Jung Offers an Introduction to His Psychological Thought in a 3-Hour Interview (1957)

In the 1950s, it was fashionable to drop Freud’s name — often as not in pseudo-intellectual sex jokes. Freud’s preoccupations had as much to do with his fame as the actual practice of psychotherapy, and it was assumed — and still is to a great degree — that Freud had “won” the debate with his former student and friend Carl Jung, who saw religion, psychedelic drugs, occult practices, etc. as valid forms of individualizing and integrating human selves — selves that were after all, he thought, connected by far more than biological drives for sex and death.

Now Jung’s insights permeate the culture, in increasingly popular fields like transpersonal psychology, for example, that see humans as “radically interconnected, not just isolated individuals,” psychologist Harris L. Friedman argues. Movements like these grew out of the “counterculture movements of the 1960s,” psychology lecturer and author Steve Taylor explains, “and the wave of psycho-experimentation it involved, through psychedelic substances, meditation and other consciousness changing practices” — the very practices Jung explored in his work.

Indeed, Jung was the first “to legitimize a spiritual approach to the practice of depth psychology,” Mark Kasprow and Bruce Scotton point out, and “suggested that psychological development extends to include higher states of consciousness and can continue throughout life, rather than stop with the attainment of adult ego maturation.” Against Freud, who thought transcendence was regression, Jung “proposed that transcendent experience lies within and is accessible to everyone, and that the healing and growth stimulated by such experience often make use of the languages of symbolic imagery and nonverbal experience.”

Jung’s work became increasingly important after his death in 1961, leading to the publication of his collected works in 1969. These introduced readers to all of his “key concepts and ideas, from archetypal symbols to analytical psychology to UFOs,” notes a companion guide. Near the end of his life, Jung himself provided a verbal survey of his life’s work in the form of four one-hour interviews conducted in 1957 by University of Houston’s Dr. Richard Evans at the Eidgenössische Technische Hochschule (Federal Institute of Technology) in Zurich.

“The conversations were filmed as part of an educational project designed for students of the psychology department. Evans is a poor interviewer, but Jung compensates well,” the Gnostic Society Library writes. The edited interviews begin with a question about Jung’s concept of persona (also, incidentally, the theme and title of Ingmar Bergman’s 1966 masterpiece). In response, Jung describes the persona in plain terms and with everyday examples as a fictional self “partially dictated by society and partially dictated by the expectations or the wishes one nurses oneself.”

The less we’re consciously aware of our public selves as performances in these terms, the more we’re prone, Jung says, to neuroses, as the pressure of our “shadow” exerts itself. Jung and Evans’ discussion of persona only grazes the surface of their wide-ranging conversation about the unconscious and the many ways to access it. Throughout, Jung’s examples are clear and his explanations lucid. Above, you can see a transcribed video of the same interviews. Read a published transcript in the collection C.G. Jung Speaking, and see more Jung interviews and documentaries at the Gnostic Society Library.

Related Content: 

Zen Master Alan Watts Explains What Made Carl Jung Such an Influential Thinker

The Visionary Mystical Art of Carl Jung: See Illustrated Pages from The Red Book

How Carl Jung Inspired the Creation of Alcoholics Anonymous

Take an Intellectual Odyssey with a Free MIT Course on Douglas Hofstadter’s Pulitzer Prize-Winning Book Gödel, Escher, Bach: An Eternal Golden Braid

In 1979, mathematician Kurt Gödel, artist M.C. Escher, and composer J.S. Bach walked into a book title, and you may well know the rest. Douglas R. Hofstadter won a Pulitzer Prize for Gödel, Escher, Bach: an Eternal Golden Braid, his first book, thenceforth (and henceforth) known as GEB. The extraordinary work is not a treatise on mathematics, art, or music, but an essay on cognition through an exploration of all three — and of formal systems, recursion, self-reference, artificial intelligence, etc. Its publisher settled on the pithy description, “a metaphorical fugue on minds and machines in the spirit of Lewis Carroll.”

GEB attempted to reveal the mind at work; the minds of extraordinary individuals, for sure, but also all human minds, which behave in similarly unfathomable ways. One might also describe the book as operating in the spirit — and the practice — of Hermann Hesse’s Glass Bead Game, a novel Hesse wrote in response to the data-driven machinations of fascism and their threat to an intellectual tradition he held particularly dear. An alternate title (and key phrase in the book), Magister Ludi, puns on both “game” and “school,” and alludes to the importance of play and free association in the life of the mind.

Hesse’s esoteric game, writes his biographer Ralph Freedman, consists of “contemplation, the secrets of the Chinese I Ching and Western mathematics and music” and seems similar enough to Hofstadter’s approach and that of the instructors of MIT’s open course, Gödel, Escher, Bach: A Mental Space Odyssey. Offered through the High School Studies Program as a non-credit enrichment course, it promises “an intellectual vacation” through “Zen Buddhism, Logic, Metamathematics, Computer Science, Artificial Intelligence, Recursion, Complex Systems, Consciousness, Music and Art.”

Students will not study directly the work of Gödel, Escher, and Bach but rather “find their spirits aboard our mental ship,” the course description notes, through contemplations of canons, fugues, strange loops, and tangled hierarchies. How do meaning and form arise in systems like math and music? What is the relationship of figure to ground in art? “Can recursion explain creativity?” asks one of the course notes. Hofstadter himself has pursued the question beyond the entrenchment of AI research in big data and brute force machine learning. For all his daunting erudition and challenging syntheses, we must remember that he is playing a highly intellectual game, one that replicates his own experience of thinking.

Hofstadter suggests that before we can understand intelligence, we must first understand creativity. It may reveal its secrets in comparative analyses of the highest forms of intellectual play, where we see the clever formal rules that govern the mind’s operations; the blind alleys that explain its failures and limitations; and the possibility of ever actually reproducing its workings in a machine. Watch the lectures above, grab a copy of Hofstadter’s book, and find course notes, readings, and other resources for the fascinating course Gödel, Escher, Bach: A Mental Space Odyssey archived here. The course will be added to our list, 1,700 Free Online Courses from Top Universities.

Related Content: 

How a Bach Canon Works. Brilliant.

Mathematics Made Visible: The Extraordinary Mathematical Art of M.C. Escher

The Mirroring Mind: An Espresso-Fueled Interpretation of Douglas Hofstadter’s Groundbreaking Ideas

Are We All Getting More Depressed?: A New Study Analyzing 14 Million Books, Written Over 160 Years, Finds the Language of Depression Steadily Rising

The relations between thought, language, and mood have become subjects of study for several scientific fields of late. Some of the conclusions seem to echo religious notions from millennia ago. “As a man thinketh, so he is,” for example, proclaims a famous verse in Proverbs (one that helped spawn a self-help movement in 1903). Positive psychology might agree. “All that we are is the result of what we have thought,” says one translation of the Buddhist Dhammapada, a sentiment that cognitive behavioral therapy might endorse.

But the insights of these traditions — and of social psychology — also show that we’re embedded in webs of connection: we don’t only think alone; we think — and talk and write and read — with others. External circumstances influence mood as well as internal states of mind. Approaching these questions differently, researchers at the Luddy School of Informatics, Computing, and Engineering at Indiana University asked, “Can entire societies become more or less depressed over time?,” and is it possible to read collective changes in mood in the written languages of the past century or so?

The team of scientists, led by Johan Bollen, Indiana University professor of informatics and computing, took a novel approach that brings together tools from at least two fields: large-scale data analysis and cognitive-behavioral therapy (CBT). Since diagnostic criteria for measuring depression have only been around for the past 40 years, the question seemed to resist longitudinal study. But CBT provided a means of analyzing language for markers of “cognitive distortions” — thinking that skews in overly negative ways. “Language is closely intertwined with this dynamic” of thought and mood, the researchers write in their study, “Historical language records reveal a surge of cognitive distortions in recent decades,” published just last month in PNAS.

Choosing three languages, English (US), German, and Spanish, the team looked for “short sequences of one to five words (n-grams), labeled cognitive distortion schemata (CDS).” These words and phrases express negative thought processes like “catastrophizing,” “dichotomous reasoning,” “disqualifying the positive,” etc. Then, the researchers identified the prevalence of such language in a collection of over 14 million books published between 1855 and 2019 and uploaded to Google Books. The study controlled for language and syntax changes during that time and accounted for the increase in technical and non-fiction books published (though it did not distinguish between literary genres).
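The core of the method can be sketched in a few lines: match a fixed list of short phrases against a text and normalize the hit count by text length. The phrases below are illustrative stand-ins for the study’s actual CDS lexicon, which is far larger and was curated by clinicians.

```python
# Minimal sketch of CDS n-gram matching: count distortion-marker phrases
# per 10,000 words of text. The phrase list is invented for illustration
# and does NOT reproduce the lexicon used in the PNAS study.
import re

CDS_PHRASES = [
    "will never",        # catastrophizing
    "every single",      # overgeneralization
    "completely wrong",  # dichotomous reasoning
]

def cds_prevalence(text: str, per: int = 10_000) -> float:
    """Return CDS phrase matches per `per` words of text."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    joined = " ".join(words)  # normalized word stream for phrase matching
    hits = sum(joined.count(phrase) for phrase in CDS_PHRASES)
    return hits * per / len(words)

sample = "This will never work. I am completely wrong about every single thing."
print(cds_prevalence(sample))  # → 2500.0 (3 hits in 12 words)
```

Run over millions of books bucketed by publication year, a counter like this yields one prevalence number per year, which is the time series in which the researchers observed the “hockey stick.”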

What the scientists found in all three languages was a distinctive “‘hockey stick’ pattern” — a sharp uptick in the language of depression after 1980 and into the present time. The only spikes that come close on the timeline occur in English language books during the Gilded Age and books published in German during and immediately after World War II. (Highly interesting, if unsurprising, findings.) Why the sudden, steep climb in language signifying depressive thinking? Does it actually mark a collective shift in mood, or show how historically oppressed groups have had more access to publishing in the past forty years, and have expressed less satisfaction with the status quo?

While they are careful to emphasize that they “make no causal claims” in the study, the researchers have some ideas about what’s happened, observing for example:

The US surge in CDS prevalence coincides with the late 1970s when wages stopped tracking increasing work productivity. This trend was associated with rises in income inequality to recent levels not seen since the 1930s. This phenomenon has been observed for most developed economies, including Germany, Spain and Latin America.

Other factors cited include the development of the World Wide Web and its facilitation of political polarization, “in particular us-vs.-them thinking… dichotomous reasoning,” and other maladaptive thought patterns that accompany depression. The scale of these developments might be enough to explain a major collective rise in depression, but one commenter offers an additional gloss:

The globe is *Literally* on fire, or historically flooding – Multiple economic crashes barely decades apart – a ghost town of a housing market – a multi-year global pandemic – wealth concentration at the .01% level – terrible pay/COL equations – blocking unionization/workers rights – abusive militarized police, without the restraint or training of actual military –  You can’t afford X for a monthly mortgage payment!  Pay 1.5x for rent instead! – endless wars for the last… 30…years? 50 if we include stuff like Korea, Cold War, Vietnam… How far has the IMC been milking the gov for funds to make the rich richer? Oh, and a billionaire 3-way space race to determine who’s got the biggest “rocket”

These sound like reasons for global depression indeed, but the arrow could also go the other way: maybe catastrophic reasoning produced actual catastrophes; black and white thinking led to endless wars, etc. More study is needed, say Bollen and his colleagues, yet it seems probable, given the data, that “large populations are increasingly stressed by pervasive cultural, economic, and social changes” — changes occurring more rapidly, frequently, and with greater impact on our daily lives than ever before. Read the full study at PNAS.

Related Content: 

Stanford’s Robert Sapolsky Demystifies Depression, Which, Like Diabetes, Is Rooted in Biology

A Unified Theory of Mental Illness: How Everything from Addiction to Depression Can Be Explained by the Concept of “Capture”

Charles Bukowski Explains How to Beat Depression: Spend 3-4 Days in Bed and You’ll Get the Juices Flowing Again (NSFW)

B.F. Skinner Demonstrates His “Teaching Machine,” the 1950s Automated Learning Device

The name B.F. Skinner often provokes darkly humorous references to such bizarre ideas as “Skinner boxes,” which put babies in cage-like cribs, and put the cribs in windows as if they were air-conditioners, leaving the poor infants to raise themselves. Skinner was hardly alone in conducting experiments that flouted, if not flagrantly ignored, the ethical concerns now central to experimentation on humans. The code of conduct on torture and abuse that ostensibly governs members of the American Psychological Association did not yet exist. Radical behaviorists like Skinner were redefining the field. His work has come to stand for some of its worst abuses.

But Skinner has been mischaracterized in the popularization of his ideas — a popularization, it’s true, in which he enthusiastically took part. The actual “Skinner box” was cruel enough — an electrified cage for animal experimentation — but it was not the infant window box that often goes by the name. This was, instead, called an “aircrib” or “baby-tender,” and it was loaded with creature comforts like climate control and a complement of toys. “In our compartment,” Skinner wrote in a 1945 Ladies Home Journal article, “the waking hours are invariably active and happy ones.” Describing his first test subject, his own child, he wrote, “our baby acquired an amusing, almost apelike skill in the use of her feet.”

Skinner was not a soulless monster who put babies in cages, but he also did not understand mammalian babies’ need for physical touch. Likewise, when it came to education, Skinner had ideas that can seem contrary to what we know works best, namely a variety of methods that honor different learning styles and abilities. Educators in the 1950s embraced far more regimented practices, and Skinner believed humans could be trained just like other animals. He treated an early experiment in classroom technology just like an experiment teaching pigeons to play ping-pong. It was, in fact, “the foundation for his education technology,” says education journalist Audrey Watters, “that we’ll build machines and they’ll give students — just like pigeons — positive reinforcement and students — just like pigeons — will learn new skills.”

To this end, Skinner created what he called the Teaching Machine in 1954 while he taught psychology at Harvard. He was hardly the first to design such a device, but he was the first to invent a machine based on behaviorist principles, as Abhishek Solanki explains in a Medium article:

The teaching machine was composed of mainly a program, which was a system of combined teaching and test items that carried the student gradually through the material to be learned. The “machine” was composed of a fill-in-the-blank method on either a workbook or on a computer. If the student was correct, he/she got reinforcement and moved on to the next question. If the answer was incorrect, the student studied the correct answer to increase the chances of getting reinforced next time.

Consisting of a wooden box, a metal lid with cutouts, and various paper discs with questions and answers written on them, the machine did adjust for different students’ needs, in a way. Skinner “noted that the learning process should be divided into a large number of very small steps and reinforcement must be dependent upon the completion of each step. He believed this was the best possible arrangement for learning because it took into account the rate of learning for each individual student.” He was again inspired by his own children, coming up with the machine after visiting his daughter’s school and deciding he could improve on things.
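The logic described above — small steps, immediate feedback, progress frame by frame — can be captured in a toy sketch. The frames and wording here are invented for illustration; Skinner’s actual machine was a box of paper discs, not software.

```python
# Toy sketch of the "small steps + immediate reinforcement" loop,
# assuming a fill-in-the-blank format. Frames are invented examples,
# not taken from any actual teaching-machine program.

FRAMES = [
    # (prompt, expected answer) -- each frame is one small step
    ("Food delivered after a peck is a positive ____.", "reinforcer"),
    ("Skinner built his Teaching ____ in 1954.", "machine"),
]

def check(answer: str, response: str) -> bool:
    """Compare the student's response; reinforce or reveal the answer."""
    if response.strip().lower() == answer:
        print("Correct!")                    # immediate reinforcement
        return True
    print(f"The answer was: {answer}")       # student studies the correct answer
    return False

def run_program(responses) -> int:
    """Step through every frame in order; return the number answered correctly."""
    score = 0
    for (prompt, answer), response in zip(FRAMES, responses):
        print(prompt)
        if check(answer, response):
            score += 1
    return score

run_program(["reinforcer", "box"])  # first frame reinforced, second corrected
```

Each student moves through the same frames, but at their own pace — which is the limited sense in which the device “adjusted” to the individual.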

The method and means of learning, as you’ll see in the demonstration films above, were not individualized. “There was very, very little freedom in Skinner’s vision,” says Watters. “Indeed Skinner wrote a very well-known book, Beyond Freedom and Dignity in the early 1970s, in which he said freedom doesn’t exist.” While Skinner’s machine didn’t itself become widely used, his ideas about education, and education technology, are still very much with us. We see Skinner’s machine “taking new forms with adaptive teaching and e-learning,” writes Solanki.

And we see the darker side of his design in classroom technology, says Watters, in an industry that profits from alienating, one-size-fits-all ed-tech solutions. But she also sees “students who are resisting and communities who are building practices that serve their needs rather than serving the needs of engineers.” Skinner’s theories of conditioning were and are incredibly persuasive, but his reductive views of human nature seem to leave out more than they explain. Learn more about the history of teaching machines in Watters’ new book, Teaching Machines: The History of Personalized Learning.

Related Content: 

The Little Albert Experiment: The Perverse 1920 Study That Made a Baby Afraid of Santa Claus & Bunnies

Hermann Rorschach’s Original Rorschach Test: What Do You See? (1921)

A Brief Animated Introduction to Noam Chomsky’s Linguistic Theory, Narrated by The X-Files‘ Gillian Anderson

Open Culture was founded by Dan Colman.