Charlie Chaplin started appearing in his first films in 1914—40 films, to be precise—and, by 1915, the United States had a major case of “Chaplinitis.” Chaplin mustaches were suddenly popping up everywhere, as were Chaplin imitators and Chaplin look-alike contests. A young Bob Hope apparently won one such contest in Cleveland. Chaplin Fever continued burning hot through 1921, the year when the Chaplin look-alike contest, shown above, was held outside the Liberty Theatre in Bellingham, Washington.
According to legend, somewhere between 1915 and 1921, Chaplin decided to enter a Chaplin look-alike contest, and lost, badly.
A short article called “How Charlie Chaplin Failed,” appearing in The Straits Times of Singapore in August of 1920, read like this:
Lord Desborough, presiding at a dinner of the Anglo-Saxon club told a story which will have an enduring life. It comes from Miss Mary Pickford who told it to Lady Desborough, “Charlie Chaplin was one day at a fair in the United States, where a principal attraction was a competition as to who could best imitate the Charlie Chaplin walk. The real Charlie Chaplin thought there might be a chance for him so he entered for the performance, minus his celebrated moustache and his boots. He was a frightful failure and came in twentieth.”
A variation on the same story appeared in a New Zealand newspaper, the Poverty Bay Herald, again in 1920, as did another in the Australian newspaper the Albany Advertiser in March 1921:
A competition in Charlie Chaplin impersonations was held in California recently. There was something like 40 competitors, and Charlie Chaplin, as a joke, entered the contest under an assumed name. He impersonated his well known film self. But he did not win; he was 27th in the competition.
Did Chaplin come in 20th place? 27th place? Did he enter a contest at all? It’s fun to imagine that he did. But, a century later, many consider the story the stuff of urban legend. When one researcher asked the Association Chaplin to weigh in, they apparently had this to say: “This anecdote told by Lord Desborough, whoever he may have been, was quite widely reported in the British press at the time. There are no other references to such a competition in any other press clipping albums that I have seen so I can only assume that this is the source of that rumour, urban myth, whatever it is. However, it may be true.”
I’d like to believe it is.
Note: An earlier version of this post appeared on our site in early 2016.
We can all remember seeing images of medieval Europeans wearing pointy shoes, but most of us have paid scant attention to the shoes themselves. That may be for the best, since the more we dwell on one fact of life in the Middle Ages or another, the more we imagine how uncomfortable or even painful it must have been by our standards. Dentistry would be the most vivid example, but even that fashionable, vaguely elfin footwear inflicted suffering, especially at the height of its popularity — not least among flashy young men — in the fourteenth and fifteenth centuries.
Called poulaines, a name drawn from the French word for Poland in reference to the footwear’s supposedly Polish origin, these pointy shoes appeared around the time of Richard II’s marriage to Anne of Bohemia in 1382. “Both men and women wore them, although the aristocratic men’s shoes tended to have the longest toes, sometimes as long as five inches,” writes Ars Technica’s Jennifer Ouellette. “The toes were typically stuffed with moss, wool, or horsehair to help them hold their shape.” If you’ve ever watched the first Blackadder series, know that the shoes worn by Rowan Atkinson’s hapless plotting prince may be comic, but they’re not an exaggeration.
Regardless, he was a bit behind the times, given that the show was set in 1485, right when poulaines went out of fashion. But they’d already done their damage, as evidenced by a 2021 study linking their wearing to nasty foot disorders. “Bunions — or hallux valgus — are bulges that appear on the side of the foot as the big toe leans in towards the other toes and the first metatarsal bone points outwards,” writes the Guardian’s Nicola Davis. A team of University of Cambridge researchers found signs of them being more prevalent in the remains of individuals buried in the fourteenth and fifteenth centuries than those buried from the eleventh through the thirteenth centuries.
Yet bunions were hardly the evil against which the poulaine’s contemporary critics inveighed. After the Great Pestilence of 1348, says the London Museum, “clerics claimed the plague was sent by God to punish Londoners for their sins, especially sexual sins.” The shoes’ lascivious associations continued to draw ire: “In 1362, Pope Urban V passed an edict banning them, but it didn’t really stop anybody from wearing them.” Then came sumptuary laws, according to which “commoners were charged to wear shorter poulaines than barons and knights.” The power of the state may be as nothing against that of the fashion cycle, but had there been a law against the bluntly square-toed shoes in vogue when I was in high school, I can’t say I would’ve objected.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
There have been many theories of how human history works. Some, like German thinker G.W.F. Hegel, have thought of progress as inevitable. Others have embraced a more static view, full of “Great Men” and an immutable natural order. Then we have the counter-Enlightenment thinker Giambattista Vico. The 18th century Neapolitan philosopher took human irrationalism seriously, and wrote about our tendency to rely on myth and metaphor rather than reason or nature. Vico’s most “revolutionary move,” wrote Isaiah Berlin, “is to have denied the doctrine of a timeless natural law” that could be “known in principle to any man, at any time, anywhere.”
Vico’s theory of history included inevitable periods of decline (and heavily influenced the historical thinking of James Joyce and Friedrich Nietzsche). He describes his concept “most colorfully,” writes Alexander Bertland at the Internet Encyclopedia of Philosophy, “when he gives this axiom”:
Men first feel necessity, then look for utility, next attend to comfort, still later amuse themselves with pleasure, thence grow dissolute in luxury, and finally go mad and waste their substance.
The description may remind us of Shakespeare’s “Seven Ages of Man.” But for Vico, Bertland notes, every decline heralds a new beginning. History is “presented clearly as a circular motion in which nations rise and fall… over and over again.”
Two hundred and fifty-one years after Vico’s death in 1744, Carl Sagan—another thinker who took human irrationalism seriously—published his book The Demon-Haunted World, showing how much our everyday thinking derives from metaphor, mythology, and superstition. He also foresaw a future in which his nation, the U.S., would fall into a period of terrible decline:
I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness…
Sagan believed in progress and, unlike Vico, thought that “timeless natural law” is discoverable with the tools of science. And yet, he feared “the candle in the dark” of science would be snuffed out by “the dumbing down of America…”
…most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance…
Sagan died in 1996, a year after he wrote these words. No doubt he would have seen the fine art of distracting and misinforming people through social media as a late, perhaps terminal, sign of the demise of scientific thinking. His passionate advocacy for science education stemmed from his conviction that we must and can reverse the downward trend.
As he says in the poetic excerpt from Cosmos above, “I believe our future depends powerfully on how well we understand this cosmos in which we float like a mote of dust in the morning sky.”
When Sagan refers to “our” understanding of science, he does not mean, as he says above, a “very few” technocrats, academics, and research scientists. Sagan invested so much effort in popular books and television because he believed that all of us needed to use the tools of science: “a way of thinking,” not just “a body of knowledge.” Without scientific thinking, we cannot grasp the most important issues we all jointly face.
We’ve arranged a civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.
Sagan’s 1995 predictions are now being heralded as prophetic. As Director of Public Radio International’s Science Friday, Charles Bergquist tweeted, “Carl Sagan had either a time machine or a crystal ball.” Matt Novak cautions against falling back into superstitious thinking in our praise of Demon Haunted World. After all, he says, “the ‘accuracy’ of predictions is often a Rorschach test” and “some of Sagan’s concerns” in other parts of the book “sound rather quaint.”
Of course Sagan couldn’t predict the future, but he did have a very informed, rigorous understanding of the issues of thirty years ago, and his prediction extrapolates from trends that have only continued to deepen. If the tools of science education—like most of the country’s wealth—end up the sole property of an elite, the rest of us will fall back into a state of gross ignorance, “superstition and darkness.” Whether we might come back around again to progress, as Giambattista Vico thought, is a matter of sheer conjecture. But perhaps there’s still time to reverse the trend before the worst arrives. As Novak writes, “here’s hoping Sagan, one of the smartest people of the 20th century, was wrong.”
Note: An earlier version of this post appeared on our site in 2017.
One would count neither Elon Musk nor Neil deGrasse Tyson among the most reserved public figures of the twenty-first century. Given the efforts Musk has been making to push into the business of outer space, which has long been Tyson’s intellectual domain, it’s only natural that the two would come into conflict. Not long ago, the media eagerly latched on to signs of a “feud” that seemed to erupt between them over Tyson’s remark that Musk — or rather, his company SpaceX — “hasn’t done anything that NASA hasn’t already done. The actual space frontier is still held by NASA.”
What this means is that SpaceX has yet to take humanity anywhere in outer space we haven’t been before. That’s not a condemnation, but in fact a description of business as usual. “The history of really expensive things ever happening in civilization has, in essentially every case, been led, geopolitically, by nations,” Tyson says in the StarTalk video above. “Nations lead expensive projects, and when the costs of these projects are understood, the risks are quantified, and the time frames are established, then private enterprise comes in later, to see if they can make a buck off of it.”
To go, boldly or otherwise, “where no one has gone before often involves risk that a company that has investors will not take, unless there’s a very clear return on investment. Governments don’t need a financial return on investment if they can get a geopolitical return on investment.” Though private enterprise may be doing more or less what NASA has been doing for 60 years, Tyson hastens to add, private enterprise does do it cheaper. In that sense, “SpaceX has been advancing the engineering frontier of space exploration,” not least by its development of reusable rockets. Still, that’s not exactly the Final Frontier.
Musk has made no secret of his aspirations to get to Mars, but Tyson doesn’t see that eventuality as being led by SpaceX per se. “The United States decides, ‘We need to send astronauts to Mars,’ ” he imagines. “Then NASA looks around and says, ‘We don’t have a rocket to do that.’ And then Elon says ‘I have a rocket!’ and rolls out his rocket to Mars. Then we ride in the SpaceX rocket to Mars.” That scenario will look even more possible if the unmanned Mars missions SpaceX has announced go according to plan. Whatever their differences, Tyson and Musk — and every true space enthusiast — surely agree that it doesn’t matter where the money comes from, just as long as we get out there one day soon.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Not everyone on August 1, 1981 had a VCR at their disposal, and not everybody stayed up until midnight. But fortunately at least one person did both, staying up to tape the first two hours of a new cable channel called MTV: Music Television. Did they know it would be historic? MTV certainly hoped it would be: the network equated the premiere of this 24/7 video version of radio with the moon landing. People born long after this time might wonder why an MTV Video Music Award statuette honors Buzz Aldrin. But at the time, it made sense. “Ladies and Gentlemen, Rock and Roll.” It was a statement: less than three decades after the first rock and roll single, this genre of music had won; it had colonized the planet. And beyond the planet, the next stop: the universe.
It’s fitting the execs chose as their first selection The Buggles’ “Video Killed the Radio Star.” Visuals were not just going to be an adjunct to the music; they were going to become inextricably linked to it. Either MTV was prescient about the visual decade to come, or it in fact caused that decade to happen. Music videos and short films had been around since the invention of sound in the cinema, but MTV was *all* videos, *all the time*, brought to Americans thanks to the deregulation of the television industry in 1972 and the slow growth of cable channels.
After a Pat Benatar video, the VJs introduce themselves—Mark Goodman, Nina Blackwood, J.J. Jackson, Alan Hunter, and Martha Quinn (all soon to be household names and crushes)—and then it’s straight into a block of commercials: school binders, Superman II, and Dolby Noise Reduction. A strange group of advertisers, to be sure. Goodman returns to ask, blindly, “Aren’t those guys the best?” He has no idea what has preceded him.
Yes, the first day of MTV was pretty rough. In fact, it’s a bit like a DJ who turns up to a gig to find they’ve left most of their records across town. In the first two hours we get two Rod Stewart songs, two by the Pretenders, two by Split Enz, another Pat Benatar video, two from Styx, and two from the Concerts for the People of Kampuchea concert film. We also get completely obscure videos: Ph.D.’s “Little Suzi’s on the Up,” Robin Lane and the Chartbusters’ “When Things Go Wrong,” and Michael Johnson’s “Bluer Than Blue.” This is D‑list stuff. No wonder MTV premiered at midnight.
From these humble beginnings the channel would soon find its groove, and two years later it would become ubiquitous in American households.
People predicted the end of MTV right from the beginning. It would be a fad, or it would run out of videos to play. Forty years later, the channel has rebranded itself into oblivion. And while music videos still get made, none have the effect that those first two decades had on generations of viewers. To paraphrase the Buggles, we have seen the playback and it seems so long ago.
Note: An earlier version of this post appeared on our site in 2021.
It’s practically guaranteed that we now have more stupid people on the planet than ever before. Of course, we might be tempted to think: just look at how many of them disagree with my politics. But this unprecedented stupidity is primarily, if not entirely, a function of an unprecedentedly large global population. The more important matter has less to do with the quantity of stupidity than with its quality: of all the forms it can take, which does the most damage? Robert Greene, author of The 48 Laws of Power and The Laws of Human Nature, addresses that question in the clip above from an interview with podcaster Chris Williamson.
“What makes people stupid,” Greene explains, “is their certainty that they have all the answers.” The basic idea may sound familiar, since we’ve previously featured here on Open Culture the related phenomenon of the Dunning-Kruger effect. In some sense, stupid people who know they’re stupid aren’t actually stupid, or at least not harmfully so.
True to form, Greene makes a classical reference: Athens’ leaders went into the Peloponnesian War certain of victory, yet the war actually brought about the end of the Athenian golden age. “People who are certain of things are very stupid,” he says, “and when they have power, they’re very, very dangerous,” perhaps more so than those we would call evil.
This brings to mind the oft-quoted principle known as Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.” But even in otherwise intelligent individuals, a tendency toward premature certainty can induce that stupidity. Better, in Greene’s view, to cultivate what John Keats, inspired by Shakespeare, called “negative capability”: the power to “hold two thoughts in your head at the same time, two thoughts that apparently contradict each other.” We might consider, for instance, entertaining the ideas of our aforementioned political enemies — not fully accepting them, mind you, but also not fully accepting our own. It may, at least, prevent the onset of stupidity, a condition that’s clearly difficult to cure.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Note: Yesterday, Marianne Faithfull passed away at age 78. In her memory, we’re bringing back a favorite from deep in our archive. It originally appeared on our site in June 2012.
That film came out in 1966, two years before the immortal Airplane show but well into Godard’s first major burst of daring creativity, which began with 1959’s Breathless and lasted at least until Sympathy for the Devil, his 1968 documentary on — or, anyway, including — the Rolling Stones. Brody pointed specifically to the clip above, a brief scene where Marianne Faithfull sings “As Tears Go By,” a hit, in separate recordings, for both Faithfull and the Stones.
Brody notes how these two minutes of a cappella performance from the 19-year-old Faithfull depict the “styles of the day.” For a long time since that day, alas, we American filmgoers hadn’t had a chance to fully experience Made in U.S.A. Godard based its script on Donald E. Westlake’s novel The Jugger but never bothered to secure adaptation rights, and the film drifted in legal limbo until 2009. But today, with that red tape cut, crisp new prints circulate freely around the United States. Keep an eye on your local revival house’s listings so you won’t miss your chance to witness Faithfull’s café performance, and other such Godardian moments, in their theatrical glory. The cinephilically intrepid Brody, of course, found a way to see it, after a fashion, nearly thirty years before its legitimate American release: “The Mudd Club (the White Street night spot and music venue) got hold of a 16-mm. print and showed it — with the projector in the room — to a crowd of heavy smokers. It was like watching a movie outdoors in London by night, or as if through the shrouding mists of time.”
Offered on the Coursera platform, the Digital Marketing & E‑Commerce Professional Certificate consists of seven courses, all collectively designed to help students “develop digital marketing and e‑commerce strategies; attract and engage customers through digital marketing channels like search and email; measure marketing analytics and share insights; build e‑commerce stores, analyze e‑commerce performance, and build customer loyalty.” The courses include:
In total, this program “includes over 190 hours of instruction and practice-based assessments, which simulate real-world digital marketing and e‑commerce scenarios that are critical for success in the workplace.” Along the way, students will learn how to use tools and platforms like Canva, Constant Contact, Google Ads, Google Analytics, Hootsuite, HubSpot, Mailchimp, Shopify, and Twitter. The courses also focus on some timely AI topics, like how to kickstart marketing strategy ideas with AI, or how to use AI to help you understand your audience.
You can start a 7‑day free trial and explore the courses. If you continue beyond that, Google/Coursera will charge $49 USD per month. That translates to about $300 after 6 months.
Note: Open Culture has a partnership with Coursera. If readers enroll in certain Coursera courses and programs, it helps support Open Culture.
A 600-year-old manuscript—written in a script no one has ever decoded, filled with cryptic illustrations, its origins remaining to this day a mystery…. It’s not as satisfying a plot as, say, that of a National Treasure film or a Dan Brown thriller, and certainly not as action-packed as pick-your-Indiana-Jones…. The Voynich Manuscript, named for the antiquarian who rediscovered it in 1912, has a much more hermetic nature, somewhat like the work of Henry Darger; it presents us with an inscrutably alien world, pieced together from hybridized motifs drawn from its contemporary surroundings.
The Voynich Manuscript is unique for having made up its own alphabet while also seeming to be in conversation with other familiar works of the period, such that it resembles an uncanny doppelganger of many a medieval text.
A comparatively long book at 234 pages, it roughly divides into seven sections, any of which might be found on the shelves of your average 1400s European reader—a fairly small and rarefied group. “Over time, Voynich enthusiasts have given each section a conventional name” for its dominant imagery: “botanical, astronomical, cosmological, zodiac, biological, pharmaceutical, and recipes.”
Scholars can only speculate about these categories. The manuscript’s origins and intent have baffled cryptologists since at least the 17th century, when, notes Vox, “an alchemist described it as ‘a certain riddle of the Sphinx.’” We can presume, “judging by its illustrations,” writes Reed Johnson at The New Yorker, that Voynich is “a compendium of knowledge related to the natural world.” But its “illustrations range from the fanciful (legions of heavy-headed flowers that bear no relation to any earthly variety) to the bizarre (naked and possibly pregnant women, frolicking in what look like amusement-park waterslides from the fifteenth century).”
The manuscript’s “botanical drawings are no less strange: the plants appear to be chimerical, combining incompatible parts from different species, even different kingdoms.” These drawings led scholar Nicholas Gibbs to compare it to the Trotula, a medieval compilation that “specializes in the diseases and complaints of women,” as he wrote in a Times Literary Supplement article. It turns out, according to several medieval manuscript experts who have studied the Voynich, that Gibbs’ proposed decoding may not actually solve the puzzle.
The degree of doubt should be enough to keep us in suspense, and therein lies the Voynich Manuscript’s enduring appeal—it is a black box, about which we might always ask, as Sarah Zhang does, “What could be so scandalous, so dangerous, or so important to be written in such an uncrackable cipher?” Wilfrid Voynich himself asked the same question in 1912, believing the manuscript to be “a work of exceptional importance… the text must be unraveled and the history of the manuscript must be traced.” Though “not an especially glamorous physical object,” Zhang observes, it has nonetheless taken on the aura of a powerful occult charm.
But maybe it’s complete gibberish, a high-concept practical joke concocted by 15th-century scribes to troll us in the future, knowing we’d fill in the space of not-knowing with the most fantastically strange speculations. This is a proposition Stephen Bax, another contender for a Voynich solution, finds hardly credible. “Why on earth would anyone waste their time creating a hoax of this kind?” he asks. Maybe it’s a relic from an insular community of magicians who left no other trace of themselves. Surely in the last 300 years every possible theory has been suggested, discarded, then picked up again.
Should you care to take a crack at sleuthing out the Voynich mystery—or just to browse through it for curiosity’s sake—you can find the manuscript scanned at Yale’s Beinecke Rare Book & Manuscript Library, which houses the vellum original. Or flip through the Internet Archive’s digital version above. Another privately run site contains a history and description of the manuscript, annotations on the illustrations and the script, and several possible transcriptions of its symbols proposed by scholars. Good luck!
Note: An earlier version of this post appeared on our site in 2017.
Several generations of American students have now had the experience of being told by an English teacher that they’d been reading Robert Frost all wrong, even if they’d never read him at all. Most, at least, had seen his lines “Two roads diverged in a wood, and I— / I took the one less traveled by, / And that has made all the difference” — or in any case, they’d heard them quoted with intent to inspire. “ ‘The Road Not Taken’ has nothing to do with inspiration and stick-to-it-iveness,” writes The Hedgehog Review’s Ed Simon in a reflection on Frost’s 150th birthday. Rather, “it’s a melancholic exhalation at the futility of choice, a dirge about enduring in the face of meaninglessness.”
Similarly misinterpreted is another of Frost’s best-known poems, “Stopping by Woods on a Snowy Evening,” whose wagon-driving narrator declares that “the woods are lovely, dark and deep, / But I have promises to keep / And miles to go before I sleep, / And miles to go before I sleep.” You can hear the whole thing read aloud by Frost himself in the new video above from Evan Puschak, better known as the Nerdwriter. “What draws me in is the crystalline clarity of the imagery,” says Puschak. “You instantly picture this quiet, wintry evening scene that Frost conjures,” one that feels as if it belongs in “a liminal space” where “time and nature are not divided and structured in human ways.”
Frost evokes this feeling “precisely by structuring time and space in a human way” — that is, using the structures of poetry. Puschak breaks down the relevant techniques, like the poem’s rhythm, meter, and rhyme scheme (rhyming being a quality of Frost’s work that once got him labeled, as Simon puts it, “a jingle man out of step with the prosodic conventions of the twentieth century”). But “the seeming simplicity of the imagery, phrasing, and structure of this poem conceal a lot of subtlety,” and the more you look at it, “the more you see the real world intruding on the narrator’s meditative moment.”
“It’s hard not to read ‘Stopping by Woods on a Snowy Evening’ as concerning self-annihilation (albeit self-annihilation avoided),” writes Simon. After all, why place that “But” after “the observation of the dark, lovely finality of the woods, of that frozen lake so amenable to drowning oneself, if only then to reaffirm that here are promises to keep, miles to go before he sleeps, responsibilities and duties that must be fulfilled before death can be entertained?” This is hardly the kind of subject you’d expect from “the Norman Rockwell of verse,” as Frost’s sheer accessibility led many to perceive him. But as with poetry of any culture or era, sufficiently close reading is what really makes all the difference.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
In 590 AD, Pope Gregory I unveiled a list of the Seven Deadly Sins – lust, gluttony, greed, sloth, wrath, envy and pride – as a way to keep the flock from straying into the thorny fields of ungodliness. These days, though, for all but the most devout, Pope Gregory’s list seems less like a means to moral behavior than a description of cable TV programming.
So instead, let’s look to one of the saints of the 20th century: Mahatma Gandhi. On October 22, 1925, Gandhi published a list he called the Seven Social Sins in his weekly newspaper Young India.
Politics without principles.
Wealth without work.
Pleasure without conscience.
Knowledge without character.
Commerce without morality.
Science without humanity.
Worship without sacrifice.
The list sprang from a correspondence that Gandhi had with someone identified only as a “fair friend.” He published the list without commentary, save for the following line: “Naturally, the friend does not want the readers to know these things merely through the intellect but to know them through the heart so as to avoid them.”
Unlike the Catholic Church’s list, Gandhi’s is expressly focused on the conduct of the individual in society. Gandhi preached non-violence and interdependence, and every single one of these sins is an example of selfishness winning out over the common good.
It’s also a list that, if fully absorbed, will make the folks over at the US Chamber of Commerce and the Ayn Rand Institute itch. After all, “Wealth without work” is a pretty accurate description of America’s 1%. (Investments ain’t work. Ask Thomas Piketty.) “Commerce without morality” sounds a lot like every single oil company out there, and “Knowledge without character” describes half the hacks on cable news. “Politics without principles” describes the other half.
In 1947, Gandhi gave his fifth grandson, Arun Gandhi, a slip of paper with this same list on it, saying that it contained “the seven blunders that human society commits, and that cause all the violence.” The next day, Arun returned to his home in South Africa. Three months later, Gandhi was shot to death by a Hindu extremist.
Note: An earlier version of this post appeared on our site in 2014.