H.G. Wells’ War of the Worlds has terrified and fascinated readers and writers in the decades since its 1898 publication and has inspired numerous adaptations. The most notorious use of Wells’ book was by Orson Welles, whom the author called “my little namesake,” and whose 1938 War of the Worlds Halloween radio play caused public alarm (though not actually a national panic). After the broadcast, reports Phil Klass, the actor remarked, “I’m extremely surprised to learn that a story, which has become familiar to children through the medium of comic strips and many succeeding novels and adventure stories, should have had such an immediate and profound effect upon radio listeners.”
Surely Welles knew that this was precisely why the broadcast had the effect it did, especially in such an anxious pre-war climate. The 1898 novel also startled its first readers with its verisimilitude, playing on a late Victorian sense of apocalyptic doom as the turn of the century approached.
But what contemporary circumstances eight years later, we might wonder, fueled the imagination of Henrique Alvim Corrêa, whose 1906 illustrations of the novel you can see here? Wells himself approved of these incredible drawings, praising them before their publication and saying, “Alvim Corrêa did more for my work with his brush than I with my pen.”
Indeed they capture the novel’s uncanny dread. Martian tripods loom, ghastly and cartoonish, above blasted realist landscapes and scenes of panic. In one illustration, a grotesque, tentacled Martian ravishes a nude woman. In a surrealist drawing of an abandoned London above, eyes protrude from the buildings, and a skeletal head appears above them. The alien technology often appears clumsy and unsophisticated, which contributes to the generally terrifying absurdity that emanates from these finely rendered plates.
Alvim Corrêa was a Brazilian artist living in Brussels and struggling for recognition in the European art world. His break seemed to come when the War of the Worlds illustrations were printed in a large-format, limited French edition of the book, with each of the 500 copies signed by the artist himself.
Unfortunately, tuberculosis killed Corrêa four years later. His War of the Worlds drawings did not bring him widespread fame in his lifetime or after, but his work has since been cherished by a devoted cult following. The original prints you see here remained with the artist’s family until a sale of 31 of them in 1990. You can see many more, as well as scans from the book and a poster announcing the publication, at The Public Domain Review and the Monster Brains site.
Note: An earlier version of this post appeared on our site in 2015.
People understand evolution in all sorts of different ways. We’ve all heard a variety of folk explanations of that all-important phenomenon, from “survival of the fittest” to “humans come from monkeys,” that run the spectrum from broadly correct to badly mangled. One less often heard but more elegant way to put it is that all species, living or extinct, share a common ancestor. This is true of evolution as Darwin knew it, and it could well be true of other forms of “evolution” outside the biological realm as well. Take languages, which we know full well have changed and split into different varieties over time: do they, too, all share a single ancestor?
In the RobWords video above, language YouTuber Rob Watts starts with his native English and traces its roots back as far as possible. He ascends the family tree past Low West German, past Proto-Germanic — “a language that was theoretically spoken by a single group of people who would eventually go on to become the Swedes, the Germans, the Dutch, the English, and more” — back to an ancestor of not just English and the Germanic languages, but almost all the European languages, as well as of Asian languages like Hindi, Pashto, Kurdish, Farsi, and Bengali. Its name? Proto-Indo-European.
Watts quotes the eighteenth-century philologist Sir William Jones, who wrote that the ancient Asian language of Sanskrit has a structure “more perfect than the Greek, more copious than the Latin, and more exquisitely refined than either, yet bearing to both of them a stronger affinity, both in the roots of verbs and in the forms of grammar, than could possibly have been produced by accident.” As with such conspicuously shared traits observed in disparate species of plant or animal, no expert “could examine all three without believing them to have sprung from some common source, which, perhaps, no longer exists.”
The evidence is everywhere, if you pay attention to the sort of unexpected cognates and very-nearly-cognates Watts points out across languages separated widely in space and time. Take the English hundred, the Latin centum, the Ancient Greek hekaton, the Russian sto, and the Sanskrit shatam; or the more deeply buried resemblances of the English heart, the Latin cordis, the Russian serdce, and the Sanskrit hrd. In some cases, linguists have actually used these commonalities to reverse-engineer Proto-Indo-European words, though always with the caveat that the whole thing “is a reconstructed language; it’s our best guess of what a common ancestral language could have been like.” Was there a still older language from which the non-Proto-Indo-European-descended languages also descended? That’s a question to push the linguistic imagination to its very limits.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Neil deGrasse Tyson has spent his career talking up not just science itself, but also its practitioners. If asked to name the greatest scientist of all time, one might expect him to need a minute to think about it — or even to find himself unable to choose. But that’s hardly Tyson’s style, as evidenced by the clip above from his 92nd Street Y conversation with Fareed Zakaria. “Who do you think is the most extraordinary scientific mind that humanity has produced?” Zakaria asks. “There’s no contest,” Tyson immediately responds. “Isaac Newton.”
Those familiar with Tyson will know he would be prepared for the follow-up. By way of explanation, he narrates certain events of Newton’s life: “He, working alone, discovers the laws of motion. Then he discovers the law of gravity.” Faced with the question of why planets orbit in ellipses rather than perfect circles, he first invents integral and differential calculus in order to determine the answer. Then he discovers the laws of optics. “Then he turns 26.” At this point in the story, young listeners who aspire to scientific careers of their own will be nervously recalculating their own intellectual and professional trajectories.
They must remember that Newton was a man of his place and time, specifically the England of the late seventeenth and early eighteenth centuries. And even there, he was an outlier the likes of which history has hardly known, whose eccentric tendencies also inspired him to come up with powdered toad-vomit lozenges and predict the date of the apocalypse (not that he’s yet been proven wrong on that score). But in our time as in his, future (or current) scientists would do well to internalize Newton’s spirit of inquiry, which got him presciently wondering whether, for instance, “the stars of the night sky are just like our sun, but just much, much farther away.”
“Great scientists are not marked by their answers, but by how great their questions are.” To find such questions, one needs not just curiosity, but also humility before the expanse of one’s own ignorance. “I do not know what I may appear to the world,” Newton once wrote, “but to myself I seem to have been only like a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.” Nearly three centuries after his death, that ocean remains forbiddingly but promisingly vast — at least to those who know how to look at it.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
And now for a good use of AI. The UK-based telecom company O2 has developed a chatbot, named Daisy, that performs a noble task. Impersonating an elderly grandmother, the chatbot engages with internet fraudsters and then systematically frustrates them and wastes their time. As part of a demo, notes The Guardian, Daisy kept a series of fraudsters on the line for up to 40 minutes each, “when they could otherwise have been scamming real people.” The AI system was trained on real scam calls, according to Virgin Media O2’s marketing director, Simon Valcarcel, so it “knows exactly the tactics to look out for, exactly the type of information to give to keep the scammers online and waste time.” If you have three minutes to spare, you can listen to Daisy clown a scam artist above.
It’s Friday, which means that tonight, many of us will sit down to watch a movie with our family, our friends, our significant other, or — for some cinephiles, best of all — by ourselves. If you haven’t yet lined up any home-cinematic experience in particular, consider taking a look at this playlist of 31 feature films just made available to stream by Warner Bros. You’ll know the name of that august Hollywood studio, of course, but did you know that it put out True Stories, the musical plunge into tabloid America directed by Talking Heads’ David Byrne? Or Waiting for Guffman, the first improvised movie by Christopher Guest and his troupe of crack comedic players like Eugene Levy, Fred Willard, Catherine O’Hara, and Parker Posey?
That may already strike many Open Culture readers as the makings of a fine double feature, though some may prefer to watch the early work of another kind of auteur: Michel Gondry’s The Science of Sleep, say, or Richard Linklater’s SubUrbia (a stage-play adaptation that could well be paired with Sidney Lumet’s Deathtrap).
But if you’re just looking to have some fun, there’s no reason you couldn’t fire up the likes of Mr. Nice Guy, Jackie Chan’s first English-language picture. Should that prove too refined, Warner Bros. has also generously made available American Ninja V — a non-canonical entry in that series, we should note, starring not original American Ninja Michael Dudikoff, but direct-to-video martial-arts icon David Bradley. On Friday night, after all, any viewing goes.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Music is often described as the most abstract of all the arts, and arguably the least visual as well. But these qualities, which seem so basic to the nature of the form, have been challenged for at least three centuries, not least by composers themselves. Take Antonio Vivaldi, whose Le quattro stagioni, or The Four Seasons, of 1718–1720 evokes not just broad impressions of the eponymous parts of the year, but a variety of natural and human elements characteristic of them. In the course of less than an hour, its listeners — whether of the early eighteenth century or the early twenty-first — “see” spring, summer, autumn, and winter unfold vividly before their mind’s eye.
Now, composer Stephen Malinowski has visualized The Four Seasons in an entirely different way. As previously featured here on Open Culture, he uses his Music Animation Machine to create what we might call graphical scores, which abstractly represent the instrumental parts that make up widely loved classical compositions in time with the music itself.
On this page, you can watch four videos, with each one visualizing one of the piece’s concerti. Fans of the Music Animation Machine will notice that its formerly simple visuals have taken a big step forward, though what can look at first like a psychedelic light show also has a clear and legible order.
For “Spring” and “Autumn,” Malinowski animates performances by violinist Shunske Sato and musicians of the Netherlands Bach Society; for “Summer” and “Winter,” performances by Cynthia Miller Freivogel and early-music ensemble Voices of Music (previously featured here for their renditions of Bach’s Brandenburg Concertos and “Air on the G String,” Pachelbel’s Canon, and indeed The Four Seasons). Generally understandable at a glance — and in many ways, more illuminating than actually seeing the musicians play their instruments — these scores also use a system called “harmonic coloring,” which Malinowski explains here. This may add up to a complete audiovisual experience, but if you’d also like a literary element, why not pull up The Four Seasons’ accompanying sonnets while you’re at it?
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
As you’ve probably noticed if you’re a regular reader of this site, we’re big fans of book illustration, particularly that from the form’s golden age—the late 18th and 19th centuries—before photography took over as the dominant visual medium. But while photographs largely supplanted illustrations in textbooks, magazines, and newspapers over the course of the 20th century, works of fiction, which had been routinely published in lavishly illustrated editions, suddenly became the featureless banks of words we know today. Though image-heavy graphic novels and comic books have thrived in recent decades, the illustrated literary text is a rarity indeed.
Why did this change come about? “I really don’t know,” writes Christopher Howse at The Telegraph, but he points out that the era of illustrated fiction for grown-ups ended “after the death of the big Victorian novelists,” like Dickens and Trollope. Before adult picture-books went out of style, several now-famous artists made careers as book illustrators. When we think of the big names from the period, we think of Aubrey Beardsley and Gustave Doré, both of whom we’ve covered heavily here. We tend not to think of Irish artist Harry Clarke—a relative latecomer—but we should. Of the many incredible illustrations from famous works of literature we’ve featured here, my favorite might be Clarke’s 1926 illustrations of Goethe’s Faust.
So out-there are some of his illustrations, so delightfully nightmarish and weird, one is tempted to fall back on that rather sophomoric explanation for art we find disturbing: maybe he was on drugs! Not that he’d need them to conjure up many of the images he did. His source material is bizarre enough (maybe Goethe was on drugs!). In any case, we can definitely call Clarke’s work hallucinatory, and that goes for his earlier, 1923 illustrations of Edgar Allan Poe’s Tales of Mystery and Imagination as well, of which you can see a few choice examples here.
Dublin-born Clarke worked as a stained-glass artist as well as an illustrator, and drew his inspiration from the earlier art nouveau aesthetic of Beardsley and others, adding his own rococo flourishes to the elongated forms and decorative patterns favored by those artists. His glowering figures—including one who looks quite a bit like Poe himself, at the top—suit the feverish intensity of Poe’s world to perfection. And like Poe, Clarke’s art generally thrived in a seductively dark underworld filled with ghouls and fiends. Both of these proto-goths died young, Poe under mysterious circumstances at age 40, Clarke of tuberculosis at 42.
We made sand think: this phrase is used from time to time to evoke the particular technological wonders of our age, especially now that artificial intelligence seems once again to be within the realm of possibility. While there would be no Silicon Valley without silica sand, semiconductors are hardly the first marvel humanity has forged out of that kind of material. Consider the three millennia of history behind the traditional Japanese sword, long known even outside the Japanese language as the katana (literally “one-sided blade”) — or, more to the point of the Veritasium video above, the 1,200 years in which such weapons have been made out of steel.
In explaining the science of the katana, Veritasium host Derek Muller begins more than two and a half billion years ago, when Earth’s oceans were “rich with dissolved iron.” But then, cyanobacteria started photosynthesizing, releasing oxygen as a by-product; that oxygen bonded with the dissolved iron and dropped layers of iron oxide onto the sea floor, which eventually hardened into layers of sedimentary rock.
With few such formations of its own, volcanic Japan came late to steel, importing it long before it could manage domestic production using the iron oxide that accumulated in its rivers, recovered as “iron sand.”
By that time, iron swords would no longer cut it, as it were, but the addition of charcoal in the heating process could produce the “incredibly strong alloy” of steel. Certain Japanese swordsmiths have continued to use steel made with the more or less traditional smelting process you can see performed in rural Shimane prefecture in the video. To the mild disappointment of its producer, Petr Lebedev, who participates in the whole process, the foot-operated bellows of yore have been electrified; still, he hardly seems to mind when he gets the chance to take up a katana himself. He may have yet to attain the skill of a master swordsman, but understanding every scientific detail of the weapon he wields must make slicing bamboo clean in half that much more satisfying.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Charlie Chaplin appeared in his first films in 1914—40 films, to be precise—and, by 1915, the United States had a major case of “Chaplinitis.” Chaplin mustaches were suddenly popping up everywhere, as were Chaplin imitators and Chaplin look-alike contests. A young Bob Hope apparently won one such contest in Cleveland. Chaplin fever continued burning hot through 1921, the year when the Chaplin look-alike contest shown above was held outside the Liberty Theatre in Bellingham, Washington.
According to legend, somewhere between 1915 and 1921, Chaplin decided to enter a Chaplin look-alike contest, and lost, badly.
A short article called “How Charlie Chaplin Failed,” appearing in The Straits Times of Singapore in August of 1920, read like this:
Lord Desborough, presiding at a dinner of the Anglo-Saxon club told a story which will have an enduring life. It comes from Miss Mary Pickford who told it to Lady Desborough, “Charlie Chaplin was one day at a fair in the United States, where a principal attraction was a competition as to who could best imitate the Charlie Chaplin walk. The real Charlie Chaplin thought there might be a chance for him so he entered for the performance, minus his celebrated moustache and his boots. He was a frightful failure and came in twentieth.”
A variation on the same story appeared in a New Zealand newspaper, the Poverty Bay Herald, again in 1920, as did another in the Australian newspaper, the Albany Advertiser, in March 1921.
A competition in Charlie Chaplin impersonations was held in California recently. There was something like 40 competitors, and Charlie Chaplin, as a joke, entered the contest under an assumed name. He impersonated his well known film self. But he did not win; he was 27th in the competition.
Did Chaplin come in 20th place? 27th place? Did he enter a contest at all? It’s fun to imagine that he did. But, a century later, many consider the story the stuff of urban legend. When one researcher asked the Association Chaplin to weigh in, they apparently had this to say: “This anecdote told by Lord Desborough, whoever he may have been, was quite widely reported in the British press at the time. There are no other references to such a competition in any other press clipping albums that I have seen so I can only assume that this is the source of that rumour, urban myth, whatever it is. However, it may be true.”
I’d like to believe it is.
Note: An earlier version of this post appeared on our site in early 2016.
We can all remember seeing images of medieval Europeans wearing pointy shoes, but most of us have paid scant attention to the shoes themselves. That may be for the best, since the more we dwell on one fact of life in the Middle Ages or another, the more we imagine how uncomfortable or even painful it must have been by our standards. Dentistry would be the most vivid example, but even that fashionable, vaguely elfin footwear inflicted suffering, especially at the height of its popularity — not least among flashy young men — in the fourteenth and fifteenth centuries.
Called poulaines, a name drawn from the French word for Poland in reference to the footwear’s supposedly Polish origin, these pointy shoes appeared around the time of Richard II’s marriage to Anne of Bohemia in 1382. “Both men and women wore them, although the aristocratic men’s shoes tended to have the longest toes, sometimes as long as five inches,” writes Ars Technica’s Jennifer Ouellette. “The toes were typically stuffed with moss, wool, or horsehair to help them hold their shape.” If you’ve ever watched the first Blackadder series, know that the shoes worn by Rowan Atkinson’s hapless plotting prince may be comic, but they’re not an exaggeration.
Regardless, he was a bit behind the times, given that the show was set in 1485, right when poulaines went out of fashion. But they’d already done their damage, as evidenced by a 2021 study linking their wearing to nasty foot disorders. “Bunions — or hallux valgus — are bulges that appear on the side of the foot as the big toe leans in towards the other toes and the first metatarsal bone points outwards,” writes the Guardian’s Nicola Davis. A team of University of Cambridge researchers found signs of them being more prevalent in the remains of individuals buried in the fourteenth and fifteenth centuries than those buried from the eleventh through the thirteenth centuries.
Yet bunions were hardly the evil against which the poulaine’s contemporary critics inveighed. After the Great Pestilence of 1348, says the London Museum, “clerics claimed the plague was sent by God to punish Londoners for their sins, especially sexual sins.” The shoes’ lascivious associations continued to draw ire: “In 1362, Pope Urban V passed an edict banning them, but it didn’t really stop anybody from wearing them.” Then came sumptuary laws, according to which “commoners were charged to wear shorter poulaines than barons and knights.” The power of the state may be as nothing against that of the fashion cycle, but had there been a law against the bluntly square-toed shoes in vogue when I was in high school, I can’t say I would’ve objected.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
There have been many theories of how human history works. Some, like German thinker G.W.F. Hegel, have thought of progress as inevitable. Others have embraced a more static view, full of “Great Men” and an immutable natural order. Then we have the counter-Enlightenment thinker Giambattista Vico. The 18th-century Neapolitan philosopher took human irrationalism seriously, and wrote about our tendency to rely on myth and metaphor rather than reason or nature. Vico’s most “revolutionary move,” wrote Isaiah Berlin, “is to have denied the doctrine of a timeless natural law” that could be “known in principle to any man, at any time, anywhere.”
Vico’s theory of history included inevitable periods of decline (and heavily influenced the historical thinking of James Joyce and Friedrich Nietzsche). He describes his concept “most colorfully,” writes Alexander Bertland at the Internet Encyclopedia of Philosophy, “when he gives this axiom”:
Men first feel necessity, then look for utility, next attend to comfort, still later amuse themselves with pleasure, thence grow dissolute in luxury, and finally go mad and waste their substance.
The description may remind us of Shakespeare’s “Seven Ages of Man.” But for Vico, Bertland notes, every decline heralds a new beginning. History is “presented clearly as a circular motion in which nations rise and fall… over and over again.”
Just over 250 years after Vico’s death in 1744, Carl Sagan—another thinker who took human irrationalism seriously—published his book The Demon-Haunted World, showing how much our everyday thinking derives from metaphor, mythology, and superstition. He also foresaw a future in which his nation, the U.S., would fall into a period of terrible decline:
I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness…
Sagan believed in progress and, unlike Vico, thought that “timeless natural law” is discoverable with the tools of science. And yet, he feared “the candle in the dark” of science would be snuffed out by “the dumbing down of America…”
…most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance…
Sagan died in 1996, a year after he wrote these words. No doubt he would have seen the fine art of distracting and misinforming people through social media as a late, perhaps terminal, sign of the demise of scientific thinking. His passionate advocacy for science education stemmed from his conviction that we must and can reverse the downward trend.
As he says in the poetic excerpt from Cosmos above, “I believe our future depends powerfully on how well we understand this cosmos in which we float like a mote of dust in the morning sky.”
When Sagan refers to “our” understanding of science, he does not mean, as he says above, a “very few” technocrats, academics, and research scientists. Sagan invested so much effort in popular books and television because he believed that all of us needed to use the tools of science: “a way of thinking,” not just “a body of knowledge.” Without scientific thinking, we cannot grasp the most important issues we all jointly face.
We’ve arranged a civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.
Sagan’s 1995 predictions are now being heralded as prophetic. As Director of Public Radio International’s Science Friday, Charles Bergquist tweeted, “Carl Sagan had either a time machine or a crystal ball.” Matt Novak cautions against falling back into superstitious thinking in our praise of The Demon-Haunted World. After all, he says, “the ‘accuracy’ of predictions is often a Rorschach test” and “some of Sagan’s concerns” in other parts of the book “sound rather quaint.”
Of course Sagan couldn’t predict the future, but he did have a very informed, rigorous understanding of the issues of thirty years ago, and his prediction extrapolates from trends that have only continued to deepen. If the tools of science education—like most of the country’s wealth—end up the sole property of an elite, the rest of us will fall back into a state of gross ignorance, “superstition and darkness.” Whether we might come back around again to progress, as Giambattista Vico thought, is a matter of sheer conjecture. But perhaps there’s still time to reverse the trend before the worst arrives. As Novak writes, “here’s hoping Sagan, one of the smartest people of the 20th century, was wrong.”
Note: An earlier version of this post appeared on our site in 2017.