Some Porter classics–“Every Time We Say Goodbye,” “So In Love”–demand sincerity. This one calls for a strong dose of the opposite, which Pop and Harry deliver, both vocally and in the barnstorming music video above. They’re dangerous, funny, and anything but canned, weaving through rat-glammy 1980s New York in thrift store finery, with side trips to a cemetery and a farm where Pop smooches a goat.
As Alex Cox, who brought further punk pedigree to the project as the director of Sid and Nancy and Repo Man, told Spin: “Iggy had always wanted to make a video with animals and Debbie had always wanted to publicly burn lingerie so I let them.”
They also filled Pop’s palms with stigmata and ants, and swapped Porter’s champagne for a case of generic dog food.
There are a few minor tweaks to the lyrics (“What cocks!”) and the stars inject the patter with a gleefully louche downtown sensibility. Mars rises behind the Twin Towers, for a swellegantly off-beat package that raised a lot of money for AIDS research and awareness. Other gems from the project:
“It’s All Right with Me” performed by Tom Waits, directed by Jim Jarmusch
“Night and Day” performed by U2, directed by Wim Wenders
“Don’t Fence Me In” performed and directed by David Byrne
Most young male fans from my generation failed to appreciate the gender imbalance in comic books. After all, what were the X‑Men without powerful X‑women Storm, Rogue, and, maybe the most powerful mutant of all, Jean Grey? Indie comics like Love and Rockets revolved around strong female characters, and if the legacy golden age Marvel and DC titles were nearly all about Great Men, well… just look at the time they came from. We shrugged it off, and also failed to appreciate how the hypersexualization of women in comics made many of the women around us uncomfortable and hyperannoyed.
Had we been curious enough to look, however, we would have found that golden age comics weren’t just innocent “products of their time”—they reflected a collective will, just as did the comics of our time. And the character who first challenged golden age attitudes about women—Wonder Woman, created in 1941—began her career as perhaps one of the kinkiest superheroes in mainstream comic books. What’s more, she was created by the psychologist William Moulton Marston, who first published under a pseudonym, due in part to his unconventional personal life. Marston, writes NPR, “had a wife—and a mistress. He fathered children with both of them, and they all secretly lived together in Rye, N.Y.”
The other woman in Marston’s polyamorous threesome, one of his former students, happened to be the niece of Margaret Sanger, and Marston just happened to be the creator of the lie detector. The details of his life seem as odd and prurient now as they would have to readers in the 1940s—partly an index of how little some things have changed. And now that Marston’s creation has finally received her blockbuster due, his story seems ripe for the Hollywood telling. It appears to have received just that in Professor Marston & the Wonder Women, the upcoming biopic by Angela Robinson. It’s unfair to judge a film by its trailer, but in the clips above we see much more of Marston’s dual romance than we do of the invention of his famous heroine.
Yet as political historian Jill Lepore tells it, the cultural history of Wonder Woman is as fascinating as her creator’s personal life, though it may be impossible to fully separate the two. A press release accompanying Wonder Woman’s debut explained that Marston aimed “to set up a standard among children and young people of strong, free, courageous womanhood; to combat the idea that women are inferior to men, and to inspire girls to self-confidence in athletics, occupations and professions monopolized by men.” It went on to express Marston’s view that “the only hope for civilization is the greater freedom, development and equality of women in all fields of human activity.”
The language sounds like that of many a modern-day NGO, not a World War II-era popular entertainment. But Marston would go further, saying, “Frankly, Wonder Woman is the psychological propaganda for the new type of woman who should, I believe, rule the world.” His interest in domineering women and S&M drove the early stories, which are full of bondage imagery. “There are a lot of people who get very upset at what Marston was doing…,” Lepore told Terry Gross on Fresh Air. “’Is this a feminist project that’s supposed to help girls decide to go to college and have careers, or is this just like soft porn?’” As Marston understood it, the latter question could be asked of most comics.
When writer Olive Richard—pen name of Marston’s mistress Olive Byrne—asked him in an interview for Family Circle whether some comics weren’t “full of torture, kidnapping, sadism, and other cruel business,” he replied, “Unfortunately, that is true.” But “the reader’s wish is to save the girl, not to see her suffer.” Marston created a “girl”—or rather a superhuman Amazonian princess—who saved herself and others. “One of the things that’s a defining element of Wonder Woman,” says Lepore, “is that if a man binds her in chains, she loses all of her Amazonian strength. So in almost every episode of the early comics, the ones that Marston wrote… she’s chained up or she’s roped up.” She has to break free, he would say, “in order to signify her emancipation from men.” She does her share of roping others up as well, with her lasso of truth and other means.
The seemingly clear bondage references in all those ropes and chains also had clear political significance, Lepore explains. During the fight for suffrage, women would chain themselves to government buildings. In parades, suffragists “would march in chains—they imported that iconography from the abolitionist campaigns of the 19th century that women had been involved in… Chains became a really important symbol,” as in the 1912 drawing below by Lou Rogers. Wonder Woman’s mythological origins also had deeper signification than the male fantasy of a powerful race of well-armed dominatrices. Her story, writes Lepore at The New Yorker, “comes straight out of feminist utopian fiction” and the fascination many feminists had with anthropologists’ speculation about an Amazonian matriarchy.
This combination of feminist symbols has made the character a redoubtable icon for every generation of activists—as in her appearance on the 1972 cover of Ms. magazine, further up, an issue headlined by Gloria Steinem and Simone de Beauvoir. Marston translated the feminist ideas of the suffrage movement, and of women like Margaret Sanger, Elizabeth Cady Stanton, his wife, lawyer Elizabeth Holloway Marston, and his mistress Olive Byrne, into a powerful, long-revered superhero. He also translated his own ideas of what Havelock Ellis called “the erotic rights of women.”
Marston’s version of Wonder Woman (he stopped writing the comic in 1947) had as much agency—sexual and otherwise—as any male character of the time. (See her breaking the bonds of “Prejudice,” “Prudery,” and “Man’s Superiority” in a drawing, below, from Marston’s 1943 article “Why 100,000,000 Americans Read Comics.”) The character was undoubtedly kinky, a quality that largely disappeared from later iterations. But she was not created, as were so many women in comics in the following decades, as an object of teenage lust, but as a radically liberated feminist hero. Read more about Marston in Lepore’s essays at Smithsonian and The New Yorker and in her book, The Secret History of Wonder Woman.
“The electronic media haven’t wiped out the book: it’s read, used, and wanted, perhaps more than ever. But the role of the book has changed. It’s no longer alone. It no longer has sole charge of our outlook, nor of our sensibilities.” As familiar as those words may sound, they don’t come from one of the think pieces on the changing media landscape now published each and every day. They come from the mouth of midcentury CBC television host John O’Leary, introducing an interview with Marshall McLuhan more than half a century ago.
McLuhan, one of the most idiosyncratic and wide-ranging thinkers of the twentieth century, would go on to become world famous (to the point of making a cameo in Woody Allen’s Annie Hall) as a prophetic media theorist. He saw more clearly than many how the introduction of mass media like radio and television had changed us, and spoke with more confidence than most about how the media to come would change us. He understood what he understood about these processes in no small part because he’d learned their history, going all the way back to the development of writing itself.
Writing, in McLuhan’s telling, changed the way we thought, which changed the way we organized our societies, which changed the way we perceived things, which changed the way we interacted. All of that holds truer for the printing press, and truer still for television. He told the story in his book The Gutenberg Galaxy, which he was working on at the time of this interview in May of 1960, and which would introduce the term “global village” to its readers and crystallize much of what he talked about in this broadcast. Electronic media, in his view, “have made our world into a single unit.”
With this “continually sounding tribal drum” in place, “everybody gets the message all the time: a princess gets married in England, and ‘boom, boom, boom’ go the drums. We all hear about it. An earthquake in North Africa, a Hollywood star gets drunk, away go the drums again.” The consequence? “We’re re-tribalizing. Involuntarily, we’re getting rid of individualism.” And “just as books and their private point of view are being replaced by the new media, so the concepts which underlie our actions, our social lives, are changing.” No longer concerned with “finding our own individual way,” we instead obsess over “what the group knows, feeling as it does, acting ‘with it,’ not apart from it.”
Though McLuhan died in 1980, long before the appearance of the modern internet, many of his readers have seen recent technological developments validate his notion of the global village — and his view of its perils as well as its benefits — more and more with time. At this point in history, mankind can seem less united than ever, possibly because technology now allows us to join any number of global “tribes.” But don’t we feel more pressure than ever to know just what those tribes know and feel just what they feel?
No wonder so many of those pieces that cross our news feeds today still reference McLuhan and his predictions. Just this past weekend, Quartz’s Lila MacLellan did so in arguing that our media, “while global in reach, has come to be essentially controlled by businesses that use data and cognitive science to keep us spellbound and loyal based on our own tastes, fueling the relentless rise of hyper-personalization” as “deep-learning powered services promise to become even better custom-content tailors, limiting what individuals and groups are exposed to even as the universe of products and sources of information expands.” Long live the individual, the individual is dead: step back, and it all looks like one of those contradictions McLuhan could have delivered as a resonant sound bite indeed.
Last month, a Spanish court ordered the exhumation of Salvador Dalí’s remains, to see whether–as a paternity case claims–he is the father of María Pilar Abel Martínez, a tarot card reader born in 1956. When experts opened his crypt on Thursday night, they encountered a pretty remarkable scene. According to Narcís Bardalet, the doctor who embalmed the artist’s body back in 1989, Dalí’s face was covered with “a silk handkerchief – a magnificent handkerchief.” “When it was removed, I was delighted to see his moustache was intact … I was quite moved. You could also see his hair.” “His moustache is still intact, [like clock hands at] 10 past 10, just as he liked it. It’s a miracle.” “The moustache is still there and will be for centuries.” That moustache is perhaps the last trace of Dalí’s schtick that will remain.
Ask any creator subject to frequent interviews which questions they dread, and one in particular will come up more than any other: “Where do you get your ideas?” Some have readily spoken and written on the subject — Isaac Asimov, Neil Gaiman, David Lynch — but most, even if they’ve had truly astonishing ideas, have given the subject of ideas in general little thought. The video above, named after the infamous question, compiles a variety of answers from a variety of people, younger and older, famous and less so, into a five-minute search for the source of human creativity.
“I get ideas in fragments,” says Lynch, whose voice we hear amid the many others in the video. “It’s as if, in the other room, there’s a puzzle and all the pieces are together. But in my room, they just flip one piece at a time into me.”
When a good idea comes along, says a twelve-year-old named Ursula, “that’s the feeling they call inspiration.” But Radiolab host Robert Krulwich has a dim view of inspiration: “I’m a little suspicious of the idea like, ‘In the beginning there was nothing and then there was light.’ I don’t think I’ve had that experience, and for other people who’ve said that they’ve had that experience, I’m not sure I believe them.”
“Inspiration is for amateurs,” says artist Chuck Close. “The rest of us just show up and get to work. Every great idea came out of work, everything.” Chalk up another point in favor of Thomas Edison’s famous breakdown of genius as one percent inspiration and 99 percent perspiration — but what kind of perspiration? As professional skateboarder Ray Barbee sees it, “most people start off by mimicking something, but then it turns into their own thing because they don’t really have the ability to mimic it precisely,” a process that produces “originality from copying.”
“Whenever I finish a story,” says New Yorker writer Susan Orlean, “I go through a period of time where I feel like I will never again have an idea.” But it never lasts as long as it feels: “One day you fall onto something, and it just looks you in the face and says, ‘I’m the one.’ ” That “one” could take the form, according to the video’s contributors, of a chance encounter, a sentence in a story, a yellow ball bouncing down the street, a solitary lawn chair seen from a train window, a dump truck, or many other even less expected entities besides. You just have to be primed and ready to connect it in an interesting manner to other things in your head, in your environment, and in the culture. “Luck is what happens when preparation meets opportunity,” goes a well-known quote often attributed to Seneca — and so, it seems, is creativity.
In the view of colorizer Olga Shirnina, color has the power to close the gap between the subjects of musty public domain photos and their modern viewers. The most fulfilling moment for the artist, aka Klimbim, comes when “suddenly the person looks back at you as if he’s alive.”
A before-and-after comparison of her digital makeover of Nadezhda Kolesnikova, one of many female Soviet snipers whose vintage likenesses she has colorized, bears this out. The color version could be a fashion spread in a current magazine, except there’s nothing artificial-seeming about this 1943 pose.
“The world was never monochrome even during the war,” Shirnina reflected in the Daily Mail.
Military subjects pose a particular challenge:
When I colorize uniforms I have to search for info about the colours or ask experts. So I’m not free in choosing colors. When I colorize a dress on a 1890s photo, I look at what colors were fashionable at that time. When I have no limitations I play with colours looking for the best combination. It’s really quite arbitrary but a couple of years ago I translated a book about colours and hope that something from it is left in my head.
She also puts herself on a short leash where famous subjects are concerned. Eyewitness accounts of Vladimir Lenin’s eye color ensured that the revolutionary’s colorized irises would remain true to life.
And while there may be a market for representations of punked out Russian literary heroes, Shirnina plays it straight there too, eschewing the digital Manic Panic where Chekhov, Tolstoy, and Bulgakov are concerned.
Her hand with Photoshop CS6 may restore celebrity to those whose stars have faded with time, like Vera Komissarzhevskaya, the original ingenue in Chekhov’s much-performed play The Seagull, and wrestler Karl Pospischil, who showed off his physique sans culotte in a photo from 1912.
Even the unsung proletariat are given a chance to shine from the fields and factory floors.
We think of Johannes Gutenberg’s printing press (circa 1440) as having begun the era of the printed book, since his invention allowed for mass production of books on a scale unheard of before. But we must date the invention of printing itself much earlier—nearly 600 years earlier—to the Chinese method of xylography, a form of woodblock printing. Also used in Japan and Korea, this elegant method allowed for the reproduction of hundreds of books from the 9th century to the time of Gutenberg, most of them Buddhist texts created by monks. In the 11th century, writes Elizabeth Palermo at Live Science, a Chinese peasant named Bi Sheng (Pi Sheng) developed “the world’s first movable type.” The technology may also have arisen independently in the 14th-century Yuan Dynasty and in Korea around the same time.
Despite these innovations, xylography remained the primary method of printing in Asia. The “daunting task” of casting the thousands of characters in Chinese, Japanese, and Korean “may have made woodblocks seem like a more efficient option for printing these languages.” This still-labor-intensive process produced books and illustrations for several centuries, a good many of them incredible works of art in their own right.
Published by Hu Zhengyan’s Ten Bamboo Studio in Nanjing, this manual for teachers contains 138 pages of multicolor prints by fifty different artists and calligraphers and 250 pages of accompanying text. “The method” that produced the stunning artifact “involves the use of multiple printing blocks which successively apply different coloured inks to the paper to reproduce the effect of watercolour painting.” Kept untouched in Cambridge’s “most secure vaults,” the book was unsealed for the first time just a couple of years ago. “What surprised us,” remarked Charles Aylmer, head of the Library’s Chinese Department, “was the amazing freshness of the images, as if they had never been looked at for over 300 years.”
The 17th-century copy is “unique in being complete, in perfect condition and in its original binding.” (Another, incomplete, copy was acquired in 2014 by the Huntington Library in San Marino, CA.) The book contains many “detailed instructions on brush techniques,” writes CNN, “but its phenomenal beauty has meant from the outset that it has held a greater position” than other such manuals. Like another gorgeous multicolor painting textbook, the Manual of the Mustard Seed Garden, made in 1679, this text had a significant impact on the arts in both China and Japan, “where it inspired a whole new branch of printing.”
In 2014, Google acquired DeepMind, a company which soon made news when its artificial intelligence software defeated the world’s best player of the Chinese strategy game, Go. What’s DeepMind up to these days? More elemental things–like teaching itself to walk. Above, watch what happens when, on the fly, DeepMind’s AI learns to walk, run, jump, and climb. Sure, it all seems a little kooky–until you realize that if DeepMind’s AI can learn to walk in hours, it can take your job in a matter of years.
What’s director Michel Gondry up to these days? Apparently, trying to show that you can do smart things–like make serious movies–with that smartphone in your pocket. The director of Eternal Sunshine of the Spotless Mind and the Noam Chomsky animated documentary Is the Man Who Is Tall Happy? has just released “Détour,” a short film shot purely on his iPhone 7 Plus. Subtitled in English, “Détour” runs about 12 minutes and follows “the adventures of a small tricycle as it sets off along French roads in search of its young owner.” Watch it, then ask yourself, was this really not made with a traditional camera? And then ask yourself, what’s my excuse for not getting out there and making movies?
In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential, if not more so, to computer science, especially, Dixon argues, Aristotle.
The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”
The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, whom they alleged were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied and in many ways warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.
At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Relay and Switching Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 The Laws of Thought, to Aristotle’s syllogistic reasoning.
Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
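To make the mapping concrete, here is a minimal, purely illustrative sketch in Python (not drawn from Dixon’s essay, and not Boole’s or Shannon’s own notation): first a syllogism recast as Boolean variables and operators, then the same operators read as idealized relay circuits, with switches in series acting as AND and switches in parallel acting as OR.

```python
from itertools import product

# 1. Boole's move: the terms of a syllogism become variables, the logical
#    words become operators. "All men are mortal; Socrates is a man;
#    therefore Socrates is mortal" has the Boolean shape
#    (p AND (p -> q)) -> q, which holds under every assignment of p and q.
def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is equivalent to (not p) or q."""
    return (not p) or q

assert all(
    implies(p and implies(p, q), q)
    for p, q in product([False, True], repeat=2)
)

# 2. Shannon's move: the same operators describe switching circuits.
#    Switches wired in series behave like AND; switches wired in parallel
#    behave like OR. A lamp wired as (A AND B) OR C follows this truth table:
def series(a: bool, b: bool) -> bool:
    return a and b   # current flows only if both switches are closed

def parallel(a: bool, b: bool) -> bool:
    return a or b    # current flows if either switch is closed

for a, b, c in product([False, True], repeat=3):
    lamp = parallel(series(a, b), c)
    print(f"A={a!s:<5} B={b!s:<5} C={c!s:<5} -> lamp {'on' if lamp else 'off'}")
```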
Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.
Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of 13th-century Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.