World War II officially ended on September 2, 1945. It followed, by less than three weeks, an equally momentous event, at least in the eyes of cinephiles: the birth of Wim Wenders. Though soon to turn 80 years old, Wenders has remained both productive and capable of drawing great critical acclaim. Witness, for example, his Tokyo-set 2023 film Perfect Days, which made it to the running for both the Palme d’Or and a Best International Feature Film Academy Award. Back on V‑J Day, it surely would’ve been difficult to imagine a Japanese-German co-production seriously competing for the most prestigious prizes in cinema — even one directed by a known Americaphile.
Wenders has long worked at revealing intersections of history and culture. Seen today, Wings of Desire seems for all the world to express the spirit about to be liberated by the fall of the Soviet Union, but by Wenders’ own admission, nobody working on the movie would have credited the idea of the Berlin Wall coming down any time in the foreseeable future.
In his new short film “The Keys to Freedom,” he commemorates the 80th anniversary of the Second World War’s conclusion by paying a visit to a school in Reims. Commandeered for the secret all-night meeting in which German generals signed the documents confirming their country’s total surrender to the Allies, it hosted the end of what Wenders called “the darkest period in the history of Europe.”
Closing up the temporary headquarters, Allied commander-in-chief Dwight D. Eisenhower returned its keys to the mayor of Reims, saying, “These are the keys to the freedom of the world.” As much as these words move Wenders, he also fears that, even as the Russia-Ukraine war roils on, younger generations of Europeans no longer grasp their meaning. Born into societies protected by the United States, they naturally take peace for granted. “We have to be aware of the fact that Uncle Sam isn’t doing our job for very much longer, and we might have to defend this freedom ourselves,” Wenders explains in a New York Times interview. The end of World War II marked the beginning of the so-called “American century.” If that century is well and truly drawing to its close, who better to observe it than Wenders?
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
It’s hard to imagine from this historical distance how upsetting Pablo Picasso’s 1907 modernist painting Les Demoiselles d’Avignon was to Parisian society at its debut. On its 100th anniversary, Guardian critic Jonathan Jones described it as “the rift, the break that divides past and future.” The painting caused an uproar, even among the artist’s peers. It was a moment of culture shock, notes PBS. Its five nude figures, broken into proto-cubist planes and angles with faces painted like African masks, met “with almost unanimous shock, distaste, and outrage.”
Henri Matisse, himself often credited with ushering in modernist painting with his flattened fields of color, “is angered by the work, which he considers a hoax, an attempt to paint the fourth dimension.” Much of the outrage was purported to come from middle-class moral qualms about the painting’s subject, “the sexual freedom depicted in a brothel.”
This is a little hard to believe. Nude women in brothels, “odalisques,” had long been a favorite subject of some of the most revered European painters. But where the women in these paintings always appear passive, if not submissive, Picasso’s nudes pose suggestively and meet the viewer’s gaze, actively unashamed.
What likely most disturbed those first viewers was the perceived violence done to tradition. While we cannot recover the tender sensibilities of early 20th-century Parisian critics, we can, I think, experience a similar kind of shock by looking at work Picasso had done ten years earlier, such as the 1896 First Communion, further up, and the 1897 study Science and Charity at the top, conservative genre paintings in an academic style, beautifully rendered with exquisite skill by a then 15-year-old artist. See an earlier drawing, Study for a Torso, above, completed in 1892 when Picasso was only 11.
Given his incredible precocity, it is hardly any wonder that Picasso innovated scandalously new means of using line, color, and composition. He was a prodigious master of technique at an age when many artists are still years away from formal study. Where else could his restless talent go? He painted a favorite subject in 1900, in the loose, impressionist Bullfight, above, a return of sorts to his first oil painting, Picador, below, made when he was 8. Further down, see a drawing made the following year, early in his development, “Bullfight and Pigeons.”
This piece, with its realistic-looking birds carefully drawn upside-down atop a loose sketch of a bullfight, appeared in a 2006 show at the Phillips Collection in Washington, DC featuring childhood artworks from Picasso and Paul Klee. Contrary, perhaps, to our expectations, curator Jonathan Fineberg remarks of this drawing that “9‑year-old Picasso’s confident, playful scribble” gives us more indication of his talent than the finely-drawn birds.
“It’s not just that Picasso could render well, because you could teach anybody to do that,” Fineberg says. Maybe not anybody, but the point stands—technique can be taught, creative vision cannot. “It’s not about skill. It’s about unique qualities of seeing. That’s what makes Picasso a better artist than Andrew Wyeth. Art is about a novel way of looking at the world.” You may prefer Wyeth, or think the downward comparison unfair, but there’s no denying Picasso had a very “novel way of seeing,” from his earliest sketches to his most revolutionary modernist masterpieces. See several more highly accomplished early works from Picasso here.
Note: An earlier version of this post appeared on our site in 2018.
In South Korea, where I live, there may be no brand as respected as Habodeu. Children dream of it; adults seemingly do anything to play up their own connections to it, however tenuous those connections may be. But what is Habodeu? An electronics company? A line of clothing? Some kind of luxury car? Not at all: it is, in fact, the Korean pronunciation of Harvard, the American university. Practically everyone around the world is aware of Harvard’s prestige, but relatively few know that you can take many of its courses online without paying tuition, or even applying. In fact, you can find a list of more than 130 such courses right here, all available to take right now.
Though its answer has grown more complicated in recent years, the question of whether computers will ever truly think has been around for quite some time. Richard Feynman was being asked about it 40 years ago, as evidenced by the lecture clip above. As his fans would expect, he approaches the matter of artificial intelligence with his characteristic incisiveness and humor — as well as his tendency to re-frame the conversation in his own terms. If the question is whether machines will ever think like human beings, he says no; if the question is whether machines will ever be more intelligent than human beings, well, that depends on how you define intelligence.
Even today, it remains quite a tall order for any machine to meet our constant demands, as Feynman articulates, for better-than-human mastery of every conceivable task. And even when their skills do beat mankind’s — as in, say, the field of arithmetic, which computers dominate by their very nature — they don’t use their calculating apparatus in the same way as human beings use their brains.
Perhaps, in theory, you could design a computer to add, subtract, multiply, and divide in approximately the same slow, error-prone fashion we tend to do, but why would you want to? Better to concentrate on what humans can do better than machines, such as the kind of pattern recognition required to recognize a single human face in different photographs. Or that was, at any rate, something humans could do better than machines.
The tables have turned, thanks to the machine learning technologies that have lately emerged; we’re surely not far from the ability to pull up a portrait, and along with it every other picture of the same person ever uploaded to the internet. The question of whether computers can discover new ideas and relationships by themselves sends Feynman into a disquisition on the very nature of computers, how they do what they do, and how their high-powered inhuman ways, when applied to reality-based problems, can lead to solutions as bizarre as they are effective. “I think that we are getting close to intelligent machines,” he says, “but they’re showing the necessary weaknesses of intelligence.” Arthur C. Clarke said that any sufficiently advanced technology is indistinguishable from magic, and perhaps any sufficiently smart machine looks a bit stupid.
Is perpetual motion possible? In theory… I have no idea. In practice, so far at least, the answer has been a perpetual no. As Nicholas Barrial writes at Makery, “in order to succeed,” a perpetual motion machine “should be free of friction, run in a vacuum chamber and be totally silent” since “sound equates to energy loss.” Trying to satisfy these conditions in a noisy, entropic physical world may seem like a fool’s errand, akin to turning base metals to gold. Yet the hundreds of scientists and engineers who have tried have been anything but fools.
The long list of contenders includes the famed 12th-century Indian mathematician Bhāskara II, the also-famed 17th-century Irish scientist Robert Boyle, and a certain Italian artist and inventor who needs no introduction. It will come as no surprise to learn that Leonardo da Vinci turned his hand to solving the puzzle of perpetual motion. But it seems that, in doing so, he “may have been a dirty, rotten hypocrite,” Ross Pomery jokes at Real Clear Science. Surveying the many failed attempts to make a machine that ran forever, Leonardo publicly exclaimed, “Oh, ye seekers after perpetual motion, how many vain chimeras have you pursued? Go and take your place with the alchemists.”
In private, however, as Michio Kaku writes in Physics of the Impossible, Leonardo “made ingenious sketches in his notebooks of self-propelling perpetual motion machines, including a centrifugal pump and a chimney jack used to turn a roasting skewer over a fire.” He also drew up plans for a wheel that would theoretically run forever. (Leonardo claimed he tried only to prove it couldn’t be done.) Inspired by a device invented by a contemporary Italian polymath named Mariano di Jacopo, known as Taccola (“the jackdaw”), the artist-engineer refined this previous attempt in his own elegant design.
Leonardo drew several variants of the wheel in his notebooks. Despite the fact that the wheel didn’t work—and that he apparently never thought it would—the design has become, Barrial notes, “THE most popular perpetual motion machine on DIY and 3D printing sites.” (One maker charmingly comments, in frustration, “Perpetual motion doesn’t seem to work, what am I doing wrong?”) The gif at the top, from the British Library, animates one of Leonardo’s many versions of unbalanced wheels. This detailed study can be found in folio 44v of the Codex Arundel, one of several collections of Leonardo’s notebooks that have been digitized and previously made available online.
In his book The Innovators Behind Leonardo, Plinio Innocenzi describes these devices, consisting of “12 half-moon-shaped adjacent channels which allow the free movement of 12 small balls as a function of the wheel’s rotation…. At one point during the rotation, an imbalance will be created whereby more balls will find themselves on one side than the other,” creating a force that continues to propel the wheel forward indefinitely. “Leonardo reprimanded that despite the fact that everything might seem to work, ‘you will find the impossibility of motion above believed.’”
Leonardo also sketched and described a perpetual motion device using fluid mechanics, inventing the “self-filling flask” over two hundred years before Robert Boyle attempted perpetual motion with the same method. This design didn’t work either. In reality, there are too many physical forces working against the dream of perpetual motion. Few of the attempts, however, have appeared in as elegant a form as Leonardo’s.
Note: An earlier version of this post appeared on our site in 2019.
“In the future, e‑mail will make the written word a thing of the past,” declares the narration of a 1999 television commercial for Orange, the French telecom giant. “In the future, we won’t have to travel; we’ll meet on video. In the future, we won’t need to play in the wind and rain; computer games will provide all the fun we need. And in the future, man won’t need woman, and woman won’t need man.” Not in our future, the voice hastens to add, speaking for Orange’s corporate vision: a bit of irony to those of us watching here in 2025, who could be forgiven for thinking that the predictions leading up to it just about sum up the progress of the twenty-first century so far. Nor will it surprise us to learn that the spot was directed by Ridley Scott, that cinematic painter of dystopian sheen.
Bleak futures constitute just one part of Scott’s advertising portfolio. Watch the feature-length compilation of his commercials above (assembled by the YouTube channel Shot, Drawn & Cut), and you’ll see dens of Croesan wealth, deep-sea expeditions, the trenches of the Great War, the wastes of the Australian outback, acts of Cold War espionage, a dance at a neon-lined nineteen-fifties diner, and the arrival of space aliens in small-town America — who turn out just to be stopping by for a Pepsi.
Not that Scott is a brand loyalist: the fact that he made a good number of spots for America’s second-biggest soda brand (some of them not just Miami Vice-themed but starring Don Johnson himself) didn’t stop him from also directing a Coca-Cola spot featuring Max Headroom. The decade was, of course, the nineteen eighties, at the beginning of which Scott made his most enduring mark as a visual stylist with Blade Runner.
A series of spots for Barclays bank (whose indictments of computerized service now seem prescient about our fast-approaching AI-“assisted” reality) hews so closely to the Blade Runner aesthetic that it might as well have been part of the same production. But of Scott’s dystopian advertisements, none is more celebrated than the Super Bowl spectacle for the Apple Macintosh in which a hammer-thrower smashes a Nineteen Eighty-Four-style dictator-on-video. The compilation also includes a less widely remembered commercial for the Macintosh’s technically innovative but commercially failed predecessor, the Apple Lisa. So associated did Scott become with cutting-edge technology that it’s easy to forget that he rose up through the advertising world of his native Britain by making big impacts, over and over, for downright quaint brands: Hovis bread, McDougall’s pastry mix, Findus frozen fish pies.
It may seem a contradiction that Scott, long practically synonymous with the large-scale Hollywood genre blockbuster, would have started out by crafting such nostalgia-suffused miniatures. And it would take an inattentive viewer indeed not to note that the man who oversaw the definitive cinematic vision of a menacing Asia-inflected urban dystopia would go on to make commercials for the Sony MiniDisc and the Nissan 300ZX. It all makes more sense if you take Scott’s artistic interests as having less to do with culture and more to do with bureaucracy, architecture, machinery, and other such systems in which humanity is contained: so natural a fit for the realm of advertising that it’s almost a surprise he’s made features at all. And indeed, he continues to do ad work, bringing movie-like grandeur to multi-minute promotions for brands like Hennessy and Turkish Airlines — each one introduced as “a Ridley Scott film.”
By the time he filmed this video archived on Iowa Public Television’s YouTube channel, Jim Henson was just about to strike gold with a new children’s show called Sesame Street. The year was 1969, and he already had 15 years of puppetry experience under his belt, from children’s shows to commercials and experimental films.
On the cusp of success, Henson, along with fellow puppeteer Don Sahlin (the builder of Rowlf), ventures to teach kids how to make a puppet out of pretty much anything you’ll find around the house. The task looks easy, but that ease is part of the genius of Henson, as he and Sahlin make characters from a tennis ball, a mop, a wooden spoon, a cup, socks, an envelope, even potatoes and pears. (There is a lot to be said for the inherent comedy of googly eyes, and the importance of fake fur.)
An unknown assistant takes some of these puppets and brings them to life while Henson and his partner create more: funny voices, personalities, even a bit of anarchy are all in play. Surprisingly, Kermit does not make an appearance, although his sock ancestor does.
The man who saw potential puppets in everything is in his element and relaxed. Check it out, smile, and then raid your kitchen for supplies for your own puppet show. And although Henson promises a further episode, it has yet to be found on YouTube, or elsewhere.
Note: An earlier version of this post appeared on our site in 2016.
Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.
Homework has lately become unfashionable, at least according to what I’ve heard from teachers in certain parts of the United States. That may complicate various fairly long-standing educational practices, but it doesn’t necessarily reflect an absolute drop in standards and expectations. Those of us who went to school around the turn of the millennium may remember feeling entombed in homework, an intensified version of what the generation that came of age amid the early Cold War’s pressure for “more science” would have dealt with. But late baby boomers and early Gen-Xers in the sixties and seventies had a much lighter load, as did the generation educated under John Dewey’s reforms of the early twentieth century.
We can follow this line all the way back to the times of the Babylonians, 4,000 years ago. In the video above from her channel Tibees, science YouTuber Toby Hendy shows us a few artifacts of homework from antiquity and explains how to interpret them.
Inscribed in a clay tablet, their simple but numerous marks reveal them to be examples of math homework, that most loathed category today, and perhaps then as well. (Even when interpreted in modern language, the calculations may seem unfamiliar, performed as they are not in our base ten, but base 60 — shades of the “new math” to come much later.) That the Babylonians had fairly advanced mathematics, which Hendy demonstrates using some clay of her own, may be as much of a surprise as the fact that they did homework.
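For readers curious how sexagesimal arithmetic works, the place-value logic is the same as in our base ten, only with 60 possible values per position. Here is a minimal illustrative sketch in Python (the function name and example values are my own, not anything from the tablets or the video):

```python
def to_base_60(n):
    """Convert a non-negative integer to its base-60 digit list,
    most significant digit first, Babylonian-style."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % 60)   # lowest-order sexagesimal "digit"
        n //= 60                # shift one base-60 place to the right
    return digits[::-1]

# 4000 = 1*60^2 + 6*60 + 40, so its sexagesimal digits are 1, 6, 40
print(to_base_60(4000))
```

The system survives in our own clocks and angle measures, which is why 4,000 seconds comes out as 1 hour, 6 minutes, and 40 seconds.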
Not that they all did it. Universal schooling itself dates only from the industrial age, and for the Babylonians, industry was still a long way off. They did, however, take the considerable step of creating civilization, which they couldn’t have done without writing. The ancient assignment Hendy shows would’ve been done by a student at an eduba, which she describes as a “scribe school.” Scribe, as we know, means one who writes — which, in Babylon, meant one who writes in Sumerian. That skill was transmitted through the network of eduba, or “house where tablets are passed out,” which were usually located in private residences, and which turned out graduates literate and numerate enough to keep the empire running, at least until the sixth century BC or so. From certain destructive forces, it seems, no amount of homework can protect a civilization forever.
Three Yale professors, Timothy Snyder, Jason Stanley, and Marci Shore, have spent their careers studying fascism and authoritarianism. They know the signs of emerging authoritarianism when they see them. Now they’re seeing those signs here in the United States, and they’re not sitting idly by. They’ve moved to the University of Toronto, where they can speak freely, without fearing personal or institutional retribution. Above, they share their views in a New York Times Op-Doc, which comes prefaced with the text below:
Legal residents of the United States sent to foreign prisons without due process. Students detained after voicing their opinions. Federal judges threatened with impeachment for ruling against the administration’s priorities.
In this Opinion video, Marci Shore, Timothy Snyder and Jason Stanley, all professors at Yale and experts in authoritarianism, explain why America is especially vulnerable to democratic backsliding — and why they are leaving the United States to take up positions at the University of Toronto.
Professor Stanley is leaving the United States as an act of protest against the Trump administration’s attacks on civil liberties. “I want Americans to realize that this is a democratic emergency,” he said.
Professor Shore, who has spent two decades writing about the history of authoritarianism in Central and Eastern Europe, is leaving because of what she sees as the sharp regression of American democracy. “We’re like people on the Titanic saying our ship can’t sink,” she said. “And what you know as a historian is that there is no such thing as a ship that can’t sink.”
She borrows from political and apolitical Slavic motifs and expressions, arguing that the English language does not fully capture the democratic regression in this American moment.
Professor Snyder’s reasons are more complicated. Primarily, he’s leaving to support his wife, Professor Shore, and their children, and to teach at a large public university in Toronto, a place he says can host conversations about freedom. At the same time, he shares the concerns expressed by his colleagues and worries that those kinds of conversations will become ever harder to have in the United States.
“I did not leave Yale because of Donald Trump or because of Columbia or because of threats to Yale — but that would be a reasonable thing to do, and that is a decision that people will make,” he wrote in a Yale Daily News article explaining his decision to leave.
Their motives differ but their analysis is the same: ignoring or downplaying attacks on the rule of law, the courts and universities spells trouble for our democracy.
On his 84th birthday this past Saturday, Bob Dylan played a show. That was in keeping with not only his still-serious touring schedule, but also his apparently irrepressible instinct to work: on music, on writing, on painting, on sculpture. Even his occasional tweeting draws an appreciative audience every time. The Bob Dylan of 2025 is not, of course, the Bob Dylan of 1965, but then, the Bob Dylan of 1965 wasn’t the Bob Dylan of 1964. This constant artistic change is just what his fans appreciate, not that they don’t still put on his early stuff with regularity.
In the earliest of that early stuff, as music YouTuber David Hartley explains in the new video above, Dylan “wrote songs by reinventing tradition.” Using nothing but his voice, guitar, and harmonica, the young Dylan “imitated some of the most well-known folk melodies,” placing himself in that long American tradition of borrowing and reinterpretation. But as dramatized in the recent film A Complete Unknown, he soon “went electric,” and with the change in instrumentation came a change in songwriting method: “He would just come up with endless pages of lyrics, something he once called ‘the long piece of vomit.’ ”
The advice to “puke it out now and clean it up later” has long been given, in various forms, to aspiring artists everywhere. One aspect worth highlighting about the way Dylan did it was that, despite writing popular songs, he drew a great deal of inspiration from more traditional literature, to the point that his notes hardly appear to contain anything resembling verses or choruses at all. Only in the studio, with a band behind him, could Dylan give these ideas their final musical shape — or rather, their final shape on that particular album, often to be modified endlessly, and sometimes radically, over decades of live performances to come.
Hartley tells of more dramatic changes to Dylan’s music and his process of creating. The motorcycle crash, the Basement Tapes, the open E tuning, Blood on the Tracks: all of these now lie half a century or more in the past. To go over all the ways Dylan has approached music since then would require more hours than all but the most rabid enthusiasts (though there are many) would watch. The video does include a 60 Minutes clip from 2004 in which Dylan says that “those early songs were almost magically written,” and that he wouldn’t be able to create them anymore. But then, nor could the Dylan of Highway 61 Revisited have recorded Time Out of Mind, and nor, for that matter, could the Dylan of Time Out of Mind have recorded any of Dylan’s albums from this decade — or those that could, quite possibly, be still to come.
Images of Orwell and Dali via Wikimedia Commons
Should we hold artists to the same standards of human decency that we expect of everyone else? Should talented people be exempt from ordinary morality? Should artists of questionable character have their work consigned to the trash along with their personal reputations? These questions, for all their timeliness in the present, seemed no less thorny and compelling 81 years ago when George Orwell confronted the strange case of Salvador Dali, an undeniably extraordinary talent, and—Orwell writes in his 1944 essay “Benefit of Clergy”—a “disgusting human being.”
The judgment may seem overly harsh except that any honest person would say the same given the episodes Dali describes in his autobiography, which Orwell finds utterly revolting. “If it were possible for a book to give a physical stink off its pages,” he writes, “this one would.” The episodes he refers to include, at six years old, Dali kicking his three-year-old sister in the head, “as though it had been a ball,” the artist writes, then running away “with a ‘delirious joy’ induced by this savage act.” They include throwing a boy from a suspension bridge, and, at 29 years old, trampling a young girl “until they had to tear her, bleeding, out of my reach.” And many more such violent and disturbing descriptions.
Dali’s litany of cruelty to humans and animals reads like what we expect from the early life of a serial killer rather than a famous artist. Surely he is putting his readers on, wildly exaggerating for the sake of shock value, like the Marquis de Sade’s autobiographical fantasies. Orwell allows as much. Yet which of the stories are true, he writes, “and which are imaginary hardly matters: the point is that this is the kind of thing that Dali would have liked to do.” Moreover, Orwell is as repulsed by Dali’s work as he is by the artist’s character, informed as it is by misogyny, a confessed necrophilia, and an obsession with excrement and rotting corpses.
But against this has to be set the fact that Dali is a draughtsman of very exceptional gifts. He is also, to judge by the minuteness and the sureness of his drawings, a very hard worker. He is an exhibitionist and a careerist, but he is not a fraud. He has fifty times more talent than most of the people who would denounce his morals and jeer at his paintings. And these two sets of facts, taken together, raise a question which for lack of any basis of agreement seldom gets a real discussion.
Orwell is unwilling to dismiss the value of Dali’s art, and distances himself from those who would do so on moralistic grounds. “Such people,” he writes, are “unable to admit that what is morally degraded can be aesthetically right,” a “dangerous” position adopted not only by conservatives and religious zealots but by fascists and authoritarians who burn books and lead campaigns against “degenerate” art. “Their impulse is not only to crush every new talent as it appears, but to castrate the past as well.” (“Witness,” he notes, the outcry in America “against Joyce, Proust and Lawrence.”) “In an age like our own,” writes Orwell, in a particularly jarring sentence, “when the artist is an exceptional person, he must be allowed a certain amount of irresponsibility, just as a pregnant woman is.”
At the very same time, Orwell argues, to ignore or excuse Dali’s amorality is itself grossly irresponsible and totally inexcusable. Orwell’s is an “understandable” response, writes Jonathan Jones at The Guardian, given that he had fought fascism in Spain and had seen the horror of war, and that Dali, in 1944, “was already flirting with pro-Franco views.” But to fully illustrate his point, Orwell imagines a scenario with a much less controversial figure than Dali: “If Shakespeare returned to the earth to-morrow, and if it were found that his favourite recreation was raping little girls in railway carriages, we should not tell him to go ahead with it on the ground that he might write another King Lear.”
Draw your own parallels to more contemporary figures whose criminal, predatory, or violently abusive acts have been ignored for decades for the sake of their art, or whose work has been tossed out with the toxic bathwater of their behavior. Orwell seeks what he calls a “middle position” between moral condemnation and aesthetic license—a “fascinating and laudable” critical threading of the needle, Jones writes, that avoids the extremes of “conservative philistines who condemn the avant garde, and its promoters who indulge everything that someone like Dali does and refuse to see it in a moral or political context.”
This ethical critique, writes Charlie Finch at Artnet, attacks the assumption in the art world that an appreciation of artists with Dali’s peculiar tastes “is automatically enlightened, progressive.” Such an attitude extends from the artists themselves to the society that nurtures them, and that “allows us to welcome diamond-mine owners who fund biennales, Gazprom billionaires who purchase diamond skulls, and real-estate moguls who dominate temples of modernism.” Again, you may draw your own comparisons.
Note: An earlier version of this post appeared on our site in 2018.
Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness