Taking a first glance at the Babylonian Map of the World, few of us could recognize it for what it is. But then again, few of us are anything like the British Museum Middle East department curator Irving Finkel, whose vast knowledge (and ability to share it compellingly) have made him a viewer favorite on the institution’s YouTube channel. In the Curator’s Corner video above, he offers an up-close view of the Babylonian Map of the World — or rather, the fragment of the clay tablet from the eighth or seventh century BC that he and other experts have determined contains a piece of the oldest map of the known world in existence.
“If you look carefully, you will see that the flat surface of the clay has a double circle,” Finkel says. Within the circle is cuneiform writing that describes the shape as the “bitter river” that surrounds the known world: ancient Mesopotamia, or modern-day Iraq.
Inside the circle lie representations of both the Euphrates River and the mighty city of Babylon; outside it lies a series of what scholars have determined were originally eight triangles. “Sometimes people say they are islands, sometimes people say they are districts, but in point of fact, they are almost certainly mountains,” which stand “far beyond the known world” and represent, to the ancient Babylonians, “places full of magic, and full of mystery.”
Coming up with a coherent explanation of the map itself hinged on the discovery, in the nineteen-nineties, of one of those triangles originally thought to have been lost. This owes to the enthusiasm of a non-professional, a student in Finkel’s cuneiform night classes named Edith Horsley. During one of her once-a-week volunteer shifts at the British Museum, she set aside a particularly intriguing clay fragment. As soon as Finkel saw it, he knew just the artifact to which it belonged. After the piece’s reattachment, much fell into place, not least that the map purported to show the distant location of the beached (or rather, mountained) ark built by “the Babylonian version of Noah” — the search for which continues these nine or so millennia later.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
Sometimes it can seem as though the more we think we know a historical figure, the less we actually do. Helen Keller? We’ve all seen (or think we’ve seen) some version of The Miracle Worker, right?—even if we haven’t actually read Keller’s autobiography. And Mark Twain? He can seem like an old family friend. But I find people are often surprised to learn that Keller was a radical socialist firebrand, in sympathy with workers’ movements worldwide. In a short article in praise of Lenin, for example, Keller once wrote, “I cry out against people who uphold the empire of gold…. I am perfectly sure that love will bring everything right in the end, but I cannot help sympathizing with the oppressed who feel driven to use force to gain the rights that belong to them.”
Twain took a more pessimistic, ironic approach, yet he thoroughly opposed religious dogma, slavery, and imperialism. “I am always on the side of the revolutionists,” he wrote, “because there never was a revolution unless there were some oppressive and intolerable conditions against which to revolute.” While a great many people grow more conservative with age, Twain and Keller both grew more radical, which in part accounts for another little-known fact about these two nineteenth-century American celebrities: they formed a very close and lasting friendship that, at least in Keller’s case, may have been one of the most important relationships in her life.
Twain’s importance to Keller, and hers to him, begins in 1895, when the two met at a lunch held for Keller in New York. According to the Mark Twain Library’s extensive documentary exhibit, Keller “seemed to feel more at ease with Twain than with any of the other guests.” She would later write, “He treated me not as a freak, but as a handicapped woman seeking a way to circumvent extraordinary difficulties.” Twain was taken as well, surprised by “her quickness and intelligence.” After the meeting, he wrote to his benefactor Henry H. Rogers, asking Rogers to fund Keller’s education. Rogers, the Mark Twain Library tells us, “personally took charge of Helen Keller’s fortunes, and out of his own means made it possible for her to continue her education and to achieve for herself the enduring fame which Mark Twain had foreseen.”
Twain wrote to his wealthy friend, “It won’t do for America to allow this marvelous child to retire from her studies because of poverty. If she can go on with them she will make a fame that will endure in history for centuries.” Thereafter, the two would maintain a “special friendship,” sustained not only by their political sentiments, but also by a love of animals, travel, and other personal similarities. Both writers came to live in Fairfield County, Connecticut, at the end of their lives, and Keller visited Twain at his Redding home, Stormfield, in 1909, the year before his death (see them there at the top of the post, and more photos here). Twain was especially impressed by Keller’s autobiography, writing to her, “I am charmed with your book—enchanted.” (See his endorsement in a 1903 advertisement, below.)
Twain also came to Keller’s defense, ten years later, after reading in her book about a plagiarism scandal that occurred in 1892 when, at only twelve years old, she was accused of lifting her short story “The Frost King” from Margaret Canby’s “Frost Fairies.” Though a tribunal acquitted Keller of the charges, the incident still piqued Twain, who called it “unspeakably funny and owlishly idiotic and grotesque” in a 1903 letter in which he also declared: “The kernel, the soul—let us go further and say the substance, the bulk, the actual and valuable material of all human utterance—is plagiarism.” What differs from work to work, he contends, is “the phrasing of a story”; Keller’s accusers, he writes protectively, were “solemn donkeys breaking a little child’s heart.”
We also have Twain—not playwright William Gibson—to thank for the “miracle worker” title given to Keller’s teacher, Anne Sullivan. (See Keller, Sullivan, Twain, and Sullivan’s husband John Macy above at Twain’s home). As a tribute to Sullivan for her tireless work with Keller, he presented her with a postcard that read, “To Mrs. John Sullivan Macy with warm regard & with limitless admiration of the wonders she has performed as a ‘miracle-worker.’” In his 1903 letter to Keller, he called Sullivan “your other half… for it took the pair of you to make complete and perfect whole.”
Twain praised Sullivan effusively for “her brilliancy, penetration, originality, wisdom, character, and the fine literary competencies of her pen.” But he reserved his highest praise for Keller herself. “You are a wonderful creature,” he wrote, “The most wonderful in the world.” Keller’s praise of her friend Twain was no less lofty. “I have been in Eden three days and I saw a King,” she wrote in his guestbook during her visit to Stormfield, “I knew he was a King the minute I touched him though I had never touched a King before.” The last words in Twain’s autobiography, the first volume anyway—which he only allowed to be published in 2010—are Keller’s: “You once told me you were a pessimist, Mr. Clemens,” he quotes her as saying, “but great men are usually mistaken about themselves. You are an optimist.”
Many of us have put off a visit to Venice for fear of the hordes of tourists who roam its streets and boat down its canals day in and day out. To judge by the most visible of its economic activities, the once-mighty city-state now exists almost solely as an Instagramming destination. It wasn’t always this way. “Despite having no roads, no land, and no fresh water, the Venetians managed to turn a muddy swamp into the most powerful and wealthiest city of its time,” says the narration of the Primal Space video above. Its “unique layout of canals and bridges woven through hundreds of islands made Venice incredibly accessible, and it became the epicenter of all business.”
Venice, in other words, was at its height what world capitals like London or New York would become in later eras. But on a physical level, it faced challenges unknown in those cities, challenges that demanded a variety of ingenious medieval engineering solutions, most of which still function today. First, the builders of Venice had to bring timber from the forests of Croatia and drive it into the soft soil, creating a platform sturdy enough to bear the weight of an entire urban built environment. Construction of the buildings on top proved to be a trial-and-error affair, one that eventually settled on bricks laid with lime mortar to ensure flexibility on the slowly shifting ground.
“Instead of expanding outwards like most cities,” Venice’s islands “expanded into each other.” Eventually, they had to be connected, though “there were no bridges for the first 500 years of Venice’s existence,” not until the Doge offered a prize for the best design that could link the financial center of Rialto to the rest of the city. But what really mattered was the test of time, one long since passed by the Ponte di Rialto, which has stood fundamentally unaltered since it was rebuilt in stone in 1591. The combination of bridges and canals, with what we would now call their separation of traffic, did its part to make Venice “the most powerful and richest city in Europe” by the fifteenth century.
Even the richest and most powerful cities need water, and Venice had an abundance of only the “extremely salty and undrinkable” kind. To meet the needs of the city’s fast-growing population, engineers built wells surrounded by sand-and-stone filtration systems into Venice’s characteristic squares, turning the city into “an enormous funnel.” The related problem of waste management necessitated the construction of “a network of underground tunnels” directed into canals, flushed out by the motion of the tides. Venice’s plumbing has since been brought up to modern standards, among other ambitious engineering projects. But on the whole, the city still works as it did in the days of the Doge, and that fact alone makes it a sight worth seeing.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
If hearing stories of youthful success sets off an existential panic attack because you squandered your 20s on too much reality TV and graduate school, then take heart — you’re not necessarily a failure.
As Adam Westbrook points out in his video essay The Long Game, Leonardo da Vinci was a loser before he painted The Last Supper at age 46. As a youth, Leonardo planned grandiose projects that he wouldn’t be able to finish. This, of course, did little for his reputation and even less for his career as a freelance artist. But he continued to work, eking out a living by enduring the demands of picky, small-minded clients, and, through this lean period, Leonardo emerged as a great artist. Robert Greene, in his book Mastery, calls this period “The Difficult Years.” Every successful creative slogs through some form of the Difficult Years, even child prodigies. Mozart just went through his struggles at a time when most children are learning to read.
In other words, “genius” has less to do with innate talent than just doing the work. Of course, that isn’t nearly as good a story as that of the romantic genius. But it is encouraging for those of us who haven’t quite yet won that MacArthur grant.
You can watch Westbrook’s video essay in various parts above.
Jonathan Crow is a writer and filmmaker whose work has appeared in Yahoo!, The Hollywood Reporter, and other publications. You can follow him at @jonccrow.
In the 1950s, it was fashionable to drop Freud’s name — often as not in pseudo-intellectual sex jokes. Freud’s preoccupations had as much to do with his fame as the actual practice of psychotherapy did, and it was assumed — and still is to a great degree — that Freud had “won” the debate with his former student and friend Carl Jung, who saw religion, psychedelic drugs, occult practices, etc. as valid ways of individuating and integrating human selves — selves that were, after all, he thought, connected by far more than biological drives for sex and death.
Now Jung’s insights permeate the culture, in increasingly popular fields like transpersonal psychology, for example, that see humans as “radically interconnected, not just isolated individuals,” psychologist Harris L. Friedman argues. Movements like these grew out of the “counterculture movements of the 1960s,” psychology lecturer and author Steve Taylor explains, “and the wave of psycho-experimentation it involved, through psychedelic substances, meditation and other consciousness-changing practices” — the very practices Jung explored in his work.
Indeed, Jung was the first “to legitimize a spiritual approach to the practice of depth psychology,” Mark Kasprow and Bruce Scotton point out, and “suggested that psychological development extends to include higher states of consciousness and can continue throughout life, rather than stop with the attainment of adult ego maturation.” Against Freud, who thought transcendence was regression, Jung “proposed that transcendent experience lies within and is accessible to everyone, and that the healing and growth stimulated by such experience often make use of the languages of symbolic imagery and nonverbal experience.”
Jung’s work became increasingly important after his death in 1961, leading to the publication of his collected works in 1969. These introduced readers to all of his “key concepts and ideas, from archetypal symbols to analytical psychology to UFOs,” notes a companion guide. Near the end of his life, Jung himself provided a verbal survey of his life’s work in the form of four one-hour interviews conducted in 1957 by the University of Houston’s Dr. Richard Evans at the Eidgenössische Technische Hochschule (Federal Institute of Technology) in Zurich.
“The conversations were filmed as part of an educational project designed for students of the psychology department. Evans is a poor interviewer, but Jung compensates well,” the Gnostic Society Library writes. The edited interviews begin with a question about Jung’s concept of persona (also, incidentally, the theme and title of Ingmar Bergman’s 1966 masterpiece). In response, Jung describes the persona in plain terms and with everyday examples as a fictional self “partially dictated by society and partially dictated by the expectations or the wishes one nurses oneself.”
The less we’re consciously aware of our public selves as performances in these terms, the more we’re prone, Jung says, to neuroses, as the pressure of our “shadow” exerts itself. Jung and Evans’ discussion of persona only grazes the surface of their wide-ranging conversation about the unconscious and the many ways to access it. Throughout, Jung’s examples are clear and his explanations lucid. Above, you can see a transcribed video of the same interviews. Read a published transcript in the collection C.G. Jung Speaking, and see more Jung interviews and documentaries at the Gnostic Society Library.
A recreation of the military sandals. (Photo: Bavarian State Office for Monument Preservation)
Whether you’re putting together a stage play, a film, or a television series, if the story is set in ancient Rome, you know you’re going to have to get a lot of sandals on order. This task may sound more straightforward than it is, for simply copying the styles of classic productions that take place in the Roman Empire will put you on the wrong side of the historical research. We now know, for instance, that some ancient Romans wore their sandals with socks, a look that, seen in today’s cultural context, may not give quite the desired impression. And thanks to an even more recent discovery, it seems we also need to think about what’s on their soles.
Discovered near the Bavarian city of Oberstimm, “an ancient Roman sandal, largely decayed but reconstructed through X‑ray, suggests the spread of military fashion to local populations.” So writes Madeleine Muzdakis at My Modern Met, explaining that sandals of this type were known as caligae, which “had tough soles with hobnails [that] provided traction for the troops,” who did a fair bit of marching.
This particular caliga dates from between 60 and 130 AD, around the time the Roman army switched from sandals to boots, and it shows that, during the troops’ time in this part of Bavaria, their footwear had an influence on what the civilians were wearing.
An x‑ray of the ancient sandals. (Photo: Bavarian State Office for Monument Preservation)
The idea that standard-issue military gear could influence popular fashion may surprise anyone who’s ever had to wear a pair of “GI glasses.” But in its heyday, the Roman army wasn’t just a group of occupiers installed to project force on the part of a distant metropole, but an extension of civilization itself. If the hobnails in Roman military sandals afforded extra traction in addition to the subtle suggestion of cultural sophistication, so much the better. Though the question of just how far and wide this particular type of footwear (which appears reconstructed at the top of the post, and in X‑ray just above) spread through the Roman Empire remains a matter for further research, now would be as good a time as any for costume designers to stock up on nails.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
The Olympic Games have their origins in antiquity, but their modern revival has also been going on longer than any of us has been here. Even the fifth Summer Olympics, which took place in Stockholm in 1912, has passed out of living memory. But thanks to the technology of the twenty-first century, we can call up surprisingly crisp footage of its competitions any time we like, much as we’re doing with that of the currently ongoing thirty-third Summer Olympics in Paris. One especially fascinating use of these resources, for those invested in sporting history, is to compare the performances of Olympic athletes over time: we know they’ve improved, but it’s one thing to see the numbers, and quite another to see a side-by-side comparison.
Take the venerable men’s 100 meters, whose 1912 and 2020 finals both appear in the video above. 112 years ago, the United States of America’s Ralph Craig won the day (after seven false starts, and arguably an eighth as well) with a time of 10.8 seconds. Three years ago (Tokyo 2020 having been delayed by COVID-19 to 2021), the victor of that same event was Italy’s Marcell Jacobs, who crossed the finish line at 9.8 seconds.
An even greater evolution manifests in the javelin throw, in which the 60.64 meters thrown by Sweden’s Eric Lemming in 1912 becomes the 87.58 meters thrown by India’s Neeraj Chopra in 2020. (Nor has Chopra finished setting records, at least judging by the media fanfare in his homeland that attended his recent arrival in Paris’ Olympic village.)
Pole vaulting, too, has undergone a great leap forward, or rather, upward. Just above, you can see the 1912 record of 3.95 meters set by Henry S. Babcock of the United States, then the 2020 record of 6.02 meters set by Armand “Mondo” Duplantis of Sweden — or technically, of both Sweden and the U.S., having been born and raised in the latter, but able to represent the former due to his mother’s being Swedish. In recent decades, such cases of nationally mixed parentage (the American-born Italian Jacobs being another) have become more common in the Olympics, which in that and other respects has long reflected changes in the wider world. And though whether humanity is improving on the whole remains a matter of heated debate, we’ve undeniably been getting a lot better at running, throwing, and jumping with the aid of big sticks.
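For a rough sense of scale, here is a quick back-of-envelope comparison worked out from the results quoted above (a simplification, of course, since a single final stands in for each era):

```latex
% Approximate improvement between the 1912 and 2020 results cited above
\begin{align*}
\text{100 meters:} &\quad \tfrac{10.8 - 9.8}{10.8} \approx 9\%\ \text{faster} \\
\text{javelin:}    &\quad \tfrac{87.58 - 60.64}{60.64} \approx 44\%\ \text{farther} \\
\text{pole vault:} &\quad \tfrac{6.02 - 3.95}{3.95} \approx 52\%\ \text{higher}
\end{align*}
```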
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
“We’re on the cusp of something exhilarating and terrifying.”
The year is 1999 and David Bowie, in shaggy hair and groovy glasses, has seen the future and it is the Internet.
In this short but fascinating interview with the BBC’s stalwart and withering interrogator-cum-interviewer Jeremy Paxman, Bowie offers a forecast of the decades to come, and gets most of it right, if not all. Paxman dolefully plays devil’s advocate, although I suspect he really did see the Net as a “tool” – simply a repackaging of an existing medium.
“It’s an alien life form that just landed,” Bowie counters.
Bowie, who had set up his own bowie.net as a private ISP the previous year, begins by saying that if he had started his career in 1999, he would not have been a musician, but a “fan collecting records.”
It sounded provocative at the time, but Bowie makes a point here that has taken on more credence in recent years–that the revolutionary status of rock in the ‘60s and ‘70s was tied to its rarity, that the inability to readily hear music gave it power and currency. Rock is now “a career opportunity,” he says, and the Internet now has the allure that rock once did.
What Bowie might not have seen is how quickly that allure would wear off. The Internet no longer has a mystery to it. It’s closer to a public utility, which is, oddly, a point Bowie himself makes later when talking about the invention of the telephone.
Bowie also approved of the way the Internet was demystifying the relationship between artist and audience. In his final decade, however, he would seek out anonymity and privacy, dropping his final two albums suddenly and without fanfare and refusing all interviews. He also didn’t foresee the kind of trolling that drives celebrities and artists off of social media.
Paxman sees the fragmentation of the Internet as a problem; Bowie sees it as a plus.
“The potential of what the Internet is going to do to society, both good and bad, is unimaginable.”
There’s a lot more to unpack in this segment, so let your differing viewpoints be known in the comments. It’s what Bowie would have wanted.
Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.
What do you imagine when you hear the phrase “cat piano”? Some kind of whimsical furry beast with black and white keys for teeth, maybe? A relative of My Neighbor Totoro’s cat bus? Or maybe you picture a piano that contains several caged cats who shriek along an entire scale when keys are pressed that slam sharpened nails into their tails. If this is your answer, you might find people slowly backing away from you at times, or gently suggesting you get some psychiatric help.
But then, imagine that such a perverse oddity was in use by psychiatrists, like the 18th-century German physician Johann Christian Reil, who—reports David McNamee at The Guardian—“wrote that the device was intended to shake mental patients who had lost the ability to focus out of a ‘fixed state’ and into ‘conscious awareness.’”
So long, meds. See you, meditation and mandala coloring books.… I joke, but apparently Dr. Reil was in earnest when he wrote in an 1803 manual for the treatment of mental illness that patients could “be placed so that they are sitting in direct view of the cat’s expressions when the psychiatrist plays a fugue.”
A bafflingly cruel and nonsensical experiment, and we might rejoice to know it probably never took place. But the bizarre idea of the cat piano, or Katzenklavier, did not spring from the weird delusions of one sadistic psychiatrist. It was supposedly invented by German polymath and Jesuit scholar Athanasius Kircher (1602–1680), who has been called “the last Renaissance man” and who made pioneering discoveries in the fields of microbiology, geology, and comparative religion. He was a serious scholar and a man of science. Maybe the Katzenklavier was intended as a sick joke that others took seriously—and for a very long time at that. The illustration of a Katzenklavier above dates from 1667, the one below from 1883.
Kircher’s biographer John Glassie admits that, for all his undoubted brilliance, several of his “actual ideas today seem wildly off-base, if not simply bizarre” as well as “inadvertently amusing, right, wrong, half-right, half-baked, ridiculous….” You get the idea. He was an eccentric, not a psychopath. McNamee points to other, likely apocryphal, stories in which cats were supposedly used as instruments. Perhaps, cruel as it seems to us, the cat piano seemed no crueler in previous centuries than the way we taunt our cats today to make them perform for animated GIFs.
But to the cats, these distinctions are meaningless. From their point of view, there is no way to describe the Katzenklavier but as a sinister, terrifying torture device, and no way to describe those who might use it but as monstrous villains. Personally, I’d like to give cats the last word on the subject of the Katzenklavier—or at least a few fictional animated, walking, talking, singing cats. Watch the short animation at the top, in which Nick Cave reads a poem by Eddie White about talented cat singers who mysteriously go missing, scooped up by a human for a “harpsichord of harm, the cruelest instrument to spawn from man’s gray cerebral soup.” The story has all the dread and intrigue of Edgar Allan Poe’s best work, and it is in such a milieu of gothic horror that the Katzenklavier belongs.
Hayao Miyazaki began his career as an animator in 1963, getting in the door at Toei Animation not long before the company ceased to hire regularly. Miyazaki’s equally retirement-resistant contemporary Tetsuya Chiba, already well on his way to fame as a mangaka, or comic artist, published the series Yuki no Taiyou, or Yuki’s Sun, that same year. But the paths of their work wouldn’t cross until 1972, when Yuki no Taiyou was adapted into a pilot for a prospective animated series, the very first project Miyazaki — who had by that point amassed a good deal of experience not just as a key animator and collaborator with Studio Ghibli co-founder-to-be Isao Takahata, but also as a mangaka himself — directed solo.
Despite having been orphaned as an infant, the titular Yuki grows into a high-spirited tomboy so cheerful as to have developed the odd habit of physically striking other people when she gets too happy.
And as with so many parentless protagonists in children’s fiction, she has not just a distinctive personality but also a story-driving desire to discover the truth about her origins — which, it’s intimated, may not be ordinary, and may indeed be special. Her search for her mother sends her on a quest through mountain, valley, wood, and, given the setting of Japan’s northernmost island of Hokkaido, a great deal of snow (the Japanese word for which is a homophone of Yuki’s name).
Alas, this is a quest television audiences would never see, since the pilot for Yuki no Taiyou, footage from which you can see in the five-minute teaser above, didn’t draw a network order for a full series. In some respects, it seems conceptually similar to Sasurai no Shoujo Nell, or Wandering Girl Nell, a voluminously loose adaptation of Charles Dickens’ The Old Curiosity Shop that aired in Japan at the end of the seventies. By that time, Miyazaki had completed work on his first animated feature as director, The Castle of Cagliostro. A few years thereafter, he would adapt his own manga Nausicaä of the Valley of the Wind into a motion picture now widely considered the debut of the lavish, captivating Studio Ghibli style — and whose eponymous protagonist has more than a little in common with the adventurous, nature-loving Yuki.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
Howlin’ Wolf may well have been the greatest blues singer of the 20th century. Certainly many people have said so, and it’s an opinion I happen to share, though there are other measures than mere opinion. The man born Chester Arthur Burnett also had a profound historical effect on popular culture, and on the way the Chicago blues carried “the sound of Jim Crow,” as Eric Lott writes, into American cities in the north, and into Europe and the UK. Recording for both Chess and Sun Records in the 50s (Sam Phillips said of his voice, “It’s where the soul of man never dies”), Burnett’s raw sound “was at once urgently urban and country plain… southern and rural in instrumentation and howlingly electric in form.”
He was also phenomenal on stage. His hulking six-foot-six frame and intense glowering stare belied some very smooth moves, but his finesse only enhanced his edginess. He seemed at any moment like he might actually turn into a wolf, letting the impulse give out in plaintive, ragged howls and prowls around the stage. “I couldn’t do no yodelin’,” he said, “so I turned to howlin’. And it’s done me just fine.” He played a very mean harmonica and, before Hendrix, did acrobatic guitar tricks picked up from his mentor Charley Patton. And he played with the best musicians, in large part because he was known to pay well and on time. If you wanted to play electric blues, Howlin’ Wolf was a man to watch.
This reputation was Wolf’s entrée to the stage of ABC variety show Shindig! in 1965, opening for the Rolling Stones. He had just returned from his 1964 tour of Europe and the UK with the American Folk Blues Festival, playing to large, appreciative crossover crowds. He’d also just released “Killing Floor,” a record Ted Gioia notes “reached out to young listeners without losing the deep blues feeling that stood as the cornerstone of Wolf’s sound.” The following year, the Rolling Stones insisted that Shindig!’s producers “also feature either Muddy Waters or Howlin’ Wolf” before they would go on the show. Wolf won out over his rival Waters, toned down the theatrics of his act for a more prudish white audience, and “for the first time in his storied career, the celebrated bluesman performed on a national television broadcast.”
Why is this significant? Over the decades, the Stones regularly performed with their blues heroes. But this was new media ground. Brian Jones’ shy, starstruck introduction to Wolf before his performance above conveys what he saw as the importance of the moment. Jones’ biographer Paul Trynka may overstate the case, but in some degree at least, Wolf’s appearance on Shindig! “built a bridge over a cultural abyss and connected America with its own black culture.” The show constituted “a life-changing moment, both for the American teenagers clustered round the TV in their living rooms, and for a generation of blues performers who had been stuck in a cultural ghetto.” One of these teenagers described the event as “like Christmas morning.”
Eric Lott points to the show’s formative importance to the Stones, who “sit scattered around the Shindig! set watching Wolf in full-metal idolatry” as he sings “How Many More Years,” a song Led Zeppelin would later turn into “How Many More Times.” (See the Stones do their Shindig! performance of jangly, subdued “The Last Time,” here.) The performance represents more, however, than the “British Invasion embrace” of the blues. It shows Wolf’s mainstream breakout, and the Stones paying tribute to a founding father of rock and roll, an act of humility in a band not especially known or appreciated for that quality.
“It was altogether appropriate,” says music writer Peter Guralnick, “that they would be sitting at Wolf’s feet… that’s what it represented. His music was not simply the foundation or the cornerstone; it was the most vital thing you could ever imagine.” Guralnick, notes John Burnett at NPR, calls it “one of the greatest cultural moments of the 20th century.” At minimum, Burnett writes, it’s “one of the most incongruous moments in American pop music”—up until the mid-sixties, at least.
Whether or not the moment could live up to its legend, the people involved saw it as groundbreaking. The venerable Son House sat in attendance—“the man who knew Robert Johnson and Charley Patton,” remarked Brian Jones in awe. And the Rolling Stone positioning himself in deference to “Chicago blues,” Trynka writes, “uncompromising music aimed at a black audience, was a radical, epoch-changing step, both for baby boomer Americans and the musicians themselves. Fourteen and fifteen-year-old kids… hardly understood the growth of civil rights; but they could understand the importance of a handsome Englishman who described the mountainous, gravel-voiced bluesman as a ‘hero’ and sat smiling at his feet.”