The global tourism industry has seen better days than these. In regions like western Europe, to which travelers from all parts have long flocked and spent their money, the coronavirus’ curtailment of world travel this year has surely come as a severe blow. This goes even more so for a country like Italy, whose stock of historic structures, both ruined and immaculately preserved, has long assured it touristic preeminence in its part of the world. So much the worse, then, when Italy became one of the countries hardest hit by the virus this past spring. But its recovery is well underway, as is Europe’s reopening to travelers.
Or at least Europe is reopening to certain travelers: much of the continent has remained closed to those from certain afflicted countries, including but not limited to the United States of America. Of course, the U.S. has also banned entry to travelers who have recently been in many of those European countries, and however you look at it, this situation will take some time to untangle.
Until that happens, those of us who’ve had to indefinitely suspend our planned trips to Italy — or even those of us who’d never considered going before the option was removed from the table — can content ourselves with this set of high-resolution journeys on foot from the YouTube channel ProWalk Tours, all shot at length in real tourist spots amid visitors and locals alike.
Whether at the Colosseum and Palatine Hill in Rome, St. Peter’s Basilica in Vatican City, or the towns of Pompeii (in two parts) and Herculaneum, both ruined and preserved by Mt. Vesuvius, ProWalk’s videos show you all you’d see on an in-person walking tour. But they also include features like maps, marks in the timeline denoting each important site, and onscreen facts and explanations of the features of these historic places. Combine these with the immersive virtual museum tours previously featured here on Open Culture, as well as the recreations of ancient Rome in its prime and Pompeii on the day of Vesuvius’ eruption, and you’ll have the kind of understanding you couldn’t get in person — and with no danger of being whacked by your fellow tourists’ selfie sticks.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
In June of 1969, the original Jimi Hendrix Experience, the band that introduced the sixties to its reigning guitar god, disbanded for good with the departure of Noel Redding following a messy Denver Pop Festival appearance. The story of that gig sounds so apocalyptic—involving heroin, riots, and tear gas—that it reads like cosmic foreshadowing of the tragedy to come: the decade’s greatest psych-rockers go out in a haze of smoke. A little over one year later, Jimi is dead.
But if he seemed burned out in Denver, according to his bandmates, it was no indication at all of where his music was headed. Much of the tension in the band came from Hendrix’s readiness to embark on the next phase of his evolution. After Redding left, he was immediately replaced by Billy Cox, who played with Hendrix at Woodstock in the first incarnation of the Band of Gypsys, with whom Hendrix recorded “Machine Gun,” described by musicologist Andy Aledort as “the premiere example of his unparalleled genius as a rock guitarist.”
In wildly improvisatory performances, Hendrix strove to incorporate the radical moves of Coltrane. He had “transcended the medium of rock music,” writes Aledort, “and set an entirely new standard for the potential of electric guitar.” The drugs intervened, again, and after a disastrous gig at Madison Square Garden in January 1970, the Band of Gypsys broke up. Then, the Experience reformed, with Cox on bass and Mitch Mitchell on drums, and began recording and touring the U.S.
When Jimi wasn’t too high to play, he delivered some of the most blistering performances of his career, including two legendary sets in Hawaii in July, at the foot of Haleakala volcano, that would end up being his final concert appearances in the U.S. These sets were not, in fact, scheduled tour stops but over 50 minutes of performance for a semi-fictional psychedelic film called Rainbow Bridge, notorious for making little sense and for cutting almost all of the promised live footage of Hendrix’s performance, angering everyone who saw it.
The film’s promised soundtrack never materialized, and fans have long coveted these recordings, especially the second set, “a testing ground,” one fan writes, “for his new direction.” Now, they’re finally getting an official release, on CD, Blu-Ray, and LP on November 20th. (See a full tracklist of the two sets here.) This is no outtakes & rarities cash grab, but an essential document of Hendrix at the height of his powers, one year after the Experience seemed to crash and burn. See for yourself in the clip of “Voodoo Child (Slight Return)” at the top.
It’s too bad that this high point of Hendrix’s final year has been overshadowed by the dismal failure of the film that made it happen. But a new documentary, Music, Money, Madness… Jimi Hendrix in Maui, aims to restore this episode of Hendrix history. Coming out on the same day as the live recordings, November 20th, the film (see trailer above) includes more live Hendrix footage than appeared in Rainbow Bridge, and tells the story of how a terrible movie got made around the greatest rock musician of the day. The performances that didn’t make the cut tell another story—about how Hendrix was, again, doing things with the guitar that no one had ever done before.
As a longtime fan of all things Dune, there’s no living director I’d trust more to take over the “property” than Denis Villeneuve. But why remake Dune at all? Oh, I know, the original film—directed (in several cuts) by “Alan Smithee,” also known as David Lynch—is a disaster, so they say. Even Lynch says it. (Maybe the nicest thing he’s ever said about the movie is, “I started selling out on Dune.”) Critics hated, and largely still hate, it; the film’s marketing was a mess (Universal promoted it like a family-friendly Star Wars clone); and the studio felt it necessary to hand glossaries to early audiences to define terms like Kwisatz Haderach, gom jabbar, and sardaukar.
But when I first saw David Lynch’s Dune, I did not know any of this. I hardly knew Lynch or his filmography and had yet to read Frank Herbert’s books. I was a young science fiction fan who saw in the movie exactly what Lynch said he intended: “I saw tons and tons of possibilities for things I loved, and this was the structure to do them in. There was so much room to create a world.” I did not know to be upset about his deviations from the books in the grotesque imagining of the Third Stage Guild Navigator or the decision to cover Baron Harkonnen in bloody, oozing pustules. The film’s impenetrability seemed like a feature, not a bug. This was a world, totally alien and yet uncannily familiar.
In hindsight, I can see its many flaws, though not its total failure, but I still find it mesmerizing (and what a cast!). Villeneuve, I think, was in a very difficult position in updating such a divisive work of cinema. Should he appeal to fans of the books who hate Lynch’s film, or to fans of the classic film who love its imagery, or to the kinds of theatergoers Universal Studios feared would need a glossary to make it through the movie? Add to this the pressures of filmmaking during a pandemic, and you can imagine he might be feeling a little stressed.
But Villeneuve seems perfectly relaxed in a recent interview above for the Shanghai International Film Festival, and the trailer for the new film has so far passed muster with everyone who’s seen it, generating excitement among all of the above groups of potential viewers. As you can see in the video at the top, which matches shots from the preview with the same scenes from the 1984 film, the new Dune both does its own thing and references Lynch’s disputed classic in interesting ways.
No director should try to please everyone, but few adaptations come laden with more baggage than Dune. Maybe it’s a good idea to play it safe, anchoring the film to its troubled past while bringing it in line with the current size and scope of Hollywood blockbusters? Not if you ask the director of the Dune that never was. Alejandro Jodorowsky intended to bring audiences the most epic Dune of all time, and was relieved to find that Lynch’s adaptation was “a shitty picture.” By contrast, he pronounces the Villeneuve trailer “very well done” but also compromised by its “industrial” need to appeal to a mass audience. “The form is identical to what is done everywhere,” he says. “The lighting, the acting, everything is predictable.”
Maybe this is inevitable with a story that filmgoers already know. Maybe Villeneuve’s movie has surprises even Jodorowsky won’t see coming. And maybe it’s impossible—and always has been—to make the Dune that the cult Chilean master wanted (though breaking it into two parts, as Villeneuve has done, is surely a wise choice). Herbert’s vision was vast; every Dune is a compromise—“Nobody can do it. It’s a legend,” says Jodorowsky. But every great director who tries leaves behind indelible images that burrow into the mind like shai-hulud.
At one time paperback books were thought of as trash, a term that described their perceived artistic and cultural level, production value, and utter disposability. This changed in the mid-20th century, when certain paperback publishers (Doubleday Anchor, for example, who hired Edward Gorey to design their covers in the 1950s) made a push for respectability. It worked so well that the signature aesthetics they developed still, nearly a lifetime later, pique our interest more readily than those of any other era.
Even today, graphic designers put in the time and effort to master the art of the midcentury paperback cover and transpose it into other cultural realms, as Matt Stevens does in his “Good Movies as Old Books” series. In this “ongoing personal project,” Stevens writes, “I envision some of my favorite films as vintage books. Not a best of list, just movies I love.”
These movies, for the most part, date from more recent times than the mid-20th century. Some, like Jordan Peele’s Us, the Safdie brothers’ Uncut Gems, and Bong Joon-ho’s Parasite, came out just last year. The oldest pictures among them, such as Alfred Hitchcock’s The Birds, date from the early 1960s, when this type of graphic design had reached the peak of its popularity.
Suitably, Stevens also gives the retro treatment to a few already stylized period pieces like Steven Spielberg’s Raiders of the Lost Ark, Joe Johnston’s The Rocketeer, and Andrew Niccol’s Gattaca, a sci-fi vision of the future itself imbued with the aesthetics of the 1940s. Each and every one of Stevens’ beloved-movies-turned-old-books looks convincing as a work of graphic design from roughly the decade and a half after the Second World War, and some even include realistic creases and price tags. This makes us reflect on the connections certain of these films have to literature, most obviously those, like David Fincher’s Fight Club and Stephen Frears’ High Fidelity, adapted from novels in the first place.
More subtle are Rian Johnson’s recent Knives Out, a thoroughgoing tribute to (if not an adaptation of) the work of Agatha Christie; Ridley Scott’s Blade Runner, which hybridizes a Philip K. Dick novella with pulp detective noir; and Wes Anderson’s Rushmore, a statement of its director’s intent to revive the look and feel of the early 1960s (its books and otherwise) for his own cinematic purposes. Stevens has made these imagined covers available for purchase as prints, but some retro design-inclined, bibliophilic film fans may prefer to own them in 21st-century book form. See all of his adaptations in web format here.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
One of our goals on Pretty Much Pop: A Culture Podcast has been to look not just into our favorite creators and genres but also into things that get a lot of buzz even though we really don’t know anything about them. Manga is a great example of a “look what these crazy kids are into today” kind of area for many (older) Americans.
Deborah Shamoon, an American who teaches Japanese studies at the National University of Singapore and has loved manga since adolescence, here schools manga noobs Mark Linsenmayer and Brian Hirt–along with Erica Spyres, who also doesn’t read manga but at least has a complicated history with anime. What are the barriers for Americans (whether comics readers or not) to appreciating manga? For some of us, manga is actually easier to appreciate than anime, given the latter’s sound and pacing.
We talk about manga’s publication history, how fast to read manga, and its use of iconography to depict sound and movement. Deborah gives us the truth about the famed Osamu Tezuka’s place as “god of comics”; we discuss his Metropolis, Astro Boy and Princess Knight, which is not, as you may have been told, the first “shojo” manga (that is, manga aimed at girls). Shojo manga is Deborah’s specialty: she wrote a book called Passionate Friendship: The Aesthetics of Girls’ Culture in Japan. We discuss The Heart of Thomas, Sailor Moon, and how Tezuka actually copied that big-eye style from Hideko Mizuno’s Silver Petals. Do you need to get a handle on these old classics to appreciate the newer stuff that’s made such a dent in America, like Death Note? Probably not, though some Akira wouldn’t hurt you.
Each generation takes what it needs from early punk and discards what it doesn’t, so that countless subgenres have descended from a small, eccentric collection of punk bands from the late 1970s. The speed and brute simplicity of the Ramones took over in the 80s. The Clash’s strident, reggae-inflected anthems guided much of the 90s. The angular art rock and new wave disco of Television, Talking Heads, and Blondie defined the 2000s.
But some things became almost terminally passé, or terminally stupid, after punk’s first wave: like signing to major labels or wearing swastikas, ironically or otherwise. Already out of fashion by 1978, the first punk dance, the pogo, was so tragically unhip that Debbie Harry pronounced it dead on arrival in the U.S. on famed Manhattan cable access show TV Party, above. She offers to demonstrate it anyway as a “historical” artifact.
Her commentary seems like both a sarcastic rip on the ridiculous spread of trends and a genuine warning to those who might try to make this, like, a thing in New York. Don’t bring a creaky pogo stick with you to the club. Do pour beer over your head after a sweaty half-hour of whatever dance you do. There was so much to learn about punk etiquette even then. Unless you happened to be Sid Vicious, or in the audience of the first Sex Pistols shows. Then it was all fair game.
The pogo originated, so the lore goes, with Sid. As Steve Severin of Siouxsie and the Banshees remembers it, “We first met [Sid] at one of the concerts. He began bouncing around the dance floor, the so called legend of the pogo dance. It was merely Sid jumping up and down, trying to see the band, leaping up and down because he was stuck in the back somewhere.” Just as everyone who saw the Sex Pistols started their own band, everyone who saw Sid bounce around started to pogo.
What at first looks like harmless fun, especially compared to the brutal mosh pits that took over for the pogo, was anything but. “Pogoing was very violent and very painful,” one eyewitness remembers. “People were not quite crushed to death, but serious injuries occurred.” We might rethink Men Without Hats’ “The Safety Dance,” the 80s hit written in defense of pogoing. Lead singer Ivan Doroschuk penned the tune after he was kicked out of a club for doing the pogo. “I think people can relate to the empowering kind of message of ‘The Safety Dance,’” he says.
“The Safety Dance” would not have been the empowering worldwide smash it was had it been called “Pogo Dancing,” a minor hit for the Vibrators in 1976. Not nearly as iconic, and overshadowed by a hipper dance of the same name in the 80s, was the robot, elegized by The Saints in “Doing the Robot.” This dance was “both more expressive and less spontaneous,” as cultural theorist Dick Hebdige describes it in Subculture: The Meaning of Style, consisting of “barely perceptible twitches of the head or hands or more extravagant lurches (Frankenstein’s first steps?) which were abruptly halted at random points.” Hardly as practical as the pogo, but probably a lot safer.
For keen observers of pop culture, the floodtide of zombie films and television series over the past several years has seemed like an especially ominous development. As social unrest spreads and increasing numbers of people are uprooted from their homes by war, climate catastrophe, and, now, COVID-related eviction, one wonders how advisable it might have been to prime the public with so many scenarios in which heroes must fight off hordes of infectious disease carriers. Zombie movies seem intent, after all, on turning not only the dead but also other living humans into objects of terror.
Zombies themselves have a complicated history; like many New World monsters, their origins are tied to slavery and colonialism. The first zombies were not flesh-eating cannibals; they were people robbed of freedom and agency by Voodoo priests, at least in legends that emerged during the brutal twenty-year American occupation of Haiti in the early 20th century. The first feature-length Hollywood zombie film, 1932’s White Zombie, was based on occultist and explorer William Seabrook’s 1929 book The Magic Island and starred Bela Lugosi as a Haitian Voodoo master named “Murder,” who enslaves the heroine and turns her into an instrument of his will.
Subtle the film is not, but no zombie film ever warranted that adjective. Zombie entertainment induces maximum fear of a relentless Other, detached, after White Zombie, from its Haitian context, so that the undead horde can stand in for any kind of invasion. The genre’s history may go some way toward explaining why the U.S. government has an official zombie preparedness plan, called CONOP 8888. The document was written in April 2011 by junior military officers at the U.S. Strategic Command (USSTRATCOM), as a training exercise to formulate a nonspecific invasion contingency plan.
Despite the use of a “fictitious scenario,” CONOP 8888 explicitly states that it “was not actually designed as a joke.” And “indeed, it’s not,” All That’s Interesting assures us, quoting the following from the plan’s introduction:
Zombies are horribly dangerous to all human life and zombie infections have the potential to seriously undermine national security and economic activities that sustain our way of life. Therefore having a population that is not composed of zombies or at risk from their malign influence is vital to U.S. and Allied National Interests.
Substitute “zombies” with any outgroup and the verbiage sounds alarmingly like the rhetoric of state terror. The plan, as you might expect, details a martial law scenario, noting that “U.S. and international law regulate military operations only insofar as human and animal life are concerned. There are almost no restrictions on hostile actions… against pathogenic life forms, organic-robotic entities, or ‘traditional’ zombies,” whatever that means.
This all seems deadly serious, until we get to the report’s subsections, which detail scenarios such as “Evil Magic Zombies (EMZ),” “Space Zombies (SZ),” “Vegetarian Zombies (VZ),” and “Chicken Zombies (CZ)” (in fact, “the only proven class of zombie that actually exists”). It’s fascinating to see a military document absorb the many comic permutations of the genre, from George Romero’s subversive satires to Pride and Prejudice and Zombies. No matter how funny zombies are, however, the genre seems to require horrific violence, gore, and siege-like survivalism as key thematic elements.
Tufts University professor Daniel W. Drezner, author of Theories of International Politics and Zombies, has read the Pentagon’s zombie plan closely and discovered some serious problems (and not only with its zombie classification system). While the plan assumes the necessity of “barricaded counter-zombie operations,” it also admits that “USSTRATCOM forces do not currently hold enough contingency stores (food, water) to support” such operations for even 30 days. “So… maybe 28 days later,” Drezner quips, supplies run out? (We’ve all seen what happens next….) Also, alarmingly, the plan is “trigger-happy about nuclear weapons,” adding the possibility of radiation poisoning to the likelihood of starving (or being eaten by the starving).
It turns out, then, that just as in so many modern zombie stories, the zombies may not actually be the worst thing about a zombie apocalypse. Not to be outdone, the CDC decided to capitalize on the zombie craze—rather late, we must say—releasing their own materials for a zombie pandemic online in 2018. These include entertaining blogs, a poster (above), and a graphic novel full of useful disaster preparedness tips for ordinary citizens. The campaign might be judged in poor taste in the COVID era, but the agency assures us, in the event of a zombie apocalypse, “Never Fear—CDC is Ready.” I leave it to you, dear reader, to decide how comforting this promise sounds in 2020.
Although not the debut film of director Wes Anderson, and certainly not of star Bill Murray, Rushmore introduced the world to the both of them. Anderson’s first feature Bottle Rocket (an expansion of the short film previously featured here on Open Culture) hadn’t found a particularly large audience upon its theatrical release in 1996. But quite a few of the viewers who had seen and appreciated it seemed to run in Murray’s circles, and in a 1999 Charlie Rose interview the actor told of being sent copy after unwatched copy by friends and professional contacts alike.
But Murray only needed to read a few pages of Anderson’s new script to understand that the young director knew what he was doing, and his abilities became even more evident on set. “I said, ‘What’s this shot we got?’ He goes, ‘Oh, it’s one I saw in Barry Lyndon.’” But in Rushmore it depicts “the intermission of the school play,” a full-fledged Kubrickian shot “coming past a lot of, you know, mothers and fathers going — jabbering, and all the way out past people buying Cokes and drinks.” Yes, “the good ones copy, the great ones steal,” but to Murray’s mind that saying “sort of sends a misdirection.”
Not to Anderson, however, whose rare combination of cinephilia and directorial skill has inspired him to make films both rich in cinematic homage and possessed of their own distinctive sensibility — a sensibility that let Murray break out of the standard goofball roles that had threatened to imprison him. In the video essay “Steal Like Wes Anderson,” Thomas Flight examines the now no-longer-young filmmaker’s more recent repurposing of the work of auteurs who came before. In 2014’s The Grand Budapest Hotel, for example, Anderson nearly remakes an entire scene from Torn Curtain, Alfred Hitchcock’s Cold War thriller with Paul Newman and Julie Andrews that also happens to involve an eastern European hotel.
Anderson doesn’t simply lift Hitchcock’s shots but recomposes them to “fit within his more planimetric and symmetrical style,” using the cinematic reference “to add to the experience of the story” and play with audience expectations. If you’ve seen Torn Curtain, you know how Newman’s character shakes the man tailing him; if you’ve seen The Grand Budapest Hotel, you know it doesn’t work out quite so well for Jeff Goldblum’s character. But only if you’ve seen both films can you appreciate Anderson’s sequence — and indeed, Hitchcock’s original — to the fullest. Even now, those of us excitedly anticipating the October release of Anderson’s latest feature The French Dispatch are speculating about not only which classic films inspired it, but also which classic films it will compel us to revisit and enjoy afresh.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
Courtesy of the Met Museum comes the 1984 documentary, In a Brilliant Light: Van Gogh in Arles, narrated by Edward Herrmann:
Near the end of his life, Vincent van Gogh moved from Paris to the city of Arles in southeastern France, where he experienced the most productive period of his artistic career. During his 444 days there, he completed over two hundred paintings and one hundred drawings inspired by the region’s light, wildlife, and inhabitants. This film presents the stories behind many beloved works alongside beautiful footage of daily life in Provence, as well as glimpses of rarely seen canvases held in private collections.
First came the album and tour in 2018. Then the Broadway show in 2019. And now the latest incarnation of David Byrne’s American Utopia–the concert film directed by Spike Lee. Debuting on HBO Max on October 17th, this Spike Lee joint shows David Byrne “joined by an ensemble of 11 musicians, singers, and dancers from around the globe, inviting audiences into a joyous dreamworld where human connection, self-evolution, and social justice are paramount.” If the movie is anything like the tour, it will be sublime. For now, we’ll whet your appetite with the sneak preview above.
“Though eagerly embraced by a public in love with a handful of memorable images and spellbound by the thought of an artist who would cut off his own ear,” argue Van Gogh biographers Steven Naifeh and Gregory White Smith, “Stone’s suicide yarn was based on bad history, bad psychology, and, as a definitive new expert analysis makes clear, bad forensics.” An expert analysis, you say? Yes: the world’s biggest posthumous art star has become an unsolved mystery, the subject of the BuzzFeed video above, part of a series that also includes Edgar Allan Poe, JFK, Jimmy Hoffa, and Natalie Wood.
Van Gogh’s suicide seems accepted as a fact by the Van Gogh Museum, at least according to its website, a conclusion supposedly evidenced by the morbid gloom of his late letters to his brother. But Van Gogh wrote “not a word about his final days,” Smith and Naifeh point out, and he left behind no suicide note, “odd for a man who churned out letters so profligately.” A piece of writing found on him turned out to be an early draft of his last letter to Theo, which was “upbeat—even ebullient—about the future.” He had every reason to be, given the glowing success of his first show. “He had placed a large order for more paints only a few days before a bullet put a hole in his abdomen.”
The story of how the hole got there involves René Secrétan, the then-16-year-old son of a Paris pharmacist, who cruelly bullied Van Gogh during his 1890 summer in Auvers. (He also sat for some paintings and a drawing.) His involvement explains the “studied silence” the community maintained after Van Gogh’s death. No one mentioned suicide, but no one would say much of anything else either. Secrétan became a wealthy banker and lived to see Kirk Douglas portray the eccentric artist he mocked as “Toto.” He later admitted to owning the gun that killed Van Gogh, but denied firing the shot.
The new evidence surrounding Van Gogh’s possible murder has been in the public eye for several years now, but it hasn’t made much of a dent in the Van Gogh suicide legend. Still, we must admit, that story has always made little sense. Even scholars at the Van Gogh museum privately admitted to the artist’s biographers that they had serious doubts about it. These were dismissed, they claimed, as being “too controversial.” Now that Van Gogh has become a YouTube true crime unsolved mystery, there’s no shutting the door on speculation about his untimely and tragic demise.