Ever wonder what it was like to really fight while wearing a full suit of armor? We’ve featured a few historical reconstructions here on Open Culture, including a demonstration of the various ways combatants would vanquish their foes, up to and including a sword right between the eyes. We’ve also shown you how long it took to create a suit of armor and the clever flexibility built into it. But really, don’t we want to see what it would be like in a full melee? In the above Vice documentary, you can finally sate your bloodlust.
Not that anyone dies in the MMA-like sword-and-chainmail brawls. In these public competitions, the weapons are blunted and contestants fight “not to the death, just until they fall over,” as the narrator somewhat sadly explains. It is just as legitimate a sport as any other fighting challenge, and the injuries are real. There’s no fooling around with these people. They are serious, and a nation’s honor is still at stake.
This mini-doc follows the American team to the International Medieval Combat Federation World Championships in Montemor-o-Velho in Portugal. What looks like a regular Renaissance faire is only the decorations around the main, incredibly violent event. We see battles with longswords, short axes, shields used offensively and defensively, and a lot of pushing and shoving. Contestants go head-to-head, or five against five, or twelve against twelve.
Twenty-six countries take part, and I have to say, for all the jingoistic hoo-hah I try to ignore, the American team’s very nicely designed stars-and-stripes battle gear looked pretty damn cool. The Vice team also discovers an interesting cast of characters, like the Texan who wears his cowboy hat when he’s not wearing his combat helmet; the man who describes his fighting style as “nerd rage”; and the couple on their honeymoon who met while brutally beating each other in an earlier competition. (No, the knights here are not all men.)
There are injuries: sprains, broken bones. There’s also the madness of inhaling too much of your own CO2 inside the helmet, and the smell of ozone when a spark from metal striking metal flies into it.
Thankfully nobody is fighting to the death or for King/Queen and Country. Just for the fun of adrenalin-based competition and bragging rights.
Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.
The practice and privilege of academic science has been slow in trickling down from its origins as a pursuit of leisured gentlemen. While many a leisured lady may have taken an interest in science, math, or philosophy, most women were denied participation in academic institutions and scholarly societies during the scientific revolution of the 1700s. Only a handful of women — seven known in total — were granted doctoral degrees before the year 1800. It wasn’t until 1678 that a female scholar was given the distinction, some four centuries or so after the doctorate came into being. While several intellectuals and even clerics of the time held progressive attitudes about gender and education, they were a decided minority.
Curiously, four of the first seven women to earn doctoral degrees were from Italy, beginning with Elena Cornaro Piscopia at the University of Padua. Next came Laura Bassi, who earned her degree from the University of Bologna in 1732. There she distinguished herself in physics, mathematics, and natural philosophy and became the first salaried woman to teach at a university (she was at one time the university’s highest paid employee). Bassi was the chief popularizer of Newtonian physics in Italy in the 18th century and enjoyed significant support from the Archbishop of Bologna, Prospero Lambertini, who — when he became Pope Benedict XIV — elected her as the 24th member of an elite scientific society called the Benedettini.
“Bassi was widely admired as an excellent experimenter and one of the best teachers of Newtonian physics of her generation,” says Paula Findlen, Stanford professor of history. “She inspired some of the most important male scientists of the next generation while also serving as a public example of a woman shaping the nature of knowledge in an era in which few women could imagine playing such a role.” She also played the role available to most women of the time as a mother of eight and wife of Giuseppe Veratti, also a scientist.
Bassi was not allowed to teach classes of men at the university — only special lectures open to the public. But in 1740, she was granted permission to lecture at her home, and her fame spread, as Findlen writes at Physics World:
Bassi was widely known throughout Europe, and as far away as America, as the woman who understood Newton. The institutional recognition that she received, however, made her the emblematic female scientist of her generation. A university graduate, salaried professor and academician (a member of a prestigious academy), Bassi may well have been the first woman to have embarked upon a full-fledged scientific career.
Poems were written about Bassi’s successes in demonstrating Newtonian optics; “news of her accomplishments traveled far and wide,” reaching the ear of Benjamin Franklin, whose work with electricity Bassi followed keenly. In Bologna, surprise at Bassi’s achievements was tempered by a culture known for “celebrating female success.” Indeed, the city was “jokingly known as a ‘paradise for women,’” writes Findlen. Bassi’s father was determined that she have an education equal to any of her class, and her family inherited money that had been equally divided between daughters and sons for generations; her sons “found themselves heirs to the property that came to the family through Laura’s maternal line,” notes the Stanford University collection of Bassi’s personal papers.
Bassi’s academic work is held at the Academy of Sciences in Bologna. Of the papers that survive, “thirteen are on physics, eleven are on hydraulics, two are on mathematics, one is on mechanics, one is on technology, and one is on chemistry,” writes a University of St Andrews biography. In 1776, a year usually remembered for the formation of a government of leisured men across the Atlantic, Bassi was appointed to the Chair of Experimental Physics at Bologna, an appointment that not only meant her husband became her assistant, but also that she became the “first woman appointed to a chair of physics at any university in the world.”
Bologna was proud of its distinguished daughter, but perhaps still thought of her as an oddity and a token. As Dr. Eleonora Adami notes in a charming biography at Sci-Illustrate Stories, the city once struck a medal in her honor, “commemorating her first lecture series with the phrase ‘Soli cui fas vidisse Minervam,’” which translates roughly to “the only one allowed to see Minerva.” But her example inspired other women, like Cristina Roccati, who earned a doctorate from Bologna in 1750, and Dorothea Erxleben, who became the first woman to earn a doctorate in medicine four years later at the University of Halle. Such singular successes did not change the patriarchal culture of academia, but they started the trickle that would in time become several branching streams of women succeeding in the sciences.
No one living has experienced a viral event the size and scope of COVID-19. Maybe the unprecedented nature of the pandemic explains some of the vaccine resistance. Diseases of such virulence became rare in places with ready access to vaccines, and thus, ironically, over time, have come to seem less dangerous. But there are still many people in wealthy nations who remember polio, an epidemic that dragged on through the first half of the 20th century before Jonas Salk perfected his vaccine in the mid-fifties.
Polio’s devastation has been summed up visually in textbooks and documentaries by the terrifying iron lung, an early ventilator. “At the height of the outbreaks in the late 1940s,” Meilan Solly writes at Smithsonian, “polio paralyzed an average of more than 35,000 people each year,” particularly affecting children, with 3,000 deaths in 1952 alone. “Spread virally, it proved fatal for two out of ten victims afflicted with paralysis. Though millions of parents rushed to inoculate their children following the introduction of Jonas Salk’s vaccine in 1955, teenagers and young adults had proven more reluctant to get the shot.”
At the time, there were no violent, organized protests against the vaccine, nor was resistance framed as a patriotic act of political loyalty. But “cost, apathy and ignorance became serious setbacks to the eradication effort,” says historian Stephen Mawdsley. And, then as now, irresponsible media personalities with large platforms and little knowledge could do a lot of harm to the public’s confidence in life-saving public health measures, as when influential gossip columnist Walter Winchell wrote that the vaccine “may be a killer,” discouraging countless readers from getting a shot.
When Elvis Presley made his first appearance on Ed Sullivan’s show in 1956, “immunization levels among American teens were at an abysmal 0.6 percent,” note Hal Hershfield and Ilana Brody at Scientific American. To counter impressions that the polio vaccine was dangerous, public health officials did not solely rely on getting more and better information to the public; they also took seriously what Hershfield and Brody call the “crucial ingredients inherent to many of the most effective behavioral change campaigns: social influence, social norms and vivid examples.” Satisfying all three, Elvis stepped up and agreed to get vaccinated “in front of millions” backstage before his second appearance on the Sullivan show.
Elvis could not have been more famous, and the campaign was a success for its target audience, establishing a new social norm through influence and example: “Vaccination rates among American youth skyrocketed to 80 percent after just six months.” Despite the threat he supposedly posed to the establishment, Elvis himself was ready to serve the public. “I certainly never wanna do anything,” he said, “that would be a wrong influence.” See in the short video at the top how American public health officials stopped millions of preventable deaths and disabilities by admitting a fact propagandists and advertisers never shy from — humans, on the whole, are easily persuaded by celebrities. Sometimes they can even be persuaded for the good.
The British have a number of sayings that strike listeners of other English-speaking nationalities as odd. “Safe as houses” has always had a curious ring to my American ear, but it turns out to be quite ironic as well: the expression grew popular in the Victorian era, a time when Londoners were as likely to be killed by their own houses as anything else. That, at least, is the impression given by “The Bizarre Ways Victorians Sabotaged Their Own Health & Lives,” the documentary investigation starring historian Suzannah Lipscomb above.
Throughout the second half of the 19th century, many an Englishman would have regarded himself as living at the apex of civilization. He wouldn’t have been wrong, exactly, since that place and time witnessed an unprecedented number of large-scale industrial, scientific, and domestic innovations.
But a little knowledge can be a dangerous thing, and the Victorians’ understanding of their favorite new technologies’ benefits ran considerably ahead of their understanding of the attendant threats. The hazards of the dark satanic mills were comparatively obvious, but even the heights of domestic bliss, as that era conceived of it, could turn deadly.
Speaking with a variety of experts, Lipscomb investigates the dark side of a variety of accoutrements of the Victorian high (or at least comfortably middle-class) life. These harmed not just men but women and children as well: take the breeding-ground of disease that was the infant feeding bottle, or the organ-compressing corset — one of which, adhering to the experiential sensibility of British television, Lipscomb tries on and struggles with herself. Members of the eventual anti-corset revolt included Constance Lloyd, wife of Oscar Wilde, and it is Wilde’s apocryphal final words that come to mind when the video gets into the arsenic content of Victorian wallpaper. “Either that wallpaper goes, or I do,” Wilde is imagined to have said — and as modern science now proves, it could have been more than a matter of taste.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
When Rome conquered Carthage in the Third Punic War (149–146 BC), the Republic renamed the region Africa, for Afri, a word the Berbers used for local people in present-day Tunisia. (The Arabic word for the region was Ifriqiya.) Thereafter, the Roman Empire had a stronghold in North Africa: Carthage, capital of the African Province under Julius and Augustus Caesar and their successors. The province thrived. Second only to the city of Carthage in the region, the city of Thysdrus was an important center of olive oil production and the hometown of Roman Emperor Septimius Severus, who bestowed imperial favor upon it, granting partial Roman citizenship to its inhabitants.
In 238 AD, construction began in Thysdrus on the famed Amphitheater of El Jem, a structure that would rival its largest cousins in Rome. “Designed to seat a whopping crowd of 35,000 people,” writes Atlas Obscura, El Jem was listed as a UNESCO World Heritage site in 1979. Built entirely of stone blocks, the massive theater was “modeled on the Coliseum of Rome,” notes UNESCO, “without being an exact copy of the Flavian construction…. Its facade comprises three levels of arcades of Corinthian or composite style. Inside, the monument has conserved most of the supporting infrastructure for the tiered seating. The wall of the podium, the arena and the underground passages are practically intact.”
Although the small city of El Jem hardly features on tours of the classical past, it was, in the time of the Amphitheater’s construction, a prominent site of struggle for control over the Empire. The year 238 “was particularly tumultuous,” Atlas Obscura explains, due to a “revolt by the population of Thysdrus (El Jem), who opposed the enormous taxation amounts being levied by the Emperor Maximinus’s local procurator.” A riot of 50,000 people led to the accession of Gordian I, who ruled for 21 days during the “Year of the Six Emperors,” when “in just one year, six different people were proclaimed Emperors of Rome.”
From such fraught beginnings, the massive stone structure of the El Jem Amphitheater went on to serve as a fortress during invasions of Vandals and Arabs in the 5th-7th centuries. A thousand years after the Islamic conquest, El Jem became a fortress during the Revolutions of Tunis. Later centuries saw the amphitheater used for saltpetre manufacture, grain storage, and market stalls.
Despite hundreds of years of human activity, in violent upheavals and everyday business, El Jem remains one of the best preserved Roman ruins in the world and one of the largest outdoor theaters ever constructed. More importantly, it marks the site of one of North Africa’s first imperial occupations, one that would designate a region — and eventually a continent with a dizzyingly diverse mix of peoples — as “African.”
The internet as we know it today began with a coffee pot. Despite the ring of exaggeration, that claim isn’t actually so far-fetched. When most of us go online, we expect something new: often not just something new to read, but something new to watch. This, as those of us past a certain age will recall, was not the case with the early World Wide Web, consisting as it mostly did of static pages of text, updated irregularly if at all. Younger readers will have to imagine even that being a cutting-edge thrill, but we didn’t really feel like we were living in the future until the fall of 1993, when XCoffee first went live.
This groundbreaking technological project “started back in the dark days of 1991,” writes co-creator Quentin Stafford-Fraser, “when the World Wide Web was little more than a glint in CERN’s eye.” At the time, Stafford-Fraser was employed as one of fifteen researchers in the “Trojan Room” of the University of Cambridge Computer Lab. “Being poor, impoverished academics, we only had one coffee filter machine between us, which lived in the corridor just outside the Trojan Room. However, being highly dedicated and hard-working academics, we got through a lot of coffee, and when a fresh pot was brewed, it often didn’t last long.”
It occurred to Stafford-Fraser to train an unused video camera from the Trojan Room on the coffee pot (and thus the amount of coffee available within), then connect it to a computer, specifically an Acorn Archimedes. His colleague Paul Jardetzky “wrote a ‘server’ program, which ran on that machine and captured images of the pot every few seconds at various resolutions, and I wrote a ‘client’ program which everybody could run, which connected to the server and displayed an icon-sized image of the pot in the corner of the screen. The image was only updated about three times a minute, but that was fine because the pot filled rather slowly, and it was only greyscale, which was also fine, because so was the coffee.”
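The original XCoffee server and client ran on the Computer Lab’s own machines, and we make no claim to reproduce them here. Purely to illustrate the architecture Stafford-Fraser describes (one process grabbing a frame every few seconds, another polling it a few times a minute and displaying the latest image), below is a minimal sketch in Python. The frame grabber is simulated, and every name in it, from CAPTURE_INTERVAL to FrameHandler, is our own invention rather than anything from the original program.

```python
# A minimal, modern sketch of the Trojan Room idea: a "server" that holds the
# latest snapshot of the pot and a "client" that polls it a few times a minute.
# The camera is simulated; swap in a real frame grabber if you have one.

import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

CAPTURE_INTERVAL = 5    # the server grabs a new "frame" every few seconds
POLL_INTERVAL = 20      # the client checks roughly three times a minute

latest_frame = b"(no frame yet)"

def capture_loop():
    """Pretend to grab a greyscale image of the coffee pot every few seconds."""
    global latest_frame
    while True:
        latest_frame = f"greyscale frame captured at {time.ctime()}".encode()
        time.sleep(CAPTURE_INTERVAL)

class FrameHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve whatever the capture loop stored most recently.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(latest_frame)

    def log_message(self, *args):
        pass  # keep the demo quiet

if __name__ == "__main__":
    threading.Thread(target=capture_loop, daemon=True).start()
    server = HTTPServer(("localhost", 8080), FrameHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # The "client": poll the server and print the tiny status, XCoffee-style.
    for _ in range(3):
        time.sleep(POLL_INTERVAL)
        with urlopen("http://localhost:8080/") as resp:
            print("coffee pot status:", resp.read().decode())
```

Run it and the “client” loop reports a fresh pot status roughly three times a minute, which is about as often as the original image updated.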
XCoffee, the resulting program, was meant only to provide this much-needed information to Computer Lab members elsewhere in the building. But after the release of image-displaying web browsers in 1993, it found a much wider audience as the world’s first streaming webcam. Stafford-Fraser’s successors “resurrected the system, treated it to a new frame grabber, and made the images available on the World Wide Web. Since then, hundreds of thousands of people have looked at the coffee pot, making it undoubtedly the most famous in the world.” Stafford-Fraser wrote these words in 1995; in the years thereafter XCoffee went on to receive millions of views before its eventual shutdown in 2001.
In the Centre for Computing History video above, Stafford-Fraser shows the very Olivetti camera he originally used to monitor the coffee level. (He’d previously worked at the Olivetti Research Laboratory, whose parent company also owned Acorn Computers.) “We could see things at a distance before,” he says. “We could view television programs, we could look through telescopes.” But only after the Trojan Room’s coffee pot hit the internet could we “see what’s happening now, somewhere else in the world,” on demand. Thirty years after XCoffee’s development, we’re mesmerized by live-streaming stars and surrounded by “smart” home appliances, hoping for nothing so much as a way to concentrate on our immediate surroundings again — to wake up, if you like, and smell the coffee.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
But we here at Open Culture think his greatest gift to home viewers is his Art History School profiles of well-known artists like Henri de Toulouse-Lautrec and Vincent Van Gogh.
An avid storyteller, he’s drawn to artists with tragic histories. His decision to pivot from impersonating the artist, as he did with Van Gogh, to serving as a reporter interested in how such details as syphilis and alcoholism informed lives and careers is a wise one.
Priestly makes a convincing case that Lautrec’s aristocratic upbringing contributed to his misery. His short stature was the result not of dwarfism but of pycnodysostosis (PYCD), a rare bone-weakening disease that surely owed something to his parents’ status as first cousins.
His appearance made him a subject of lifelong mockery, and ensured that the freewheeling artist scene in Montmartre would prove more welcoming than the blueblood milieu into which he’d been born.
Priestly makes a meal of that demi-monde, introducing viewers to many of the players.
He heightens our appreciation for Lautrec’s masterpiece, At the Moulin Rouge, by briefly orienting us to who’s seated around the table: writer and critic Édouard Dujardin, dancer La Macarona, photographer Paul Sescau, and “champagne salesman and debauchee” Maurice Guibert, who earlier posed as a lecherous patron in Lautrec’s At the Café La Mie.
Queen of the Cancan La Goulue hangs out in the background with another dancer, the wonderfully named La Môme Fromage.
Lautrec places himself squarely in the mix, looking very much at home.
Consider that these names, like those of frequent Lautrec subjects acrobatic dancer Jane Avril and chanteuse Yvette Guilbert, were as celebrated in Belle Epoque Montmartre as many of the painters Lautrec rubbed shoulders with — Degas, Pissarro, Cézanne, Van Gogh and Manet.
In an article in The Smithsonian, Paul Trachtman recounts how Lautrec discovered the model for Manet’s famous nude Olympia, Victorine Meurent, “living in abject poverty in a top-floor apartment down a Montmartre alley. She was now an old, wrinkled, balding woman. Lautrec called on her often, and took his friends along, presenting her with gifts of chocolate and flowers — as if courting death itself.”
Meanwhile Degas sniffed that Lautrec’s studies of women in a brothel “stank of syphilis.”
Perhaps Priestly will delve into Degas for an upcoming Art History School episode … there’s no shortage of material there.
Above are three more of Paul Priestly’s Art History School profiles that we’ve enjoyed, on Francis Bacon, Edvard Munch and Gustav Klimt. You can subscribe to his channel here.
Like most renowned abstract painters, Wassily Kandinsky could also paint realistically. Unlike most renowned abstract painters, he only took up art in earnest after studying economics and law at the University of Moscow. He then found early success teaching those subjects, which seem to have proven too worldly for his sensibilities: at age 30 he enrolled in the Munich Academy to continue the study of art that he’d left off while growing up in Odessa. The surviving paintings he produced at the end of the 19th century and the beginning of the 20th, displayed on Wikipedia’s list of his works, include a variety of landscapes, most presenting German and Russian (or, today, Ukrainian) scenes undisturbed by a single human figure.
Dramatic change came with 1903’s The Blue Rider (above). The presence of the titular figure made for an obvious difference from so many of the images he’d created over the previous half-decade; a shift in its very perception of reality made for a less obvious one.
This is not the world as we normally see it, and Kandinsky’s track record of highly representational paintings tells us that he must deliberately have chosen to paint it that way. With fellow artists like August Macke, Franz Marc, Albert Bloch, and Gabriele Münter, he went on to form the Blue Rider Group, whose publications argued for abstract art’s capability to attain great spiritual heights, especially through color.
“Gradually Kandinsky makes departures from the external ‘world as a model’ into the world of ‘paint as a thing in itself,’ ” writes painter Markus Ray. “Still depicting ‘worldly scenes,’ these paintings start to take on purer colors and shapes. He reduces volumes into simple shapes, and colors into bright and vibrant hues. One can still make out the scene, but the shapes and colors begin to take on a life of their own.” This is especially true of the scenes Kandinsky painted in Bavaria, such as 1909’s Railway near Murnau above. The outbreak of World War I five years later sent him back to Russia, where he continued his pioneering journey toward a visual art equal in expressive power to music, which he called his “ultimate teacher.” But by the early 1920s it had become clear that his increasingly individualistic and non-representational tendencies wouldn’t sit well with the Soviet cultural powers that be.
A return to Germany was in order. “In 1921, at the age of 55, Kandinsky moved to Weimar to teach mural painting and introductory analytical drawing at the newly founded Bauhaus school,” says Christie’s. “There he worked alongside the likes of Paul Klee, László Moholy-Nagy and Josef Albers,” and also expanded on Goethe’s theories of color. A true believer in the Bauhaus’ “philosophy of social improvement through art,” Kandinsky also wound up among the artists whose work was exhibited in the Nazi Party’s “Degenerate Art Exhibition” of 1937. By that time the Bauhaus had been dissolved and Kandinsky had resettled in Paris, where until his death in 1944 (as evidenced by Wikipedia’s list of his paintings) he kept pushing further into abstraction, seeking ever-purer expressions of the human soul until the very end.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.