How Italian Physicist Laura Bassi Became the First Woman to Have an Academic Career in the 18th Century

The practice and privilege of academic science have been slow to trickle down from their origins as a pursuit of leisured gentlemen. While many a leisured lady may have taken an interest in science, math, or philosophy, most women were denied participation in academic institutions and scholarly societies during the scientific revolution. Only a handful of women — seven known in total — were granted doctoral degrees before the year 1800. It wasn’t until 1678 that a female scholar was given the distinction, some four centuries after the doctorate came into being. While several intellectuals and even clerics of the time held progressive attitudes about gender and education, they were a decided minority.

Curiously, four of the first seven women to earn doctoral degrees were from Italy, beginning with Elena Cornaro Piscopia at the University of Padua. Next came Laura Bassi, who earned her degree from the University of Bologna in 1732. There she distinguished herself in physics, mathematics, and natural philosophy and became the first salaried woman to teach at a university (she was at one time the university’s highest paid employee). Bassi was the chief popularizer of Newtonian physics in Italy in the 18th century and enjoyed significant support from the Archbishop of Bologna, Prospero Lambertini, who — when he became Pope Benedict XIV — elected her as the 24th member of an elite scientific society called the Benedettini.

“Bassi was widely admired as an excellent experimenter and one of the best teachers of Newtonian physics of her generation,” says Paula Findlen, Stanford professor of history. “She inspired some of the most important male scientists of the next generation while also serving as a public example of a woman shaping the nature of knowledge in an era in which few women could imagine playing such a role.” She also played the role available to most women of the time, as a mother of eight and the wife of Giuseppe Veratti, himself a scientist.

Bassi was not allowed to teach classes of men at the university — only special lectures open to the public. But in 1740, she was granted permission to lecture at her home, and her fame spread, as Findlen writes at Physics World:

 Bassi was widely known throughout Europe, and as far away as America, as the woman who understood Newton. The institutional recognition that she received, however, made her the emblematic female scientist of her generation. A university graduate, salaried professor and academician (a member of a prestigious academy), Bassi may well have been the first woman to have embarked upon a full-fledged scientific career.

Poems were written about Bassi’s successes in demonstrating Newtonian optics; “news of her accomplishments traveled far and wide,” reaching the ear of Benjamin Franklin, whose work with electricity Bassi followed keenly. In Bologna, surprise at Bassi’s achievements was tempered by a culture known for “celebrating female success.” Indeed, the city was “jokingly known as a ‘paradise for women,’” writes Findlen. Bassi’s father was determined that she have an education equal to any of her class, and her family inherited money that had been equally divided between daughters and sons for generations; her sons “found themselves heirs to the property that came to the family through Laura’s maternal line,” notes the Stanford University collection of Bassi’s personal papers.

Bassi’s academic work is held at the Academy of Sciences in Bologna. Of the papers that survive, “thirteen are on physics, eleven are on hydraulics, two are on mathematics, one is on mechanics, one is on technology, and one is on chemistry,” writes a University of St Andrews biography. In 1776, a year usually remembered for the formation of a government of leisured men across the Atlantic, Bassi was appointed to the Chair of Experimental Physics at Bologna, an appointment that not only meant her husband became her assistant, but also that she became the “first woman appointed to a chair of physics at any university in the world.”

Bologna was proud of its distinguished daughter, but perhaps still thought of her as an oddity and a token. As Dr. Eleonora Adami notes in a charming biography at sci-fi illustrated stories, the city once struck a medal in her honor, “commemorating her first lecture series with the phrase ‘Soli cui fas vidisse Minervam,’” which translates roughly to “the only one allowed to see Minerva.” But her example inspired other women, like Cristina Roccati, who earned a doctorate from Bologna in 1750, and Dorothea Erxleben, who became the first woman to earn a Doctorate in Medicine four years later at the University of Halle. Such singular successes did not change the patriarchal culture of academia, but they started the trickle that would in time become several branching streams of women succeeding in the sciences.

Related Content: 

Marie Curie Became the First Woman to Win a Nobel Prize, the First Person to Win Twice, and the Only Person in History to Win in Two Different Sciences

Jocelyn Bell Burnell Changed Astronomy Forever; Her Ph.D. Advisor Won the Nobel Prize for It

Women Scientists Launch a Database Featuring the Work of 9,000 Women Working in the Sciences

“The Matilda Effect”: How Pioneering Women Scientists Have Been Denied Recognition and Written Out of Science History

The Little-Known Female Scientists Who Mapped 400,000 Stars Over a Century Ago: An Introduction to the “Harvard Computers”

Real Women Talk About Their Careers in Science

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Alice in Wonderland Syndrome: The Real Perceptual Disorder That May Have Shaped Lewis Carroll’s Creative World

Alice’s Adventures in Wonderland isn’t just a beloved children’s story: it’s also a neuropsychological syndrome. Or rather, the words “Alice in Wonderland,” as Lewis Carroll’s book is commonly known, have also become attached to a condition that, though not harmful in itself, causes distortions in the sufferer’s perception of reality. Other names include dysmetropsia or Todd’s syndrome, the latter of which pays tribute to the consultant psychiatrist John Todd, who defined the disorder in 1955. He described his patients as seeing some objects as much larger than they really were and other objects as much smaller, resulting in challenges not entirely unlike those faced by Alice when put by Carroll through her growing-and-shrinking paces.

Todd also suggested that Carroll had written from experience, drawing inspiration from the hallucinations he experienced when afflicted with what he called “bilious headache.”  The transformations Alice feels herself undergoing after she drinks from the “DRINK ME” bottle and eats the “EAT ME” cake are now known, in the neuropsychological literature, as macropsia and micropsia.

“I was in the kitchen talking to my wife,” writes novelist Craig Russell of one of his own bouts of the latter. “I was hugely animated and full of energy, having just put three days’ worth of writing on the page in one morning and was bursting with ideas for new books. Then, quite calmly, I explained to my wife that half her face had disappeared. As I looked around me, bits of the world were missing too.”

Though “many have speculated that Lewis Carroll took some kind of mind-altering drug and based the Alice books on his hallucinatory experiences,” writes Russell, “the truth is that he too suffered from the condition, but in a more severe and protracted way,” combined with ocular migraine. Russell also notes that the sci-fi visionary Philip K. Dick, though “never diagnosed as suffering from migrainous aura or temporal lobe epilepsy,” left behind a body of work that has given rise to “a growing belief that the experiences he described were attributable to the latter, particularly.” Suitably, classic Alice in Wonderland syndrome “tends to be much more common in childhood” and disappear in maturity. One sufferer documented in the scientific literature is just six years old, younger even than Carroll’s eternal little girl — presumably, an eternal seer of reality in her own way.

Related Content:

A Beautiful 1870 Visualization of the Hallucinations That Come Before a Migraine

Behold Lewis Carroll’s Original Handwritten & Illustrated Manuscript for Alice’s Adventures in Wonderland (1864)

Lewis Carroll’s Photographs of Alice Liddell, the Inspiration for Alice in Wonderland

Ralph Steadman’s Warped Illustrations of Alice’s Adventures in Wonderland on the Story’s 150th Anniversary

Alice’s Adventures in Wonderland, Illustrated by Salvador Dalí in 1969, Finally Gets Reissued

Curious Alice — The 1971 Anti-Drug Movie Based on Alice in Wonderland That Made Drugs Look Like Fun

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

The Strange Magic of Jimi Hendrix’s “Voodoo Chile”

Poor Polyphonic. He was just about to deliver another perfectly mixed treatise on a classic rock magnum opus when the YouTube algorithm and the Jimi Hendrix Estate stepped in to stop him before publishing. So while you can watch this real-time explication of Hendrix’s more-than-just-a-jam “Voodoo Chile” with just the graphics and the narration, you should cue up the 15-minute track however you can (for example on Spotify), and then press play when the video gives the signal. (This might be the first YouTube explainer video to ask for copyright-skirting help.)

And anyway, you should have a copy of Electric Ladyland, right? It’s the one where Hendrix and the Experience really push all the boundaries, taking rock, blues, jazz, psychedelia, sci-fi, everything…all out as far as possible in the studio. It’s the one that introduced future members of the Band of Gypsys. And it’s the one that hints at everything that might have been, if Hendrix hadn’t passed away soon after.

Now, classic rock radio usually plays the much shorter and less laid-back “Voodoo Child (Slight Return)” that closes the album. But this essay is about the longest track on Electric Ladyland, the one that ends side one. This is the track that Hendrix wanted to sound like a late-night jam at the New York club The Scene—and which he recorded after one particular night doing just that. He taped the audience effects soon after. Steve Winwood is on keyboards. Jack Casady from Jefferson Airplane plays bass. And Mitch Mitchell turns in one of his greatest performances and solos.

In the lyrics, Polyphonic notes, Hendrix connects the blues to his Cherokee heritage and to voodoo, to sex, and then beyond into science fiction landscapes. The song is a self-portrait, showing the past, the influence, the training, and then the potential that music, magic, and (let’s face it) LSD could bring. The band is vibing. Winwood drops riffs that are more British folk than Chicago blues. Hendrix strays far beyond the orbit of blues, swings past it one more time on his own slight return, and then explodes into stardust.

Polyphonic’s video also looks beautiful and perfectly intersperses his critique with the song’s main sections. It may have sounded like a jam, but Hendrix carefully designed it to flow the way it does. And Polyphonic follows suit. It is a highly enjoyable walk through a track (again find it on Spotify here) many already know, reawakening a sense of wonder about all its inherent, strange genius.

Related Content:

How Science Fiction Formed Jimi Hendrix

Jimi Hendrix’s Home Audio System & Record Collection Gets Recreated in His London Flat

Behold Moebius’ Many Psychedelic Illustrations of Jimi Hendrix

Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.

Elvis Presley Gets the Polio Vaccine on The Ed Sullivan Show, Persuading Millions to Get Vaccinated (1956)

No one living has experienced a viral event the size and scope of COVID-19. Maybe the unprecedented nature of the pandemic explains some of the vaccine resistance. Diseases of such virulence became rare in places with ready access to vaccines, and thus, ironically, over time, have come to seem less dangerous. But there are still many people in wealthy nations who remember polio, an epidemic that dragged on through the first half of the 20th century before Jonas Salk perfected his vaccine in the mid-fifties.

Polio’s devastation has been summed up visually in textbooks and documentaries by the terrifying iron lung, an early ventilator. “At the height of the outbreaks in the late 1940s,” Meilan Solly writes at Smithsonian, “polio paralyzed an average of more than 35,000 people each year,” particularly affecting children, with 3,000 deaths in 1952 alone. “Spread virally, it proved fatal for two out of ten victims afflicted with paralysis. Though millions of parents rushed to inoculate their children following the introduction of Jonas Salk’s vaccine in 1955, teenagers and young adults had proven more reluctant to get the shot.”

At the time, there were no violent, organized protests against the vaccine, nor was resistance framed as a patriotic act of political loyalty. But “cost, apathy and ignorance became serious setbacks to the eradication effort,” says historian Stephen Mawdsley. And, then as now, irresponsible media personalities with large platforms and little knowledge could do a lot of harm to the public’s confidence in life-saving public health measures, as when influential gossip columnist Walter Winchell wrote that the vaccine “may be a killer,” discouraging countless readers from getting a shot.

When Elvis Presley made his first appearance on Ed Sullivan’s show in 1956, “immunization levels among American teens were at an abysmal 0.6 percent,” note Hal Hershfield and Ilana Brody at Scientific American. To counter impressions that the polio vaccine was dangerous, public health officials did not solely rely on getting more and better information to the public; they also took seriously what Hershfield and Brody call the “crucial ingredients inherent to many of the most effective behavioral change campaigns: social influence, social norms and vivid examples.” Satisfying all three, Elvis stepped up and agreed to get vaccinated “in front of millions” backstage before his second appearance on the Sullivan show.

Elvis could not have been more famous, and the campaign was a success for its target audience, establishing a new social norm through influence and example: “Vaccination rates among American youth skyrocketed to 80 percent after just six months.” Despite the threat he supposedly posed to the establishment, Elvis himself was ready to serve the public. “I certainly never wanna do anything,” he said, “that would be a wrong influence.” See in the short video at the top how American public health officials stopped millions of preventable deaths and disabilities by admitting a fact propagandists and advertisers never shy from — humans, on the whole, are easily persuaded by celebrities. Sometimes they can even be persuaded for the good.

Related Content: 

Yo-Yo Ma Plays an Impromptu Performance in Vaccine Clinic After Receiving 2nd Dose

Dying in the Name of Vaccine Freedom

How Do Vaccines (Including the COVID-19 Vaccines) Work?: Watch Animated Introductions

How Vaccines Improved Our World In One Graphic

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

How Victorian Homes Turned Deadly: Exploding Stoves, Poison Wallpaper, Ever-Tighter Corsets & More

The British have a number of sayings that strike listeners of other English-speaking nationalities as odd. “Safe as houses” has always had a curious ring to my American ear, but it turns out to be quite ironic as well: the expression grew popular in the Victorian era, a time when Londoners were as likely to be killed by their own houses as anything else. That, at least, is the impression given by “The Bizarre Ways Victorians Sabotaged Their Own Health & Lives,” the documentary investigation starring historian Suzannah Lipscomb above.

Throughout the second half of the 19th century, many an Englishman would have regarded himself as living at the apex of civilization. He wouldn’t have been wrong, exactly, since that place and time witnessed an unprecedented number of large-scale innovations industrial, scientific, and domestic.

But a little knowledge can be a dangerous thing, and the Victorians’ understanding of their favorite new technologies’ benefits ran considerably ahead of their understanding of the attendant threats. The hazards of the dark satanic mills were comparatively obvious, but even the heights of domestic bliss, as that era conceived of it, could turn deadly.

Speaking with a variety of experts, Lipscomb investigates the dark side of a variety of accoutrements of the Victorian high (or at least comfortably middle-class) life. These harmed not just men but women and children as well: take the breeding-ground of disease that was the infant feeding bottle, or the organ-compressing corset — one of which, adhering to the experiential sensibility of British television, Lipscomb tries on and struggles with herself. Members of the eventual anti-corset revolt included Constance Lloyd, wife of Oscar Wilde, and it is Wilde’s apocryphal final words that come to mind when the video gets into the arsenic content of Victorian wallpaper. “Either that wallpaper goes, or I do,” Wilde is imagined to have said — and as modern science now proves, it could have been more than a matter of taste.

Related Content:

A 108-Year-Old Woman Recalls What It Was Like to Be a Woman in Victorian England

The Color That May Have Killed Napoleon: Scheele’s Green

The 1855 Map That Revolutionized Disease Prevention & Data Visualization: Discover John Snow’s Broad Street Pump Map

Hand-Colored Maps of Wealth & Poverty in Victorian London: Explore a New Interactive Edition of Charles Booth’s Historic Work of Social Cartography (1889)

Poignant and Unsettling Post-Mortem Family Portraits from the 19th Century

Behold the Steampunk Home Exercise Machines from the Victorian Age

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

The Roman Colosseum Has a Twin in Tunisia: Discover the Amphitheater of El Jem, One of the Best-Preserved Roman Ruins in the World

Image via Wikimedia Commons

When Rome conquered Carthage in the Third Punic War (149-146 BC), the Republic renamed the region Africa, for Afri, a word the Berbers used for local people in present-day Tunisia. (The Arabic word for the region was Ifriqiya.) Thereafter the Roman Empire had a stronghold in North Africa: Carthage, the capital of the African Province under Julius and Augustus Caesar and their successors. The province thrived. Second only to the city of Carthage in the region, the city of Thysdrus was an important center of olive oil production and the hometown of Roman Emperor Septimius Severus, who bestowed imperial favor upon it, granting partial Roman citizenship to its inhabitants.

In 238 AD, construction began in Thysdrus on the famed Amphitheater of El Jem, which would rival its largest cousins in Rome. “Designed to seat a whopping crowd of 35,000 people,” writes Atlas Obscura, El Jem was listed as a UNESCO World Heritage site in 1979. Built entirely of stone blocks, the massive theater was “modeled on the Coliseum of Rome,” notes UNESCO, “without being an exact copy of the Flavian construction…. Its facade comprises three levels of arcades of Corinthian or composite style. Inside, the monument has conserved most of the supporting infrastructure for the tiered seating. The wall of the podium, the arena and the underground passages are practically intact.”

Image via Wikimedia Commons


Although the small city of El Jem hardly features on tours of the classical past, it was, in the time of the Amphitheater’s construction, a prominent site of struggle for control over the Empire. The year 238 “was particularly tumultuous,” Atlas Obscura explains, due to a “revolt by the population of Thysdrus (El Jem), who opposed the enormous taxation amounts being levied by the Emperor Maximinus’s local procurator.” A riot of 50,000 people led to the ascension of Gordian I, who ruled for 21 days during the “Year of the Six Emperors,” when “in just one year, six different people were proclaimed Emperors of Rome.”

Image via Wikimedia Commons

From such fraught beginnings, the massive stone structure of the El Jem Amphitheater went on to serve as a fortress during invasions of Vandals and Arabs in the 5th-7th centuries. A thousand years after the Islamic conquest, El Jem became a fortress during the Revolutions of Tunis. Later centuries saw the amphitheater used for saltpetre manufacture, grain storage, and market stalls.

Despite hundreds of years of human activity, in violent upheavals and everyday business, El Jem remains one of the best-preserved Roman ruins in the world and one of the largest outdoor theaters ever constructed. More importantly, it marks the site of one of North Africa’s first imperial occupations, one that would designate a region — and eventually a continent with a dizzyingly diverse mix of peoples — as “African.”

via @WassilDZ

Related Content: 

Explore the Ruins of Timgad, the “African Pompeii” Excavated from the Sands of Algeria

Archaeologists Discover an Ancient Roman Snack Bar in the Ruins of Pompeii

A Virtual Tour of Ancient Rome, Circa 320 CE: Explore Stunning Recreations of The Forum, Colosseum and Other Monuments

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

King Arthur in Film: Our Most Enduring Popular Entertainment Franchise? Pretty Much Pop: A Culture Podcast #104

With the recent theatrical release of The Green Knight, your Pretty Much Pop host Mark Linsenmayer, returning host Brian Hirt, plus Den of Geek’s David Crow and the very British Al Baker consider the range of cinematic Arthuriana, including Excalibur (1981), Camelot (1967), King Arthur (2004), King Arthur: Legend of the Sword (2017), First Knight (1995), Sword of the Valiant (1983), Sir Gawain and the Green Knight (1973), and Monty Python and the Holy Grail (1975).

Arthuriana encompasses numerous (sometimes contradictory) stories that accrued and evolved for nearly 1,000 years after the probable existence of the unknown person who was the historical source for the character. That tradition produced the anonymous 14th-century poem Sir Gawain and the Green Knight, and then, in the 15th century, Sir Thomas Malory’s Le Morte d’Arthur, which provided the template for well-known modern retellings like T.H. White’s The Once and Future King (1958).

The length and complexity of this mythology make a single film problematic, with most settling on the love triangle between Arthur, Lancelot, and Guinevere leading to Camelot’s downfall. Multiple TV treatments have tried to do it justice, and if Guy Ritchie’s King Arthur: Legend of the Sword had been a box office success, then we’d currently be seeing multiple films in an Arthurian cinematic universe. By picking a smaller story and not trying too hard to tie it to King Arthur (who appears but is not named), The Green Knight is able to be more creative in painting and updating the strange story of Sir Gawain, whose previous cinematic outings (including Sword of the Valiant, where Sean Connery played the Green Knight) involved him in a series of nonsensical adventures far removed from the events told in the original poem.

We talk through characterization in a mythic story, stylizing the epic (how much violence? how weird?), its status as public domain material (like Robin Hood and Sherlock Holmes), and the moral lesson of the original Gawain poem and what director David Lowery did with that for the new film. Is the new film actually enjoyable, or just carefully thought through and artfully shot? Note that we don’t spoil anything significant about The Green Knight until the last ten minutes, so it’s fine if you haven’t seen it (Al hadn’t either).

Here are some articles by David Crow on our topic:

Other articles we used to prep for this included:

The YouTube versions of the source material that Mark listened to are here and here, and the relevant Great Courses offering is here.

This episode includes bonus discussion you can access by supporting the podcast. This podcast is part of the Partially Examined Life podcast network.

Pretty Much Pop: A Culture Podcast is the first podcast curated by Open Culture. Browse all Pretty Much Pop posts.

The Very First Webcam Was Invented to Keep an Eye on a Coffee Pot at Cambridge University

The internet as we know it today began with a coffee pot. Despite the ring of exaggeration, that claim isn’t actually so far-fetched. When most of us go online, we expect something new: often not just something new to read, but something new to watch. This, as those of us past a certain age will recall, was not the case with the early World Wide Web, consisting as it mostly did of static pages of text, updated irregularly if at all. Younger readers will have to imagine even that being a cutting-edge thrill, but we didn’t really feel like we were living in the future until the fall of 1993, when XCoffee first went live.

This groundbreaking technological project “started back in the dark days of 1991,” writes co-creator Quentin Stafford-Fraser, “when the World Wide Web was little more than a glint in CERN’s eye.” At the time, Stafford-Fraser was employed as one of fifteen researchers in the “Trojan Room” of the University of Cambridge Computer Lab. “Being poor, impoverished academics, we only had one coffee filter machine between us, which lived in the corridor just outside the Trojan Room. However, being highly dedicated and hard-working academics, we got through a lot of coffee, and when a fresh pot was brewed, it often didn’t last long.”

It occurred to Stafford-Fraser to train an unused video camera from the Trojan Room on the coffee pot (and thus the amount of coffee available within), then connect it to a computer, specifically an Acorn Archimedes. His colleague Paul Jardetzky “wrote a ‘server’ program, which ran on that machine and captured images of the pot every few seconds at various resolutions, and I wrote a ‘client’ program which everybody could run, which connected to the server and displayed an icon-sized image of the pot in the corner of the screen. The image was only updated about three times a minute, but that was fine because the pot filled rather slowly, and it was only greyscale, which was also fine, because so was the coffee.”
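The architecture described here (a server holding the most recent captured frame, and a client that polls it every few seconds) can be sketched with a simple length-prefixed socket protocol. The Python below is a loose modern reconstruction for illustration only, not the original Acorn code: the class name, the wire format, and the placeholder 8×8 greyscale frame are all inventions.

```python
import socket
import struct
import threading

class PotCameraServer:
    """Holds the most recent frame and serves it to any client that connects."""

    def __init__(self, host="127.0.0.1", port=0):
        self.latest_frame = b"\x00" * 64  # placeholder 8x8 greyscale image
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.bind((host, port))
        self.sock.listen()
        self.port = self.sock.getsockname()[1]
        threading.Thread(target=self._serve, daemon=True).start()

    def capture(self, frame: bytes):
        """Called by the frame grabber 'every few seconds' with a new image."""
        self.latest_frame = frame

    def _serve(self):
        # One length-prefixed frame per connection, then hang up.
        while True:
            conn, _ = self.sock.accept()
            with conn:
                frame = self.latest_frame
                conn.sendall(struct.pack("!I", len(frame)) + frame)

def _recv_exact(conn, n):
    """Read exactly n bytes from a socket (recv may return fewer)."""
    data = b""
    while len(data) < n:
        chunk = conn.recv(n - len(data))
        if not chunk:
            raise ConnectionError("server closed early")
        data += chunk
    return data

def poll_frame(port: int) -> bytes:
    """Client side: connect, read one length-prefixed frame, return it."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        (length,) = struct.unpack("!I", _recv_exact(conn, 4))
        return _recv_exact(conn, length)

if __name__ == "__main__":
    server = PotCameraServer()
    server.capture(bytes(range(64)))  # pretend the grabber saw a fresh pot
    print(len(poll_frame(server.port)))  # prints 64: the client sees the frame
```

The real client redrew an icon-sized image only about three times a minute; a periodic call to something like `poll_frame` stands in for one of those refreshes.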

XCoffee, the resulting program, was meant only to provide this much-needed information to Computer Lab members elsewhere in the building. But after the release of image-displaying web browsers in 1993, it found a much wider audience as the world’s first streaming webcam. Stafford-Fraser’s successors “resurrected the system, treated it to a new frame grabber, and made the images available on the World Wide Web. Since then, hundreds of thousands of people have looked at the coffee pot, making it undoubtedly the most famous in the world.” Stafford-Fraser wrote these words in 1995; in the years thereafter XCoffee went on to receive millions of views before its eventual shutdown in 2001.

In the Centre for Computing History video above, Stafford-Fraser shows the very Olivetti camera he originally used to monitor the coffee level. (He’d previously worked at the Olivetti Research Laboratory, whose parent company also owned Acorn Computers.) “We could see things at a distance before,” he says. “We could view television programs, we could look through telescopes.” But only after the Trojan Room’s coffee pot hit the internet could we “see what’s happening now, somewhere else in the world,” on demand. Thirty years after XCoffee’s development, we’re mesmerized by live-streaming stars and surrounded by “smart” home appliances, hoping for nothing so much as a way to concentrate on our immediate surroundings again — to wake up, if you like, and smell the coffee.

via BoingBoing

Related Content:

See Web Cams of Surreally Empty City Streets in Venice, New York, London & Beyond

Sci-Fi “Portal” Connects Citizens of Lublin & Vilnius, Allowing Passersby Separated by 376 Miles to Interact in Real Time

George Orwell Predicted Cameras Would Watch Us in Our Homes; He Never Imagined We’d Gladly Buy and Install Them Ourselves

The Coffee Pot That Fueled Honoré de Balzac’s Coffee Addiction

The Hertella Coffee Machine Mounted on a Volkswagen Dashboard (1959): The Most European Car Accessory Ever Made

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.

What Makes Basquiat’s Untitled Great Art: One Painting Says Everything Basquiat Wanted to Say About America, Art & Being Black in Both Worlds

They wouldn’t have let Jean-Michel into a Tiffany’s if he wanted to use the bathroom or if he went to buy an engagement ring and pulled a wad of cash out of his pocket. 

— Stephen Torton, Jean-Michel Basquiat’s studio assistant

When Jean-Michel Basquiat’s Untitled (Skull) sold for $110.5 million in 2017 to Japanese billionaire Yusaku Maezawa, the artist joined the ranks of Da Vinci, De Kooning, and Picasso as one of the top-selling painters in the world, surpassing a previous record set in 2013 by his mentor Andy Warhol’s work. Untitled dates from 1982, during “the young Basquiat’s mercurial early years,” writes Ben Davis at Artnet, “even before his first gallery show at Annina Nosei, when he was still a Caribbean-American kid from Brooklyn energetically bootstrapping himself into the limelight of the downtown art scene.” It is this period that most interests collectors like Maezawa.

Basquiat’s transition from graffiti artist to art world darling was dramatic, celebratory, and self-destructive, all characteristics of his work. But critical primitivism reduced him to a token — an art world attitude saw Basquiats as objects to be stripped of context, turned into decorative badges of authenticity and worldliness. “Maezawa’s head painting possesses a loud, gnashing, and confident aura,” Shannon Lee writes at Artsy. But the artist’s “use of skulls… is deeply rooted in his identity as a Black artist in America. They are strongly evocative of African masks, which have been so fetishized by the art market since modernists like Picasso appropriated them from their native contexts.”

But head/skull motifs in Basquiat’s work are not only statements of diasporic Black identity — they emerge through his thematic play of human embodiment, mental illness/health, the competitions of the graffiti world and the headgames of the art world, which Basquiat both mastered and critiqued as a canny outsider. “No subject is more powerful or more sought after in the oeuvre of Jean-Michel Basquiat,” notes Christie’s New York, “than the singular skull.” Though maybe not the most reproduced of Basquiat’s heads, 1982’s Untitled — argues the Great Art Explained video above — exemplifies the themes.

At only 22 years old, Basquiat produced “a single painting” that said “everything he wanted to say about America, about art and about being black in both worlds.” So singular is Untitled that it became its own one-painting show in 2018 when its new owner sent it on a tour of the world, beginning in the artist’s hometown at the Brooklyn Museum. Maezawa’s decision to share the painting presents a contrast to the way Basquiat has been treated differently by other owners of his work like Tiffany & Co., who explain their purchase and recent, controversial commercial use of his Equals Pi by citing his “affinity for the company’s statement blue color,” writes Tirhakah Love at Daily Beast — a color they trademarked ten years after Basquiat’s death.

The proprietary co-optation of Basquiat’s life and work to sell symbols of colonialism like diamonds, among other luxury goods — and the turning of his work into the ultimate luxury good — debases his purposes. Why show Equals Pi “as a prop to an ad?” asked his friend and former roommate Alexis Adler. “Loan it out to a museum. In a time where there were very few Black artists represented in Western museums, that was his goal: to get to a museum.” Find out in the Great Art Explained video how one of his most famous — and most expensive — works encapsulates that struggle through its vivid color and symbolic visual language.

Related Content: 

Take a Close Look at Basquiat’s Revolutionary Art in a New 500-Page, 14-Pound, Large Format Book by Taschen

The Story of Jean-Michel Basquiat’s Rise in the 1980s Art World Gets Told in a New Graphic Novel

An Animated Introduction to the Chaotic Brilliance of Jean-Michel Basquiat: From Homeless Graffiti Artist to Internationally Renowned Painter

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness


Zoom Into a Super High Resolution Photo of Van Gogh’s “The Starry Night”

“Just as we take the train to get to Tarascon or Rouen, we take death to reach a star,” Vincent Van Gogh wrote to his brother from Arles in the summer of 1888:

What’s certainly true in this argument is that while alive, we cannot go to a star, any more than once dead we’d be able to take the train.

The following summer, as a patient in the asylum of Saint-Paul-de-Mausole in Provence, he painted what would become his best known work — The Starry Night.

The summer after that, he was dead of a gunshot wound to the abdomen, commonly believed to be self-inflicted.

Judging from thoughts expressed in that same letter, Van Gogh may have conceived of such a death as a “celestial means of locomotion, just as steamboats, omnibuses and the railway are terrestrial ones”:

To die peacefully in old age would be to go there on foot.

Although his window at the asylum afforded him a sunrise view, and a private audience with the prominent morning star he mentioned in another letter to Theo, Starry Night’s vista is “both an exercise in observation and a clear departure from it,” according to 2019’s MoMA Highlights: 375 Works from The Museum of Modern Art:

The vision took place at night, yet the painting, among hundreds of artworks van Gogh made that year, was created in several sessions during the day, under entirely different atmospheric conditions. The picturesque village nestled below the hills was based on other views—it could not be seen from his window—and the cypress at left appears much closer than it was. And although certain features of the sky have been reconstructed as observed, the artist altered celestial shapes and added a sense of glow.

Those who can’t visit MoMA to see The Starry Night in person may enjoy getting up close and personal with Google Arts and Culture’s zoomable, high res digital reproduction. Keep clicking into the image to see the painting in greater detail.

Before or after formulating your own thoughts on The Starry Night and the emotional state that contributed to its execution, get the perspective of singer-songwriter Maggie Rogers in the Art Zoom episode below, a series in which popular musicians share their thoughts while navigating around a famous canvas.

Bonus! Throw yourself into a free coloring page of The Starry Night here.

Related Content: 

A Gallery of 1,800 Gigapixel Images of Classic Paintings: See Vermeer’s Girl with the Pearl Earring, Van Gogh’s Starry Night & Other Masterpieces in Close Detail

Vincent Van Gogh’s “The Starry Night”: Why It’s a Great Painting in 15 Minutes

1,000+ Artworks by Vincent Van Gogh Digitized & Put Online by Dutch Museums: Enter Van Gogh Worldwide

Rare Vincent van Gogh Painting Goes on Public Display for the First Time: Explore the 1887 Painting Online

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine.  Follow her @AyunHalliday.

Art History School: Learn About the Art & Lives of Toulouse-Lautrec, Gustav Klimt, Francis Bacon, Edvard Munch & Many More

Artist and videographer Paul Priestly is an enthusiastic and generous sort of fellow.

His free online drawing tutorials abound with encouraging words for beginners, and he clearly relishes lifting the curtain to reveal his home studio setup and self-designed camera rig.

But we here at Open Culture think his greatest gift to home viewers is his Art History School profiles of well-known artists like Henri de Toulouse-Lautrec and Vincent Van Gogh.

An avid storyteller, he’s drawn to artists with tragic histories. His decision to pivot from impersonating the artist, as he did with Van Gogh, to serving as a reporter interested in how such details as syphilis and alcoholism informed lives and careers is a wise one.

Priestly makes a convincing case that Lautrec’s aristocratic upbringing contributed to his misery. His short stature was the result not of dwarfism but of pyknodysostosis (PYCD), a rare bone-weakening disease that surely owed something to his parents’ status as first cousins.

His appearance made him a subject of lifelong mockery, and ensured that the freewheeling artist scene in Montmartre would prove more welcoming than the blueblood milieu into which he’d been born.

Priestly makes a meal of that demi-monde, introducing viewers to many of the players.

He heightens our appreciation for Lautrec’s masterpiece, At the Moulin Rouge, by briefly orienting us to who’s seated around the table: writer and critic Édouard Dujardin, dancer La Macarona, photographer Paul Sescau, and “champagne salesman and debauchee” Maurice Guibert, who earlier posed as a lecherous patron in Lautrec’s At the Café La Mie.

Queen of the Cancan La Goulue hangs out in the background with another dancer, the wonderfully named La Môme Fromage.

Lautrec places himself squarely in the mix, looking very much at home.

Consider that these names, like those of frequent Lautrec subjects acrobatic dancer Jane Avril and chanteuse Yvette Guilbert, were as celebrated in Belle Époque Montmartre as many of the painters Lautrec rubbed shoulders with — Degas, Pissarro, Cézanne, Van Gogh and Manet.

In an article in The Smithsonian, Paul Trachtman recounts how Lautrec discovered the model for Manet’s famous nude Olympia, Victorine Meurent, “living in abject poverty in a top-floor apartment down a Montmartre alley. She was now an old, wrinkled, balding woman. Lautrec called on her often, and took his friends along, presenting her with gifts of chocolate and flowers — as if courting death itself.”

Meanwhile Degas sniffed that Lautrec’s studies of women in a brothel “stank of syphilis.”

Perhaps Priestly will delve into Degas for an upcoming Art History School episode … there’s no shortage of material there.

Above are three more of Paul Priestly’s Art History School profiles that we’ve enjoyed, on Francis Bacon, Edvard Munch and Gustav Klimt. You can subscribe to his channel here.

Related Content: 

The Art History Web Book

Art Historian Provides Hilarious & Surprisingly Efficient Art History Lessons on TikTok

Free Art & Art History Courses 

