Dying from Overwork: Disturbing Looks Inside Japan’s Karoshi and China’s “996” Work System

By most measures, Japan boasts the highest life expectancy in the world. But that ranking, of course, doesn’t mean that every Japanese person sees old age. Though the country’s rate of violent crime is low enough to be the envy of most of the world, its suicide rate isn’t, and it says even more that the Japanese language has a word that refers specifically to death by overwork. I first encountered it nearly thirty years ago in a Dilbert comic strip. “In Japan, employees occasionally work themselves to death. It’s called karōshi,” says Dilbert’s pointy-haired boss. “I don’t want that to happen to anybody in my department. The trick is to take a break as soon as you see a bright light and hear dead relatives beckon.”

You can see the phenomenon of karōshi examined more seriously in the short Nowness video at the top of the post. In it, a series of Japanese salarymen (a Japanese English term now well-known around the world) speak to the exhausting and unceasing rigors of their everyday work schedules — and, in some cases, to the emptiness of the homes that await them each night.

The CNBC segment just above investigates what can be done about such labor conditions, which even in white-collar workplaces contribute to the heart attacks, strokes, and other immediate causes of deaths ultimately ascribed to karōshi. In a grim irony, Japan has the lowest productivity among the G7 nations: its people work hard, yet their companies are hardly working.

Initiatives to put a stop to the ill effects of overwork, up to and including karōshi, include mandatory vacation days and office lights that switch off automatically at 10:00 p.m. Among the latest is “Premium Friday,” a program explained in the Vice video above. Developed by Keidanren, Japan’s oldest business lobby, it was initially received as “a direct response to karōshi,” but it has its origins in marketing. “We wanted to create a national event that bolstered consumption,” says the director of Keidanren’s industrial policy bureau. By that logic, it made good sense to let workers out early on Fridays — let them out to shop. But Premium Friday has yet to catch on in most Japanese enterprises, aware as they are that Japan’s economic might no longer intimidates the world.

The aforementioned low productivity, along with a rapidly aging and even contracting population, contributed to Japan’s loss of its position as the world’s second-largest economy. It was overtaken in 2011 by China, a country with overwork problems of its own. The Vice report above covers the “996” system, which stands for working from 9:00 a.m. to 9:00 p.m., six days a week. Prevalent in Chinese tech companies, it has been blamed for stress, illness, and death among employees. Laws limiting working hours have thus far proven ineffective, or at least circumventable. Certain pundits never stop insisting that the future is Chinese; if they’re right, all this ought to give pause to the workers of the world, Eastern and Western alike.

Related content:

“Inemuri,” the Japanese Art of Taking Power Naps at Work, on the Subway, and Other Public Places

Why 1999 Was the Year of Dystopian Office Movies: What The Matrix, Fight Club, American Beauty, Office Space & Being John Malkovich Shared in Common

“Tsundoku,” the Japanese Word for the New Books That Pile Up on Our Shelves, Should Enter the English Language

The Employment: A Prize-Winning Animation About Why We’re So Disenchanted with Work Today

What is the Secret to Living a Long, Happy & Creatively Fulfilling Life?: Discover the Japanese Concept of Ikigai

Charles Bukowski Rails Against 9-to-5 Jobs in a Brutally Honest Letter (1986)

Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.

Jon Kabat-Zinn Presents an Introduction to Mindfulness (and Explains Why Our Lives Just Might Depend on It)

The practice of cultivating mindfulness through meditation first took root in Europe and the U.S. in the 1960s, when Buddhist teachers from Japan, Tibet, Vietnam, and elsewhere left home, often under great duress, and taught Western students hungry for alternative forms of spirituality. Though popularized by countercultural figures like Alan Watts and Allen Ginsberg, the practice didn’t seem at first like it might reach those who seemed to need it most — stressed-out denizens of the corporate world and military industrial complex who hadn’t changed their consciousness with mind-altering drugs, or left the culture to become monastics.

Then professor of medicine Jon Kabat-Zinn came along, stripped away religious and new age contexts, and began redesigning mindfulness for the masses in 1979 with his mindfulness-based stress reduction (MBSR) program. Now everyone knows, or thinks they know, what mindfulness is. As meditation teacher Lokadhi Lloyd tells The Guardian, Kabat-Zinn is “Mr Mindfulness in relation to our secular strand. Without him, I don’t think mindfulness would have risen to the prominence it has.”

His secularization of mindfulness, however, has not, in practical terms, taken it very far from its roots, which explains why Kabat-Zinn’s groundbreaking 1990 book Full Catastrophe Living receives high praise from Buddhist teachers like Joseph Goldstein, Sharon Salzberg, and Kabat-Zinn’s own former Zen teacher, Thich Nhat Hanh.

While Kabat-Zinn says he himself is not (or is no longer) a Buddhist, his definitions of mindfulness may sound familiar enough to those who study and practice the religion. As he says in the short segment at the top: “It’s paying attention, on purpose, in the present moment, non-judgmentally.” And then, “sometimes,” he says, “I like to add, as if your life depended on it.” The quality of our lives, the clarity of our lives, and the depth and richness of our lives depend on our ability to be aware of what’s happening around and inside us. This ability, Kabat-Zinn insists, is the inheritance of all human beings. It can be found in spiritual practices around the world. No one owns a patent on awareness.

Nevertheless, Kabat-Zinn is particularly leery of what he calls McMindfulness, the commodity-driven industry selling coloring books, apps, puzzles, t-shirts, and novelties touting mindful benefits. Mindfulness-based stress reduction is “not a trick,” he says. It isn’t something we buy and try out here and there. “MBSR is exceedingly challenging,” Kabat-Zinn writes in Full Catastrophe Living. “In many ways, being in the present moment with a spacious orientation toward what is happening may really be the hardest work in the world for us humans. At the same time, it is also infinitely doable.” It can also be highly unpleasant, forcing us to sit with the things we’d rather ignore about ourselves. Why should we do it? We might consider the alternatives.

MBSR began (“in the basement of the University of Massachusetts Medical Center,” notes NPR) helping patients with chronic pain recover. It proved so effective, Kabat-Zinn applied the insight more globally — “using the wisdom of your body and mind to face stress, pain, and illness.” This is not a cure-all, but a way of living that reduces unnecessary suffering caused by overactive discursive thinking, which traps us in patterns of blame, shame, fear, regret, judgment, and self-criticism (illustrated in Scottish psychiatrist R.D. Laing’s book of neurotic narratives, Knots) — traps us, that is, in stories about the past and future, which affect our physical and mental health, our work, and our relationships.

The medical evidence for mindfulness has only begun to catch up with Kabat-Zinn’s work, yet it weighs heavily on the side of the outcomes he has seen for over 40 years. MBSR also comes highly recommended by Harvard neuroscientist Sara Lazar and trauma expert Bessel van der Kolk, among so many others who have done the research. The evidence is why, as you can see in the longer presentations above at Dartmouth and Google, Kabat-Zinn has become something of an evangelist for mindfulness. “If this is another fad, I don’t want to have any part of it,” he says. “If in the past 50 years I had found something more meaningful, more healing, more transformative and with more potential social impact, I would be doing that.”

As Kabat-Zinn’s 2005 book, Wherever You Go, There You Are, shows, we can bring what happens in meditation into our everyday life, letting assumptions go, and “letting life become both the meditation teacher and the practice, moment by moment, no matter what arises,” he tells Mindful magazine. This isn’t about escaping into blissed-out moments of Zen. It’s fostering “deep connections,” over and over again, with ourselves, families, friends, communities, the planet we live on, and, in turn, “the future that we’re bequeathing to our future generations.”

Related Content:

Daily Meditation Boosts & Revitalizes the Brain and Reduces Stress, Harvard Study Finds

How Mindfulness Makes Us Happier & Better Able to Meet Life’s Challenges: Two Animated Primers Explain

De-Mystifying Mindfulness: A Free Online Course by Leiden University 

Stream 18 Hours of Free Guided Meditations

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

The History of Birth Control: From Alligator Dung to The Pill

The history of birth control is almost as old as the history of the wheel.

Pessaries dating to Mesopotamia and ancient Egypt provide the launching pad for documentarian Lindsay Holiday‘s overview of birth control throughout the ages and around the world.

Holiday’s History Tea Time series frequently delves into women’s history, and her pledge to donate a portion of the above video’s ad revenue to Pathfinder International serves as a reminder that there are parts of the world where women still lack access to affordable, effective, and safe means of contraception.

One goal of the World Health Organization’s Ending Preventable Maternal Mortality initiative is for 65% of women to be able to make informed and empowered decisions regarding sexual relations, contraceptive use, and their reproductive health by 2025.

As Holiday points out, expense, social stigma, and religious edicts have impacted ease of access to birth control for centuries.

The further back you go, the more certain you can be that some methods advocated by midwives and medicine women have been lost to history, owing to unrecorded oral tradition and the sensitive nature of the information.

Holiday still manages to truffle up a fascinating array of practices and products that were thought – often erroneously – to ward off unwanted pregnancy.

Some that worked, and continue to work to varying degrees, include barrier methods, condoms, and, more recently, the IUD and the Pill.

Definitely NOT recommended: withdrawal, holding your breath during intercourse, a post-coital sneezing regimen, douching with Lysol or Coca-Cola, toxic cocktails of lead, mercury or copper salt, anything involving alligator dung, and slugging back water that’s been used to wash a corpse.

As for silphium, an herb that likely did have some sort of spermicidal properties, we’ll never know for sure. By 1 CE, demand outstripped supply of this remedy, eventually wiping it off the face of the earth despite increasingly astronomical prices. Fun fact: silphium was also used to treat sore throat, snakebite, scorpion stings, mange, gout, quinsy, epilepsy, and anal warts.

The history of birth control can be considered a semi-secret part of the history of prostitution, feminism, the military, obscenity laws, sex education and attitudes toward public health.

From Margaret Sanger and the 60,000 women executed as witches in the 16th and 17th centuries, to economist Thomas Malthus‘ 1798 Essay on the Principle of Population and legendary adventurer Giacomo Casanova’s satin ribbon-trimmed jimmy hat, this episode of History Tea Time with Lindsay Holiday touches on it all.

Ayun Halliday is the Chief Primatologist of the East Village Inky zine and author, most recently, of Creative, Not Famous: The Small Potato Manifesto.  Follow her @AyunHalliday.

Related Content 

The Birth Control Handbook: The Underground Student Publication That Let Women Take Control of Their Bodies (1968)

I’m Just a Pill: A Schoolhouse Rock Classic Gets Reimagined to Defend Reproductive Rights in 2017

The Story Of Menstruation: Watch Walt Disney’s Sex Ed Film from 1946

What Did People Do Before the Invention of Eyeglasses?

You remember it — one of the most heartbreaking scenes on TV. A man longs for nothing more than time to read, to be free of all those people Sartre told us make our hells. Finally granted his wish by the H-Bomb, he then accidentally breaks his glasses, rendering himself unable to make out a word. Oh, cruel irony! Not an optometrist or optician in sight! Surely, there are “Time Enough at Last” jokes at eye care conventions worldwide.

Morality tales wrapped in science fiction might make us think about all sorts of things, but one of the most obvious questions when we witness the fate of Mr. Henry Bemis, “charter member in the fraternity of dreamers,” might be: what did people do before corrective lenses? Were millions forced to accept his fate, living out their lives with farsightedness, nearsightedness, and other defects that impede vision? How did early humans survive in times much less hospitable to disabilities? At least there were others to read and describe things for them….

In truth, the Twilight Zone is not far off the mark. Or at least nearsightedness and reading are closely linked. “As long as primates have been around, there’s probably been myopia,” says professor of ophthalmology Ivan Schwab. But Schwab argues in his book Evolution’s Witness: How Eyes Evolved that the rise of reading likely caused skyrocketing rates of myopia over the past three hundred years. “Though genes and nutrition may play a role in nearsightedness,” Natalie Jacewicz writes at NPR, “[Schwab] says education and myopia seem to be linked, suggesting that when people do a lot of close work, their eyes grow longer.”

As the History Dose video above explains, the oldest image of a pair of glasses dates from a 1351 painting of Cardinal Hugh of Saint-Cher. The painting is an anachronism — spectacles, the narrator tells us, were invented in Pisa 23 years after the cardinal’s death. They “gradually spread across Europe and travelled the Silk Road to China.” (The oldest surviving pair of glasses dates from around 1475). So what happened before 1286? As you’ll learn, glasses were not the only way to enlarge small items. In fact, humans have been using some form of magnifying lens to read small print (or manuscript or cuneiform or what-have-you) for thousands of years. Those lenses, however, corrected presbyopia, or far-sightedness.

Those with myopia were mostly out of luck until the invention of sophisticated lens-grinding techniques and improved vision tests. But for most of human history, unless you were a sailor or a soldier, you “likely spent your day as an artisan, smith, or farm worker,” occupations where distance vision didn’t matter as much. In fact, artisans like medieval scribes and illuminators, says Neil Handley — museum curator of the College of Optometrists, London — were “actually encouraged to remain in their myopic condition, because it was actually ideal for them doing this job.”

It wasn’t until well after the time of Gutenberg that wearing lenses on one’s face became a thing — and hardly a popular thing at first. Early glasses were held up to the eyes, not worn. They were heavy, thick, and fragile. In the 15th century, “because… they were unusual and rare,” says Handley, “they were seen as having magical powers” and their wearers viewed as “in league with the devil, immoral.” That stigma went away, even if glasses picked up other associations that sometimes made their users the subject of taunts. But by the nineteenth century, glasses were common around the world.

Given that we all spend most of our time interacting with small text and images on handheld screens, it seems maybe they haven’t spread widely enough. “More than a billion, and maybe as many as 2.5 billion, people in the world need but don’t have glasses to correct for various vision impairments,” notes Live Science, citing figures from The New York Times. For many people, especially in the developing world, the question of how to get by without eyeglasses remains a very pressing, present-day issue.

Related Content:

The World’s Oldest Surviving Pair of Glasses (Circa 1475)

James Joyce, With His Eyesight Failing, Draws a Sketch of Leopold Bloom (1926)

Oliver Sacks Explains the Biology of Hallucinations: “We See with the Eyes, But with the Brain as Well”

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

The Anti-Vaxxer Who Waged War Against Jonas Salk & His Polio Vaccine: When History Keeps Repeating

Almost immediately after English doctor Edward Jenner learned how to inoculate humans against smallpox in 1796, mass movements sprang up in England and the U.S. to oppose the measure. The rejection of inoculation and vaccination generally made its stand on “political grounds,” says Yale historian Frank Snowden. For over two hundred years, people have “widely considered [vaccines to be] another form of tyranny.” In the 19th century, fears of government control mutated into pseudoscientific conspiracy theories claiming the smallpox vaccine might cause, for example, the growth of hooves and horns or the birth of human/cow hybrid babies.

The pushback against the smallpox vaccine, writes Slate’s Nick Keppler, occurred during a time “when arguments about bodily integrity and religious objection carried as much weight as scientific evidence.” But vaccine science progressed nonetheless, and scientific institutions – very much in league with government by the mid-20th century – shared their largesse in the form of medical breakthroughs and consumer conveniences. “The postwar era was a very trust-in-science-era,” says research scientist Jonathan M. Berman, author of Anti-Vaxxers: How to Challenge a Misinformed Movement. “The public not just accepted, but cheered, the headline-making work of guys in white lab coats,” Keppler remarks.

Not everyone was cheering for Jonas Salk, the March of Dimes, and the polio vaccine, however. While celebrities like Elvis Presley legitimized the vaccine in the eyes of a previously skeptical public, a few fervent anti-vaxxers rose to prominence, some using the same combination of fear mongering, pseudoscientific speculation, and conspiratorial thinking common to the smallpox era – and common, once again, in the time of COVID-19.

One of these figures, Florida businessman Duon Miller, founded a cosmetics company, then invested his own money and that of others into an organization called Polio Prevention Inc., a one-man operation that purported to fight polio with information about nutrition. Miller’s organization actually served to undermine the vaccine with a host of outrageous, logically fallacious claims about the causes of polio and the dangers of vaccination. As Keppler notes:

Like today’s COVID skeptics, Miller cherry-picked physicians who were skeptical of polio as a virus and misrepresented facts. One mailer was a rapid fire of out-of-context information: Salk “isn’t entirely satisfied with the vaccine.” Some children still got polio after being vaccinated. And just as the “real” number of COVID-19 deaths pales in comparison to vaccine deaths in some dark corners of the internet, so it was with polio in Miller’s world: “Polio ‘CRIPPLES’ and Polio ‘DEATHS’ are merely ‘Statistics’ to the ‘Charity-Brokers,’ whose record to date of ‘Cripples’ and ‘Deaths’ is TRULY DISGRACEFUL.”

Like those of many conspiracy theorists today, Miller’s claims contained several kernels of truth, misplaced in the service of a bizarre crusade. Research now ties excess consumption of soft drinks, white flour, and refined sugar to an increase in cancers and heart disease. In this, Miller was prescient, given that these are some of the biggest killers in the country. But this had nothing to do with the polio virus. Miller’s uncritical thinking, mistaking large-scale correlations for causation, typifies conspiracy theories. His appeal to the welfare of children also strikes a familiar chord, but it’s unsurprising in this case, given that “polio was a disease of children,” says René F. Najera, editor of the College of Physicians of Philadelphia’s History of Vaccines project, “so people were already afraid for their children.” Comparatively, COVID-19 “has largely left children alone … so we don’t mobilize as much.”

Keppler draws many other parallels between Miller’s personal anti-polio vaccine project and the efforts today to resist the COVID-19 vaccine, all representative of American anti-intellectualism and the well-funded will to disbelieve what the science clearly demonstrates. Miller distributed mailers in schools around Florida, accepted hundreds in donations, and printed thousands of pamphlets for distribution. He even offered to get injected with the polio virus to show that it was harmless. However, “federal charges ended Miller’s crusade,” when he was charged with “sending ‘libellous, scurrilous and defamatory’ statements through the mail” in 1954, the year Salk readied nationwide trials of the vaccine. Five years later, “U.S. polio cases were about 14 percent of what they were in 1952, thanks to vaccination,” not, as Miller would have the public believe, a change in diet. “Give us proper diets,” he continued to write to newspapers, “and we’ll solve the physical imperfections of Americans young and old.” He might have been on to a good argument about nutrition just by chance, but the public had no reason to listen to his opinions about polio simply because he could afford to circulate them.

via Steve Silberman

Related Content: 

Elvis Presley Gets the Polio Vaccine on The Ed Sullivan Show, Persuading Millions to Get Vaccinated (1956)

How the World’s First Anti-Vax Movement Started with the First Vaccine for Smallpox in 1796, and Spread Fears of People Getting Turned into Half-Cow Babies

How Do Vaccines (Including the COVID-19 Vaccines) Work?: Watch Animated Introductions

Dying in the Name of Vaccine Freedom

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

The Omicron Variant Explained by Neil deGrasse Tyson & Regeneron President George Yancopoulos

What is the Omicron Variant? How do vaccines work? And what about monoclonal antibody therapy? On this episode of StarTalk, Neil deGrasse Tyson has a wide-ranging and quite informative conversation with George Yancopoulos, president of Regeneron, the company that created the monoclonal antibody therapy now being used in the fight against COVID-19. And there’s an interesting side note: During the 1970s, Tyson and Yancopoulos were high school classmates together at Bronx Science. They’ve both come a long way, and now they reunite to explain the science behind the latest phase of the pandemic.

If you would like to get Open Culture posts via email, please sign up for our free email newsletter here.

And if you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, Venmo (@openculture) and Crypto. Thanks for your support!

Related Content 

1,700 Free Online Courses from Top Universities.

Neil deGrasse Tyson Lists 8 (Free) Books Every Intelligent Person Should Read

MIT Presents a Free Course on the COVID-19 Pandemic

How the COVID-19 Vaccines Could Be Created So Quickly: Two Animated Videos Explain How mRNA Vaccines Were Developed, and How They Work

How the World’s First Anti-Vax Movement Started with the First Vaccine for Smallpox in 1796, and Spread Fears of People Getting Turned into Half-Cow Babies

A cartoon from a December 1894 anti-vaccination publication (Courtesy of The Historical Medical Library of The College of Physicians of Philadelphia)

For well over a century people have queued up to get vaccinated against polio, smallpox, measles, mumps, rubella, the flu or other epidemic diseases. And they have done so because vaccination was mandated by schools, workplaces, armed forces, and other institutions committed to using science to fight disease. As a result, deadly viral epidemics began to disappear in the developed world. Indeed, the vast majority of people now protesting mandatory vaccinations were themselves vaccinated (by mandate) against polio, smallpox, measles, mumps, rubella, etc., and hardly any of them have contracted those once-common diseases. The historical argument for vaccines may not be the most scientific (the science is readily available online). But history can act as a reliable guide for understanding patterns of human behavior.

In 1796, English physician Edward Jenner discovered how an injection of cowpox-infected human biological material could make humans immune to smallpox. For the next 100 years, resistance to inoculation grew into “an enormous mass movement,” says Yale historian of medicine Frank Snowden. “There was a rejection of vaccination on political grounds that it was widely considered as another form of tyranny.”

Fears that injections of cowpox would turn people into mutants with cow-like growths were satirized as early as 1802 by cartoonist James Gillray (below). While the anti-vaccination movement may seem relatively new, the resistance, refusal, and denialism are as old as vaccination against infectious disease in the West.

Image via Wikimedia Commons

“In the early 19th century, British people finally had access to the first vaccine in history, one that promised to protect them from smallpox, among the deadliest diseases in the era,” writes Jess McHugh at The Washington Post. Smallpox killed around 4,000 people a year in the UK and left hundreds more disfigured or blinded. Nonetheless, “many Britons were skeptical of the vaccine…. The side effects they dreaded were far more terrifying: blindness, deafness, ulcers, a gruesome skin condition called ‘cowpox mange’ — even sprouting hoofs and horns.” Giving a person one disease to frighten off another one probably seemed just as absurd a notion as turning into a human/cow hybrid.

Variolation, the older method of inoculating patients with smallpox material itself, was outlawed in 1840 as Jenner’s safer vaccination replaced it. By 1867, all British children up to age 14 were required by law to be vaccinated against smallpox. Widespread outrage resulted, even among prominent physicians and scientists, and continued for decades. “Every day the vaccination laws remain in force,” wrote scientist Alfred Russel Wallace in 1898, “parents are being punished, infants are being killed.” In fact, it was smallpox claiming lives, “more than 400,000 lives per year throughout the 19th century, according to the World Health Organization,” writes Elizabeth Earl at The Atlantic. “Epidemic disease was a fact of life at the time.” And so it is again. Covid has killed almost 800,000 people in the U.S. alone over the past two years.


Then, as now, medical quackery played its part in vaccine refusal — in this case a much larger part. “Never was the lie of ‘the good old days’ more clear than in medicine,” Greig Watson writes at BBC News. “The 1841 UK census suggested a third of doctors were unqualified.” Common causes of illness in an 1848 medical textbook included “wet feet,” “passionate fear or rage,” and “diseased parents.” Among the many fiery lectures, caricatures, and pamphlets issued by opponents of vaccination, one 1805 tract by William Rowley, a member of the Royal College of Physicians, alleged that the injection of cowpox could mar an entire bloodline. “Who would marry into any family, at the risk of their offspring having filthy beastly diseases?” it asked hysterically.

Then, as now, religion was a motivating factor. “One can see it in biblical terms as human beings created in the image of God,” says Snowden. “The vaccination movement injecting into human bodies this material from an inferior animal was seen as irreligious, blasphemous and medically wrong.” Granted, those who volunteered to get vaccinated had to place their faith in the institutions of science and government. After medical scandals of the recent past like the Tuskegee experiments or Thalidomide, that can be a big ask. In the 19th century, says medical historian Kristin Hussey, “people were asking questions about rights, especially working-class rights. There was a sense the upper class were trying to take advantage, a feeling of distrust.”

The deep distrust of institutions now seems intractable and fully endemic in our current political climate, and much of it may be fully warranted. But no virus has evolved — since the time of Jenner’s first smallpox inoculation — to care about our politics, religious beliefs, or feelings about authority or individual rights. Without widespread vaccination, viruses are more than happy to exploit our lack of immunity, and they do so without pity or compunction.

via Washington Post

Related Content: 

Dying in the Name of Vaccine Freedom

How Vaccines Improved Our World In One Graphic

How Do Vaccines (Including the COVID-19 Vaccines) Work?: Watch Animated Introductions

Elvis Presley Gets the Polio Vaccine on The Ed Sullivan Show, Persuading Millions to Get Vaccinated (1956)

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Elvis Presley Gets the Polio Vaccine on The Ed Sullivan Show, Persuading Millions to Get Vaccinated (1956)

No one living has experienced a viral event the size and scope of COVID-19. Maybe the unprecedented nature of the pandemic explains some of the vaccine resistance. Diseases of such virulence became rare in places with ready access to vaccines, and thus, ironically, over time, have come to seem less dangerous. But there are still many people in wealthy nations who remember polio, an epidemic that dragged on through the first half of the 20th century before Jonas Salk perfected his vaccine in the mid-fifties.

Polio’s devastation has been summed up visually in textbooks and documentaries by the terrifying iron lung, an early ventilator. “At the height of the outbreaks in the late 1940s,” Meilan Solly writes at Smithsonian, “polio paralyzed an average of more than 35,000 people each year,” particularly affecting children, with 3,000 deaths in 1952 alone. “Spread virally, it proved fatal for two out of ten victims afflicted with paralysis. Though millions of parents rushed to inoculate their children following the introduction of Jonas Salk’s vaccine in 1955, teenagers and young adults had proven more reluctant to get the shot.”

At the time, there were no violent, organized protests against the vaccine, nor was resistance framed as a patriotic act of political loyalty. But “cost, apathy and ignorance became serious setbacks to the eradication effort,” says historian Stephen Mawdsley. And, then as now, irresponsible media personalities with large platforms and little knowledge could do a lot of harm to the public’s confidence in life-saving public health measures, as when influential gossip columnist Walter Winchell wrote that the vaccine “may be a killer,” discouraging countless readers from getting a shot.

When Elvis Presley made his first appearance on Ed Sullivan’s show in 1956, “immunization levels among American teens were at an abysmal 0.6 percent,” note Hal Hershfield and Ilana Brody at Scientific American. To counter impressions that the polio vaccine was dangerous, public health officials did not solely rely on getting more and better information to the public; they also took seriously what Hershfield and Brody call the “crucial ingredients inherent to many of the most effective behavioral change campaigns: social influence, social norms and vivid examples.” Satisfying all three, Elvis stepped up and agreed to get vaccinated “in front of millions” backstage before his second appearance on the Sullivan show.

Elvis could not have been more famous, and the campaign was a success for its target audience, establishing a new social norm through influence and example: “Vaccination rates among American youth skyrocketed to 80 percent after just six months.” Despite the threat he supposedly posed to the establishment, Elvis himself was ready to serve the public. “I certainly never wanna do anything,” he said, “that would be a wrong influence.” See in the short video at the top how American public health officials stopped millions of preventable deaths and disabilities by admitting a fact propagandists and advertisers never shy from — humans, on the whole, are easily persuaded by celebrities. Sometimes they can even be persuaded for the good.

Related Content: 

Yo-Yo Ma Plays an Impromptu Performance in Vaccine Clinic After Receiving 2nd Dose

Dying in the Name of Vaccine Freedom

How Do Vaccines (Including the COVID-19 Vaccines) Work?: Watch Animated Introductions

How Vaccines Improved Our World In One Graphic

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness

Open Culture was founded by Dan Colman.