Without climate change, we couldn’t inhabit the Earth as we do today. The greenhouse effect, by which gases in a planet’s atmosphere increase the heat of that planet’s surface, “makes life on Earth possible.” So says Carl Sagan in the video above. He adds that without it, the temperature would be about 30 degrees centigrade cooler: “That’s well below the freezing point of water everywhere on the planet. The oceans would be solid.” A little of the climate change induced by the greenhouse effect, then, is a good thing, but “here we are pouring enormous quantities of CO2 and these other gases into the atmosphere every year, with hardly any concern about its long-term and global consequences.”
It’s fair to say that the level of concern has increased since Sagan spoke these words in 1985, when “climate change” wasn’t yet a household term. But even then, his audience was Congress, and his fifteen-minute address, preserved by C‑SPAN, remains a succinct and persuasive case for more research into the phenomenon as well as strategies and action to mitigate it.
What audience would have expected any less from Sagan, who just five years earlier had hosted the hit PBS television series Cosmos, based on his book of the same name? Its broadcast made his enthusiasm contagious, both for scientific inquiry in general and for the nature of the planets in particular. Who could forget, for example, his introduction to the “thoroughly nasty place” that is Venus, whose atmosphere Sagan had researched in the early 1960s?
Venus is “the nearest planet — a planet of about the same mass, radius, density, as the Earth,” Sagan tells Congress, but it has a “surface temperature about 470 degrees centigrade, 900 Fahrenheit.” The reason? “A massive greenhouse effect in which carbon dioxide plays the major role.” As for our planet, estimates then held that, without changes in the rates of fossil fuel-burning and “infrared-absorbing” gases released into the atmosphere, there will be “a several-centigrade-degree temperature increase” on average “by the middle to the end of the next century.” Given the potential effects of such a rise, “if we don’t do the right thing now, there are very serious problems that our children and grandchildren will have to face.” It’s impossible to know how many listeners these words convinced at the time, though they certainly seem to have stuck with a young senator in the room by the name of Al Gore.
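Sagan's quick unit conversion can be checked directly. As a minimal sketch (the function name is my own, not from the address itself), the standard Celsius-to-Fahrenheit formula confirms his figures:

```python
def c_to_f(celsius):
    # Standard conversion: multiply by 9/5, then add 32.
    return celsius * 9 / 5 + 32

# Venus's surface temperature as Sagan cites it:
print(c_to_f(470))  # 878.0, which Sagan rounds to "900 Fahrenheit"
```

The exact result is 878 °F, so Sagan's round figure of "900 Fahrenheit" is an approximation for a listening audience, not an error.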
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
A cartoon from a December 1894 anti-vaccination publication (Courtesy of The Historical Medical Library of The College of Physicians of Philadelphia)
For well over a century, people have queued up to get vaccinated against polio, smallpox, measles, mumps, rubella, the flu, and other epidemic diseases. And they have done so largely because vaccinations were mandated by schools, workplaces, armed forces, and other institutions committed to using science to fight disease. As a result, deadly viral epidemics began to disappear in the developed world. Indeed, the vast majority of people now protesting mandatory vaccinations were themselves vaccinated (by mandate) against polio, smallpox, measles, mumps, rubella, etc., and hardly any of them have contracted those once-common diseases. The historical argument for vaccines may not be the most scientific (the science is readily available online). But history can act as a reliable guide for understanding patterns of human behavior.
In 1796, English physician Edward Jenner discovered that an injection of cowpox-infected human biological material could make humans immune to smallpox. For the next 100 years after this breakthrough, resistance to inoculation grew into “an enormous mass movement,” says Yale historian of medicine Frank Snowden. “There was a rejection of vaccination on political grounds that it was widely considered as another form of tyranny.”
Fears that injections of cowpox would turn people into mutants with cow-like growths were satirized as early as 1802 by cartoonist James Gillray (below). While the anti-vaccination movement may seem relatively new, the resistance, refusal, and denialism are as old as vaccination against infectious disease in the West.
Image via Wikimedia Commons
“In the early 19th century, British people finally had access to the first vaccine in history, one that promised to protect them from smallpox, among the deadliest diseases in the era,” writes Jess McHugh at The Washington Post. Smallpox killed around 4,000 people a year in the UK and left hundreds more disfigured or blinded. Nonetheless, “many Britons were skeptical of the vaccine.… The side effects they dreaded were far more terrifying: blindness, deafness, ulcers, a gruesome skin condition called ‘cowpox mange’ — even sprouting hoofs and horns.” Giving a person one disease to frighten off another one probably seemed just as absurd a notion as turning into a human/cow hybrid.
Variolation, the older practice of inoculating patients with smallpox matter itself, was outlawed in 1840 as Jenner’s safer vaccination replaced it. By 1867, all British children up to age 14 were required by law to be vaccinated against smallpox. Widespread outrage resulted, even among prominent physicians and scientists, and continued for decades. “Every day the vaccination laws remain in force,” wrote scientist Alfred Russel Wallace in 1898, “parents are being punished, infants are being killed.” In fact, it was smallpox claiming lives, “more than 400,000 lives per year throughout the 19th century, according to the World Health Organization,” writes Elizabeth Earl at The Atlantic. “Epidemic disease was a fact of life at the time.” And so it is again. Covid has killed almost 800,000 people in the U.S. alone over the past two years.
Then as now, medical quackery played its part in vaccine refusal — in this case a much larger part. “Never was the lie of ‘the good old days’ more clear than in medicine,” Greig Watson writes at BBC News. “The 1841 UK census suggested a third of doctors were unqualified.” Common causes of illness in an 1848 medical textbook included “wet feet,” “passionate fear or rage,” and “diseased parents.” Among the many fiery lectures, caricatures, and pamphlets issued by opponents of vaccination, one 1805 tract by William Rowley, a member of the Royal College of Physicians, alleged that the injection of cowpox could mar an entire bloodline. “Who would marry into any family, at the risk of their offspring having filthy beastly diseases?” it asked hysterically.
Then, as now, religion was a motivating factor. “One can see it in biblical terms as human beings created in the image of God,” says Snowden. “The vaccination movement injecting into human bodies this material from an inferior animal was seen as irreligious, blasphemous and medically wrong.” Granted, those who volunteered to get vaccinated had to place their faith in the institutions of science and government. After medical scandals of the recent past like the Tuskegee experiments or Thalidomide, that can be a big ask. In the 19th century, says medical historian Kristin Hussey, “people were asking questions about rights, especially working-class rights. There was a sense the upper class were trying to take advantage, a feeling of distrust.”
The deep distrust of institutions now seems intractable and endemic in our current political climate, and much of it may be fully warranted. But no virus has evolved — since the time of Jenner’s first smallpox inoculation — to care about our politics, religious beliefs, or feelings about authority or individual rights. Without widespread vaccination, viruses are more than happy to exploit our lack of immunity, and they do so without pity or compunction.
The book can be downloaded as an .epub file, which can be opened in a compatible e‑reader application on many devices. An email address, along with the name of a college/university, is required. Find the book here.
When did Americans lose the ability to think and act rationally? Or did they ever, on the whole, have such ability? These are the questions at the heart of the Big Think video above, a supercut of interview clips from public intellectuals — Neil deGrasse Tyson, Michael Shermer, Kurt Andersen, Bill Nye, and Margaret Atwood — opining on the state of the nation’s intellectual health. Unsurprisingly, the prognosis is not good, as Carl Sagan predicted over 25 years ago.
Of interest here is the diagnosis: How did the country get to a place where it is unable to defend itself against a deadly virus because millions of citizens refuse to take it seriously? How did Americans let Exxon wreck the climate while millions of them refused to believe in human-caused climate change? How did a failed mogul and reality TV star become president? How did QAnon, Pizzagate…. How did any of it happen?
The roots are long and deep, says writer and former host of NPR’s Studio 360, Kurt Andersen, who has spent a significant amount of time thinking about the culture of American irrationalism. On the one hand, “Americans have always been magical thinkers and passionate believers in the untrue,” from the time of the Puritans, who were not persecuted refugees so much as fanatics no one in England could stand. And the problem is even older than the country’s founding, Andersen argues in his book Fantasyland: How America Went Haywire: A 500-Year History — it dates to the foundations of the modern world.
On the other hand, and somewhat contradictorily, it was those Puritans again who kept the worst of things in check. “We also have the virtues embodied by the Puritans and their secular descendants,” Andersen writes at The Atlantic: “steadiness, hard work, frugality, sobriety, and common sense” — such virtues as helped build the country’s scientific industries and research institutions, which have been steadily undermined by the relativism of the 1960s (Andersen argues), the effects of the internet, and a series of devastating political choices. The delusional irrationalism was built in — but hyper-individualism and profiteering of the last several decades supercharged it. “The United States used to be the world leader in technology,” says Bill Nye, but no more.
Margaret Atwood, who is Canadian, not American, talks mostly about the universal human difficulty of letting go of comforting core beliefs, and uses the example of the outcry against Darwinian evolution. Yet her very presence in the discussion will make viewers think of her most famous novel, The Handmaid’s Tale, in which she imagined what lies beneath the supposedly enlightened common sense of the country’s government. The stage was long ago set for a revolution that could easily turn the country against science, she believed.
As Atwood wrote in 2018 of the novel’s genesis: “Nations never build apparently radical forms of government on foundations that aren’t there already.… The deep foundation of the United States — so went my thinking — was not the comparatively recent 18th-century Enlightenment structures of the Republic, with their talk of equality and their separation of Church and State, but the heavy-handed theocracy of 17th-century Puritan New England — with its marked bias against women — which would need only the opportunity of a period of social chaos to reassert itself.”
Rather than identifying the problems with Puritans or 60s hippies, Neil DeGrasse Tyson — as he has done throughout his career — discusses issues of science education and communication. On both fronts, there has been some improvement. “More journalists who are science fluent… are writing about science than was the case 20 years ago,” he says, “so now I don’t have to worry about the journalist missing something fundamental.… And [science] reporting has been much more accurate in recent years, I’m happy to report.”
But while the internet has amplified our opportunities for scientific literacy, it has also done the opposite, grossly muddying the intellectual waters with misinformation and a competitive need to get the story first. “If it’s not yet verified, it’s not there yet.… So be more open about how wrong the thing you’re reporting on could be, because otherwise you’re doing a disservice to the public. And that disservice is that people out there say, ‘Scientists don’t know anything.’ ”
There are also those who choose to side with a handful of contrarian scientists who disagree with the consensus. “This is irresponsible,” says Tyson. “Plus it means you don’t know how science works.” Or it means you’re looking to confirm biases rather than genuinely take an interest in the scientific process. For all of their insights, the talking head critics in the video fail to mention a primary driver behind so much of the U.S.‘s science denialism, a motivation as foundational to the country as the Puritans’ zealotry: profit, at all costs.
For all its talk of liberty, the US government has practiced dehumanizing authoritarianism and mass murder since its founding. And since the rise of fascism in the early 20th century, it has never been self-evident that it cannot happen here. On the contrary — wrote Yale historian Timothy Snyder before and throughout the Trump presidency — it happened here first, though many would like us to forget. The histories of southern slavocracy and manifest destiny directly informed Hitler’s plans for the German colonization of Europe as much as did Europe’s 20th-century colonization of Africa and Asia.
Snyder is not a scholar of American history, though he has much to say about his country’s present. His work has focused on WWII’s totalitarian regimes and his popular books draw from a “deep knowledge of twentieth-century European history,” write Françoise Mouly and Genevieve Bormes at The New Yorker.
Indeed, the problem with rigid conformity to populist ideas became the subject of Snyder’s 2017 bestseller, On Tyranny: Twenty Lessons from the Twentieth Century, “a slim volume,” Mouly and Bormes note, “which interspersed maxims such as ‘Be kind to our language’ and ‘Defend institutions’ with biographical and historical sketches.” (We posted an abridged version of Snyder’s 20 lessons that year.) On Tyranny became an “instant best-seller… for those who were looking for ways to combat the insidious creep of authoritarianism at home.”
If you’ve paid any attention to the news lately, maybe you’ve noticed that the threat has not receded. Ideas about how to combat anti-democratic movements remain as relevant as ever. It’s also important to remember that Snyder’s book dates from a particular moment in time and draws on a particular historical perspective. Contextual details that can get lost in writing come to the fore in images — clothing, cars, the use of color or black and white: these all key us in to the historicity of his observations.
“We don’t exist in a vacuum,” says artist Nora Krug, the designer and illustrator of a new, graphic edition of On Tyranny just released this month. “I use a variety of visual styles and techniques to emphasize the fragmentary nature of memory and the emotive effects of historical events.” Krug worked from artifacts she found at flea markets and antique stores, “depositories of our collective consciousness,” as she writes in an introductory note to the new edition.
Krug’s choice of a variety of mediums and creative approaches “allows me to admit,” she says, “that we can only exist in relationship to the past, that everything we think and feel is thought and felt in reference to it, that our future is deeply rooted in our history, and that we will always be active contributors to shaping how the past is viewed and what our future will look like.”
The “Lying Flat” movement taking hold among young people in China involves doing exactly what it suggests: working little, resting a lot, and cultivating the most minimalist lifestyle possible. Unlike Timothy Leary’s 1960s mantra, “turn on, tune in, drop out,” lying flat, or tang ping (躺平), takes no stance on a countercultural ethos or the consumption of mind-altering drugs. But it has alarmed the authorities, and even English-language observers have taken note. Consider the Brookings Institution headline, “The ‘lying flat’ movement standing in the way of China’s innovation drive.” Standing in the way of innovation is a cardinal sin of capitalism, one reason the “niche Chinese Gen Z meme” of tang ping, Jane Li writes, “is ringing alarm bells for Beijing.”
The phenomenon began — where else — on social media, when 31-year-old former factory worker Luo Huazhong “drew the curtains and crawled into bed,” Cassady Rosenblum writes at The New York Times. Luo then “posted a picture of himself [in bed] to the Chinese website Baidu along with a message: ‘Lying Flat is Justice.’”
His manifesto (above) claimed the “right to choose a slow lifestyle” by doing little work to get by, reading, gardening, exercising, and, yes, lying supine as often as he liked. To further elaborate, Luo wrote, “lying flat is my sophistic movement,” with a reference to Diogenes the Cynic, the Greek philosopher “said to have lived inside a barrel to criticize the excesses of Athenian aristocrats.”
Diogenes did more than that. He and his followers rejected everything about Athenian society, from work and marriage to the abstract reasoning of Plato. Luo might have turned to a more traditional source for “lying flat” — the Daoist principle of wu-wei, or non-doing. But lying flat is not so much about living in harmony with nature as it is a state of exhaustion, a full-body admission that the promises of capitalism — work hard now, rest hard later — have not and will not materialize. They are phantoms, mirages, precisely the kind of fictions that made Diogenes bark with laughter. The truth, Rosenblum writes, is that for “essential” workers at the bottom all the way up to the “inner sanctums” of Goldman Sachs, “work has become intolerable. Rest is resistance.”
In a work culture that celebrates “996” — 12-hour days, six days a week — rest may be the only form of resistance. Political repression and lack of upward mobility have fostered “an almost monastic outlook” in China, writes Li, “including not getting married, not having children, not having a job, not owning property, and consuming as little as possible.” Since picking up tens of thousands of followers online, the lying flat movement has become the target of a censorship campaign aimed at stopping young Chinese workers from checking out. One government-backed newspaper called the movement “shameful,” and news agency Xinhua unfavorably compared “lying flattists” to front-line medical workers. The original manifesto, Lying Flat groups, and message boards where users posted photos of seals, cats, and themselves lying flat have been taken down.
Zijia Song writes of tang ping as partly a response to a traditional Chinese culture of competitiveness and overwork, but notes that there are similar movements in Japan, Korea, and the U.S., where “Black activists, writers and thinkers are among the clearest voices articulating this spiritual malaise and its solutions,” writes Rosenblum, “perhaps because they’ve borne the brunt of capitalism more than other groups of Americans.” Whatever their national origin, each of these statements defiantly claims the right to rest, posing a threat not only to the Party but to an ideal of human life as endless overwork for shiny trinkets and empty promises. A global pandemic and climate crisis have revealed, like nothing else, the need to slow down, rest, and completely reimagine the way we live.
We should just trust the experts. But wait: to identify true expertise requires its own kind of even more specialized expertise. Besides, experts disagree with each other, and over time disagree with themselves as well. This makes it challenging indeed for all of us non-experts — and we’re all non-experts in the fields to which we have not dedicated our lives — to understand phenomena of any complexity. As for grasping climate change, with its enormous historical scale and countless variables, should we just throw up our hands? Many have done so: Neil Halloran, creator of the short documentary Degrees of Uncertainty above, labels them “climate denialists” and “climate defeatists.”
Climate denialists choose to believe that manmade climate change isn’t happening, climate defeatists choose to believe that it’s inevitable, and both thereby let themselves off the hook. Not only do they not have to address the issue, they don’t even have to understand it — which itself can seem a fairly daunting task, given that scientists themselves express no small degree of uncertainty about climate change’s degree and trajectory. “The only way to learn how sure scientists are is to dig in a little and view their work with some healthy skepticism,” says Halloran. This entails developing an instinct not for refutation, exactly, but for examining just how the experts arrive at their conclusions and what pitfalls they encounter along the way.
Often, scientists “don’t know how close they are to the truth, and they’re prone to confirmation bias,” and as anyone professionally involved in the sciences knows full well, they work “under pressure to publish noteworthy findings.” Their publications then find their way to a media culture in which, increasingly, “trusting or distrusting scientists is becoming a matter of political identity.” As he did in his previous documentary The Fallen of World War II, Halloran uses animation and data visualization to illuminate his own path to understanding a global occurrence whose sheer proportions make it difficult to perceive.
This journey takes Halloran not just around the globe but back in time, starting in the year 19,000 B.C. and ending in projections of a future in which rising seas swallow much of Amsterdam, Miami, and New Orleans. The most important stop in the middle is the Age of Enlightenment and the Industrial Revolution of the 17th through the 19th century, when science and technology rose to prominence and brought about an unprecedented human flourishing — with climatic consequences that have begun to make themselves known, albeit not with absolute certainty. But as Halloran sees it, “uncertainty, the very thing that clouds our view, also frees us to construct possible answers.”
Over the past year, the story of evictions during COVID has often risen above the muck. It’s made headlines in major newspapers and TIME magazine, and received serious attention from the government, with stop-gap eviction moratoriums put into effect and renewed several times, and likely due to be renewed again. Stopping evictions is not enough, however. “For many landlords,” notes the United Way, “the order created a financial burden of housing renters with no payments,” and renters without income have no way to pay. Still, these measures have kept many thousands of vulnerable adults and children from experiencing homelessness.
And yet moratoriums aside, the number of people losing their homes is on the rise during the pandemic, with a disproportionate impact on Black, Latinx, and Indigenous communities, and shelters have been forced to close or lower capacity. Framing increasing homelessness solely as a crisis driven by the virus misses the fact that it has been growing since 2016, though it is down from pre-2007 levels. “Even before the current health/economic crisis,” notes a Homelessness Research Institute report, “the older adult homeless population was projected to trend upwards until 2030.”
Indeed, homelessness has seemed like a sad, inevitable fact of American life for decades. Rather than accept the situation, organizations like Invisible People have worked to end it. “The first step to solving homelessness,” they write, “is acknowledging that its victims are people. Regular people. Fathers. Mothers. Veterans. Whole families. Folks who fell on hard times and lost their core foundation of being human — their homes.” No one asks to be in the situation, and the longer a person goes unhoused, the harder it is for them to rebuild their lives.
Invisible People offers action steps and publishes well-researched journalism on the problems, and solutions, for the millions of people experiencing homelessness at any given time. But as their name suggests, their primary aim is to make the lives of unhoused people visible to those of us who tend to walk right by them in our haste. We can feel overwhelmed by the intractable scale of the problem, which tends to turn individuals into statistics. Invisible People asks us to “change the story,” and to start by approaching homelessness one person, or one family, at a time.
Invisible People was founded in Los Angeles by Mark Horvath, a former TV executive who became homeless after drug and alcohol addiction in 1995. After recovering, he lost his home again during the 2008 Recession. Horvath began interviewing people he met on the streets of L.A. and posting the videos to YouTube and Twitter. Soon, the project became a global one, incorporated as a non-profit, and Horvath has traveled across the U.S. and to Canada, Peru, and the UK to interview people living without homes.
The project, says Horvath, is designed to foster “a conversation about solutions to end homelessness [that] gives homeless people a chance to tell their own story.” Those stories are moving, human, unforgettable, and usually not at all what you might expect. You can see some of them here, and many more at the Invisible People YouTube channel. Connect with the organization and find out what you can do here.