Carl Sagan’s “Baloney Detection Kit”: A Toolkit That Can Help You Scientifically Separate Sense from Nonsense

It’s probably no stretch to say that mass disinformation campaigns and rampant anti-intellectualism will constitute an increasing share of our political reality for the foreseeable future. As Hannah Arendt wrote, the political lie has always been with us. But its global reach, particular vehemence, and blatant contempt for verifiable reality seem like innovations of the present.

Given the embarrassing wealth of access to information and educational tools, maybe it’s fair to say that the first and last line of defense should be our own critical reasoning. When we fail to verify news using resources we all have in hand (I assume, since you’re reading this), the fault for believing bad information may lie with us.

But we so often don’t know what it is that we don’t know. Individuals can’t be blamed for an inadequate educational system, and one should not underestimate the near-impossibility of conducting time-consuming inquiries into the truth of every single claim that comes our way. It can feel like trying to identify individual droplets while getting hit in the face with a pressurized blast of targeted, contradictory information, some of it coming from shadowy, unreliable sources.

Carl Sagan understood the difficulty, and he also understood that a lack of critical thinking did not make people totally irrational and deserving of contempt. “It’s not hard to understand,” for example, why people would think their relatives are still alive in some other form after death. As he writes of this common phenomenon in “The Fine Art of Baloney Detection,” most supernatural beliefs are just “humans being human.”

In the essay, a chapter from his 1995 book The Demon-Haunted World, Sagan proposes a rigorous but comprehensible “baloney detection kit” to separate sense from nonsense.

  • Wherever possible there must be independent confirmation of the “facts.”
  • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  • Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  • Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives.
  • Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  • If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations.
  • If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  • Occam’s Razor. This convenient rule of thumb urges us, when faced with two hypotheses that explain the data equally well, to choose the simpler.
  • Always ask whether the hypothesis can be, at least in principle, falsified…. You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

Calling his recommendations “tools for skeptical thinking,” he lays out a means of compensating for the strong emotional pulls that “promise something like old-time religion” and of recognizing “a fallacious or fraudulent argument.” At the top of the post, in a video produced by Big Think, you can hear science writer and educator Michael Shermer explain the “baloney detection kit” that he himself adapted from Sagan, and just above, read Sagan’s own version, abridged into a short list (read it in full at Brain Pickings).

Like many a science communicator after him, Sagan was very much concerned with the influence of superstitious religious beliefs. He also foresaw a time in the near future much like our own. Elsewhere in The Demon-Haunted World, Sagan writes of “America in my children’s or grandchildren’s time…. when awesome technological powers are in the hands of a very few.” The loss of control over media and education renders people “unable to distinguish between what feels good and what’s true.”

This state involves, he says, a “slide… back into superstition” of the religious variety and also a general “celebration of ignorance,” such that well-supported scientific theories carry the same weight or less than explanations made up on the spot by authorities whom people have lost the ability to “knowledgeably question.” It’s a scary scenario that may not have completely come to pass… just yet, but Sagan knew as well as or better than anyone of his time how to address such a potential social epidemic.

Related Content:

Carl Sagan Predicts the Decline of America: Unable to Know “What’s True,” We Will Slide, “Without Noticing, Back into Superstition & Darkness” (1995)

Carl Sagan’s Syllabus & Final Exam for His Course on Critical Thinking (Cornell, 1986)

Carl Sagan’s Last Interview

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness



Comments (6)
  • Paul Tatara says:

    If you can’t trust authorities, how do you get independent confirmation of the facts?? I’m not being a wise-guy. I honestly don’t know how to work that out!

  • Nigel J Watson says:

    I really enjoyed his shows; esp Cosmos. I still have his epitaphial goodbye as he was making his exit via Cancer.

    Too bad, then, that he failed to bring his ‘detector’ (I constantly employ many of the tools on his list) to bear in his own realm of expertise.

  • Josh Jones says:

    Definitely a valid concern, Paul. I think there are questions we can ask about supposed authorities that help us determine whether or not they are reliable sources of information. What are their educational backgrounds and areas of expertise? Do they cite other experts or seem to just refer to themselves as sole authorities? Do they have a good reputation in a particular field? Do they have obvious biases or conflicts of interest that might cause them to misrepresent facts? I don’t think it’s true that we can’t trust all authorities, only that we can’t trust appeals to authority as guarantees of accuracy.

  • craig moreau says:

    Hubris, bias, greed and lust tend to convolute the truth, and the truth becomes subjective and even a paper dragon. With this in mind, tread carefully when authorities and experts come together or oppose each other. Sagan points out experts and authorities will take their baggage and fail to provide their version of the truth with enough truth to sustain it as history and science go on. So we are left with shifting ideas that may survive the unrelenting bias of the deplorable. God save the experts. Oh, that’s not right.

  • JS says:

    You must first admit that if you cannot discern what is probably trustworthy information, that’s a problem all by itself that you need to solve.

    A super high-IQ, highly educated, unbiased, experienced person has the skills to discern trustworthiness in their areas of understanding. Are you that? If not, you are something somewhat less, like most of us, and that is what you have with which to determine “trustworthiness.” Conversely, an illiterate, uneducated, low-IQ person who has never been outside a box is a poor judge of the trustworthiness of data.

    Socrates said that the basis of wisdom is having a decent idea of what things you don’t know. You don’t know how to confirm data and you admit it; that is a good first step.

  • Tim Jones says:

    The currency of Reason comes in two forms: Objective Philosophical Argumentation (or “O.P.A.” for short) and Hard Scientific Evidence (or “H.S.E.” for short). A theory must by those merits establish a prima facie case for itself. It is then rigorously tested in order to merit widespread acceptance. And even then, it is always vulnerable to being refuted by either countervailing data or the rise of a better theory. An “expert” is just someone who’s done this a lot, in their area of expertise.


Open Culture was founded by Dan Colman.