When people say things like “the science is settled” or “the science has changed,” researchers tend to grind their teeth. Science can come to a broad consensus, as in the case of the coronavirus or climate change, but it is never perfectly settled as a bloc on any question. Scientific knowledge proceeds not by attaining certainty but, as Isaac Asimov once wrote, by being less wrong than those who came before.
And scientists advance in scientific publishing, as Aeon writes, not with certainty, but with “excitement, baby steps and reams of rejections.” As we see in the short film above, The Researcher’s Article, by French filmmaker Charlotte Arene, getting one’s research published can be “a patience-testing exercise in rejection, rewriting and waiting,” demonstrated here by the travails of physicists Frédéric Restagno and Julien Bobroff of the University of Paris-Saclay.
Even before submitting their findings, the scientists must carefully fit their work into the traditional form known as the “letter,” a document of four pages or fewer that condenses years of research into strictly succinct paragraphs, graphs, and references. The “letter” is “one of the most popular formats of articles in physics,” say the physicists, noting the major Nobel Prize-winning discoveries to appear as letters in recent years, including the Higgs boson paper recognized by the 2013 prize, which ran only two pages.
Summing up “a massive amount of data,” short scientific articles then go on to prove themselves to their respective fields through a refereeing process in which three anonymous scientists read the work and recommend publication, revision, or rejection. This process can go several rounds and take several months. One must be persistent: Restagno and Bobroff were rejected by several journals before finally getting an acceptance.
After this significant investment of time and effort, the authors may end up with a readership of perhaps twenty people. But crowd size is not the point, they say, “because research is made up of all these small discoveries,” contributing to a larger picture, informing and correcting each other, and going about the humble, painstaking business of trying to be less wrong than their predecessors, while still building on the best insights of hundreds of years of scientific publishing.
If you keep up with climate change news, you see a lot of predictions of what the world will look like twenty years from now, fifty years from now, a century from now. Some of these projections of the state of the land, the shape of continents, and the levels of the sea are more dramatic than others, and in any case they vary so much that one never knows which ones to credit. But of equal importance to foreseeing what Earth will look like in the future is not forgetting what it looks like now — or so goes the premise of the Earth Archive, a scientific effort to “scan the entire surface of the Earth before it’s too late.”
This ambitious project has three goals: to “create a baseline record of the earth as it is today to more effectively mitigate the climate crisis,” to “build a virtual, open-source planet accessible to all scientists so we can better understand our world,” and to “preserve a record of the Earth for our grandchildren’s grandchildren so they can study & recreate our lost heritage.”
All three depend on the creation of a detailed 3D model of the globe — but “globe” is the wrong word, bringing to mind as it does a sphere covered with flat images of land and sea.
Using lidar (short for Light Detection & Ranging), a technology that “involves shooting a dense grid of infrared beams from an airplane towards the ground,” the Earth Archive aims to create not an image but “a dense three-dimensional cloud of points” capturing the whole planet. At the top of the post, you can see a TED Talk on the Earth Archive’s origin, purpose, and potential by archaeologist and anthropology professor Chris Fisher, the project’s founder and director. “Fisher had used lidar to survey the ancient Purépecha settlement of Angamuco, in Mexico’s Michoacán state,” writes Atlas Obscura’s Isaac Schultz. “In the course of that work, he saw human-caused changes to the landscape, and decided to broaden his scope.”
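The geometry behind those points is simple enough to sketch: each laser return is a time-of-flight measurement, converted to a range and projected along the beam’s direction from the aircraft. Below is a minimal, purely illustrative Python sketch of that conversion; the function name, the local coordinate frame, and the omission of aircraft-attitude and atmospheric corrections are all simplifying assumptions of mine, not the Earth Archive’s actual processing pipeline.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_return_to_point(sensor_xyz, azimuth_deg, off_nadir_deg, round_trip_s):
    """Convert one lidar return into a 3D ground point.

    sensor_xyz    -- aircraft position in a local east/north/up frame (metres)
    azimuth_deg   -- beam heading, degrees clockwise from north
    off_nadir_deg -- beam tilt away from straight down
    round_trip_s  -- measured round-trip travel time of the laser pulse

    Real airborne lidar also corrects for aircraft attitude, atmospheric
    refraction, and GPS/IMU error; none of that is modelled here.
    """
    r = C * round_trip_s / 2.0  # one-way range to the ground
    az, tilt = math.radians(azimuth_deg), math.radians(off_nadir_deg)
    dx = r * math.sin(tilt) * math.sin(az)   # east offset
    dy = r * math.sin(tilt) * math.cos(az)   # north offset
    dz = -r * math.cos(tilt)                 # down toward the ground
    x0, y0, z0 = sensor_xyz
    return (x0 + dx, y0 + dy, z0 + dz)

# A pulse fired straight down from 1,000 m returns in about 6.7 microseconds:
print(lidar_return_to_point((0.0, 0.0, 1000.0), 0.0, 0.0, 6.671e-6))
```

Repeat that computation for billions of pulses as the plane sweeps its grid, and you get the “dense three-dimensional cloud of points” the project describes.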
Now, Fisher and Earth Archive co-director Steve Leisz want to create “a comprehensive archive of lidar scans” to “fuel an immense dataset of the Earth’s surface, in three dimensions.” This comes with certain obstacles, not least the price tag: a scan of the Amazon rainforest would take six years and cost $15 million. “The next step,” writes Schultz, “could be to use some future technology that puts lidar in orbit and makes covering large areas easier.” Disinclined to wait around for the development of such a technology while forests burn and coastlines erode, Fisher and Leisz are taking their first steps — and taking donations — right now. On the off chance that humans of centuries ahead develop the ability to recreate the planet as we know it today, it’s the Earth Archive’s data they’ll rely on to do it.
Broadly speaking, the “Space Race” of the 1950s and 60s involved two major players, the United States and the Soviet Union. But there were also minor players: take, for instance, the Zambian Space Program, founded and administered by just one man. A Time magazine article published in November 1964 — when the Republic of Zambia was one week old — described Edward Mukuka Nkoloso as a “grade-school science teacher and the director of Zambia’s National Academy of Science, Space Research and Philosophy.” Nkoloso had a plan “to beat the U.S. and the Soviet Union to the moon. Already Nkoloso is training twelve Zambian astronauts, including a 16-year-old girl, by spinning them around a tree in an oil drum and teaching them to walk on their hands, ‘the only way humans can walk on the moon.’ ”
Nkoloso and his quixotic space program seem to have drawn as much attention as the subject of the article, Zambia’s first president Kenneth David Kaunda. Namwali Serpell tells Nkoloso’s story in a piece for The New Yorker: not just the conception and failure of his entry into the Space Race (“the program suffered from a lack of funds,” Serpell writes, “for which Nkoloso blamed ‘those imperialist neocolonialists’ who were, he insisted, ‘scared of Zambia’s space knowledge’”), but also his background as “a freedom fighter in Kaunda’s United National Independence Party.”
Born in 1919 in then-Northern Rhodesia, Nkoloso received a missionary education, got drafted into World War II by the British, took an interest in science during his service, and came home to illegally found his own school. There followed periods as a salesman, a “political agitator,” and a messianic liberator figure, ending with his capture and imprisonment by colonial authorities.
How on Earth could this all have convinced Nkoloso to aim for Mars? Some assume he experienced a psychological break due to torture endured at the hands of Northern Rhodesian police. Some see his ostensible interplanetary ambitions as a cover for the training he was giving his “Afronauts” for guerrilla-style direct political action. Some describe him as a kind of national court jester: Serpell quotes from the memoir of San Francisco Chronicle columnist Arthur Hoppe, author of a series of contemporary pieces on the Zambian Space Program, who “believed it was the Africans who were satirizing our multi-billion-dollar space race against the Russians.” As Serpell points out, “Zambian irony is very subtle,” and as a satirist Nkoloso had “the ironic dédoublement — the ability to split oneself — that Charles Baudelaire saw in the man who trips in the street and is already laughing at himself as he falls.”
Whatever Nkoloso’s purposes, the Zambian Space Program has attracted new attention in the years since documentary footage of its facilities and training procedures found its way to YouTube. This fascinatingly eccentric chapter in the history of man’s heavenward aspirations has become the subject of short documentaries like the one from SideNote at the top of the post, as well as the subject of artworks like the short film Afronauts above. Nkoloso died more than 30 years ago, but he now lives on as an icon of Afrofuturism, a movement (previously featured here on Open Culture) at what Serpell calls “the nexus of black art and technoculture.” No figure embodies Afrofuturism quite so thoroughly as Sun Ra, who transformed himself from the Alabama-born Herman Poole Blount into a peace-preaching alien from Saturn. Though Nkoloso never seems to have met his American contemporary, such an encounter would surely, as a subject for Afrofuturistic art, be truly out of this world.
We don’t call it a tragedy when a renowned person dies after the century mark, especially if that person is brilliant NASA mathematician Katherine Johnson, who passed away yesterday at the venerable age of 101. Her death is a great historical loss, but by almost any measure we would consider reaching such a finish line a triumphant end to an already heroic life.
A prodigy and pioneer, Johnson joined the all-black “human computing” section at NASA’s predecessor, the National Advisory Committee for Aeronautics, in 1953. She would go on to calculate the launch windows and return trajectories for Alan Shepard’s first spaceflight, John Glenn’s first trip into orbit, and the Apollo Lunar Module’s first return from the Moon.
All this without the benefit of any machine computing power to speak of and—as Hidden Figures dramatizes through the powerful performance of Taraji P. Henson as Johnson—while facing the dual barriers of racism and sexism her white male bosses and co-workers blithely ignored or deliberately upheld.
Johnson and her fellow “computers,” without whom none of these major milestones would have been possible, had to fight not only for recognition and a seat at the table, but for the basic accommodations we take for granted in every workplace.
Her contributions didn’t end when the space race was over—her work was critical to the Space Shuttle program and she even worked on a mission to Mars. But Johnson herself kept things in perspective, telling People magazine in the 2016 interview above, “I’m 98. My greatest accomplishment is staying alive.” Still, she lived to see herself turned into the hero of that year’s critically lauded film based on the bestselling book of the same name by Margot Lee Shetterly—decades after she completed her most groundbreaking work.
Shetterly’s book, writes historian of technology Marie Hicks, casts Johnson and her fellow black women mathematicians “as protagonists in the grand drama of American technological history rather than mere details.” By its very nature, a Hollywood film adaptation will leave out important details and take liberties with the facts for dramatic effect and mass appeal. The feature treatment moves audiences, but it also soothes them with feel-good moments that “keep racism at arm’s length from a narrative that, without it, would never have existed.”
The point is not that Johnson and her colleagues decided to make racism and sexism central to their stories; they simply wanted to be recognized for their contributions and given the same access and opportunities as their white male colleagues. But to succeed, they had to work together rather than compete with each other. Despite its simplifications of Cold War history and its glosses over the depth of prejudice in American society, Hidden Figures does something very different from most biopics, as Atlantic editor Lenika Cruz writes, telling “a story of brilliance, but not of ego. It’s a story of struggle and willpower, but not of individual glory… it looks closely at the remarkable person in the context of a community.”
Katherine Johnson lived her life as a tremendous example for young women of color who excel at math and science but feel excluded from the establishment. On her 98th birthday, she “wanted to share a message to the young women of the world,” says the narrator of the 20th Century Studios video above: “Now it’s your turn.” And, she might have added, “you don’t have to do it alone.” Hear Hidden Figures author Shetterly discuss the critical contributions of Katherine and her extraordinary “human computer” colleagues in the interview below, and learn more about Johnson’s life and legacy in the featurette at the top and at her NASA biography here.
When I hear the word robot, I like to imagine Isaac Asimov’s delightfully Yiddish-inflected Brooklynese pronunciation of the word: “ro-butt,” with heavy stress on the first syllable. (A quirk shared by Futurama’s crustacean Doctor Zoidberg.) Asimov warned us that robots could be dangerous and impossible to control. But he also showed young readers—in his Norby series of kids’ books written with his wife Janet—that robots could be heroic companions, saving the solar system from cosmic supervillains.
The word robot conjures all of these associations in science fiction: from Blade Runner’s replicants to Star Trek’s Data. We might refer to these particular examples as androids rather than robots, but this confusion is precisely to the point. Our language has forgotten that robots started in sci-fi as more human than human, before they became Asimov-like machines. Like the sci-fi writer’s pronunciation of robot, the word originated in Eastern Europe in 1921, the year after Asimov’s birth, in a play by Czech intellectual Karel Čapek called R.U.R., or “Rossum’s Universal Robots.”
The title refers to the creations of Mr. Rossum, a Frankenstein-like inventor and possible inspiration for Metropolis’s Rotwang (who was himself an inspiration for Dr. Strangelove). Čapek told the London Saturday Review after the play premiered that Rossum was a “typical representative of the scientific materialism of the last [nineteenth] century,” with a “desire to create an artificial man—in the chemical and biological, not mechanical sense.”
Rossum did not wish to play God so much as “to prove God to be unnecessary and absurd.” This was but one stop on “the road to industrial production.” As technology analyst and Penn State professor John M. Jordan writes at the MIT Press Reader, Čapek’s robots were not appliances become sentient, nor trusty, superpowered sidekicks. They were, in fact, invented to be slaves.
The robot… was a critique of mechanization and the ways it can dehumanize people. The word itself derives from the Czech word “robota,” or forced labor, as done by serfs. Its Slavic linguistic root, “rab,” means “slave.” The original word for robots more accurately defines androids, then, in that they were neither metallic nor mechanical.
Jordan describes this history in an excerpt from his book Robots, part of the MIT Press Essential Knowledge Series, and a timelier than ever intervention in the cultural and technological history of robots, who walk (and moonwalk) among us in all sorts of machine forms, if not quite yet in the sense Čapek imagined. But a Blade Runner-like scenario seemed inevitable to him in a society ruled by “utopian notions of science and technology.”
In the time he imagines, he says, “the product of the human brain has escaped the control of human hands.” Čapek has one character, the robot Radius, make the point plainly:
The power of man has fallen. By gaining possession of the factory we have become masters of everything. The period of mankind has passed away. A new world has arisen. … Mankind is no more. Mankind gave us too little life. We wanted more life.
Sound familiar? While R.U.R. owes a “substantial” debt to Mary Shelley’s Frankenstein, it’s also clear that Čapek contributed something original to the critique, a vision of a world in which “humans become more like their machines,” writes Jordan. “Humans and robots… are essentially one and the same.” Beyond the surface fears of science and technology, the play that introduced the word robot to the cultural lexicon also introduced the darker social critique in most stories about them: We have reason to fear robots because in creating them, we’ve recreated ourselves; then we’ve treated them the way we treat each other.
“The Mummy Speaks!” announces The New York Times in Nicholas St. Fleur’s story about Nesyamun, a mummified Egyptian priest whose voice has been recreated, sort of, “with the aid of a 3‑D printed vocal tract” and an electronic larynx. Does the mummy sound like the monster of classic 1930s horror? Scientists have only got as far as one syllable, “which resembles the ‘ah’ and ‘eh’ vowel sounds heard in the words ‘bad’ and ‘bed.’ ” Yet it’s clear that Nesyamun would not communicate with guttural moans.
This may not make the recreation any less creepy. Nesyamun, whose coffin is inscribed with the words “true of voice,” was charged with singing and chanting the liturgies; “he had this wish,” says David Howard, speech scientist at Royal Holloway, University of London, “that his voice would somehow continue into perpetuity.” Howard and his team’s 3‑D printed recreation of his mouth and throat has allowed them to synthesize “the sound that would come out of his vocal tract if he was in his coffin and his larynx came to life again.”
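Howard’s team did this physically, pushing the buzz of an electronic larynx through a printed replica of the priest’s vocal tract. The same source-filter idea can be sketched digitally: a glottal pulse train passed through the resonances, or formants, that a vocal tract imposes. The sketch below is purely illustrative and uses generic textbook formant values for an “ah”-like vowel; the actual study measured its printed tract’s acoustics directly rather than assuming such values.

```python
import numpy as np
from scipy.signal import lfilter
from scipy.io import wavfile

FS = 16_000  # sample rate in Hz

def resonator(freq_hz, bandwidth_hz):
    """Coefficients of a two-pole IIR filter modelling one formant resonance."""
    r = np.exp(-np.pi * bandwidth_hz / FS)
    theta = 2 * np.pi * freq_hz / FS
    return [1.0], [1.0, -2.0 * r * np.cos(theta), r * r]

def synth_vowel(formants, f0=110.0, seconds=1.0):
    """Drive a cascade of formant resonators with a crude glottal pulse train."""
    n = int(FS * seconds)
    source = np.zeros(n)
    source[::int(FS / f0)] = 1.0  # impulse train standing in for the larynx
    out = source
    for freq, bw in formants:
        b, a = resonator(freq, bw)
        out = lfilter(b, a, out)
    return out / np.max(np.abs(out))

# Generic published formant values for an 'ah'-like vowel (frequency, bandwidth
# in Hz) -- illustrative only, not measurements from Nesyamun's tract:
ah = synth_vowel([(700, 90), (1200, 110), (2600, 160)])
wavfile.write("ah.wav", FS, (ah * 32767).astype(np.int16))
```

Change the formant frequencies and the same source produces a different vowel, which is exactly why the shape of Nesyamun’s preserved tract matters so much to the result.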
Let’s imagine a different scenario, shall we? One in which Nesyamun speaks from the ancient past rather than from the sarcophagus. “Voice from the Past” is, indeed, what the researchers call their project, and they hope that it will eventually enable museum goers to “engage with the past in completely new and innovative ways.”
If Nesyamun could be made to speak again, St. Fleur writes, “perhaps the mummy could recite for museum visitors his words to Nut, the ancient Egyptian goddess of the sky and heavens: ‘O mother Nut, spread out your wings over my face so you may allow me to be like the stars-which-know-no-destruction, like the stars-which-know-no-weariness, (and) not to die over again in the cemetery.’”
Might we empathize? As University of York archaeologist John Schofield puts it, “there is nothing more personal than someone’s voice.” Hearing the mummy speak would be “more multidimensional” than only staring at his corpse. The novelty of this experience aside, one can imagine the knowledge historians and linguists of ancient languages might gather from this research. Others in the scientific community have expressed their doubts. We may wish to temper our expectations.
Piero Cosi, an Italian speech scientist who helped reconstruct the voice of a mummified iceman named Ötzi in 2016 (speaking only in Italian vowels), points out the speculative nature of the science: “Even if we have the precise 3‑D-geometric description of the voice system of the mummy, we would not be able to rebuild precisely his original voice.” Egyptologist Kara Cooney notes the clear potential for human biases to shape research that uses “so much inference about what [ancient people] looked or sounded like.”
So, what might be the value of approximating Nesyamun’s voice? In their paper, published in Nature Scientific Reports, Howard and his co-authors explain, in language that sounds suspiciously like the kind that might invoke a classic horror movie mummy’s curse:
While this approach has wide implications for heritage management/museum display, its relevance conforms exactly to the ancient Egyptians’ fundamental belief that ‘to speak the name of the dead is to make them live again.’ Given Nesyamun’s stated desire to have his voice heard in the afterlife in order to live forever, the fulfilment of his beliefs through the synthesis of his vocal function allows us to make direct contact with ancient Egypt.
Learn more about Nesyamun’s vocal recreation in the videos above.
In the early 19th century, Aristotle’s Meteorologica still guided scientific ideas about the climate. The model “sprang from the ancient Greek concept of klima,” as Ian Beacock writes at The Atlantic, a static scheme that “divided the hemispheres into three fixed climatic bands: polar cold, equatorial heat, and a zone of moderation in the middle.” It wasn’t until the 1850s that the study of climate developed into what historian Deborah Cohen describes as “dynamic climatology.”
Indeed, 120 years before Exxon Mobil learned about—and then seemingly covered up—global warming, pioneering researchers discovered the greenhouse effect, the tendency for a closed environment like our atmosphere to heat up when carbon dioxide levels rise. The first person on record to link CO2 and global warming, amateur scientist Eunice Newton Foote, presented her research to the Eighth Annual Meeting of the American Association for the Advancement of Science in 1856.
Foote’s paper, “Circumstances affecting the heat of the sun’s rays,” was reviewed the following month in the pages of Scientific American, in a column that approved of her “practical experiments” and noted, “this we are happy to say has been done by a lady.” She used an air pump, glass cylinders, and thermometers to compare the effects of sunlight on “carbonic acid gas” (or carbon dioxide) and “common air.” From her rudimentary but effective demonstrations, she concluded:
An atmosphere of that gas [CO2] would give to our earth a high temperature; and if as some suppose, at one period of its history the air had mixed with it a larger proportion than at present, an increased temperature…must have necessarily resulted.
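Foote’s cylinders showed the effect qualitatively; the quantitative physics came later. As a modern back-of-envelope companion to her conclusion, here is the standard zero-dimensional energy-balance estimate of how much warming the atmosphere supplies, using textbook values she never had:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.30     # fraction of incoming sunlight Earth reflects

# Balance absorbed sunlight against emitted heat: S0 * (1 - a) / 4 = sigma * T^4
t_no_greenhouse = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

print(f"Earth with a transparent atmosphere: {t_no_greenhouse:.0f} K "
      f"({t_no_greenhouse - 273.15:.0f} deg C)")
print("Observed mean surface temperature:    288 K (15 deg C)")
print(f"Warming supplied by greenhouse gases: ~{288 - t_no_greenhouse:.0f} K")
```

The calculation yields roughly 255 K for an Earth without infrared-absorbing gases, about 33 degrees colder than the planet we actually live on—the gap Foote’s “carbonic acid gas” helps to explain.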
Unfortunately, her achievement would be eclipsed three years later when Irish physicist John Tyndall, who likely knew nothing of Foote, made the same discovery. With his superior resources and privileges, Tyndall was able to take his research further. “In retrospect,” one climate science database writes, Tyndall has emerged as the founder of climate science, though that view “hides a complex, and in many ways more interesting story.”
Neither Tyndall nor Foote wrote about the effect of human activity on the contemporary climate. It would take until the 1890s for Swedish scientist Svante Arrhenius to predict human-caused warming from industrial CO2 emissions. But subsequent developments depended upon their insights. Foote, who was born 200 years ago this past July, was marginalized almost from the start. “Entirely because she was a woman,” the Public Domain Review points out, “Foote was barred from reading the paper describing her findings.”
Furthermore, Foote “was passed over for publication in the Association’s annual Proceedings.” Her paper was published in The American Journal of Science, but was mostly remarked upon, as in the Scientific American review, for the marvel of such homespun ingenuity from “a lady.” The review, titled “Scientific Ladies—Experiments with Condensed Gas,” opened with the sentence “Some have not only entertained, but expressed the mean idea, that women do not possess the strength of mind necessary for scientific investigation.”
The praise of Foote credits her as a paragon of her gender, while failing to convey the universal importance of her discovery. At the AAAS conference, the Smithsonian’s Joseph Henry praised Foote by declaring that science was “of no country and of no sex,” a statement that has proven time and again to be untrue in practice. The condescension and discrimination Foote endured points to the multiple ways in which she was excluded as a woman—not only from the scientific establishment but from the educational institutions and funding sources that supported it.
Her disappearance, until recently, from the history of science “plays into the Matilda Effect,” Leila McNeill argues at Smithsonian, “the trend of men getting credit for female scientists’ achievements.” In this case, there’s no reason not to credit both scientists, who made original discoveries independently. But Foote got there first. Had she been given the credit she was due at the time—and the institutional support to match—there’s no telling how far her work would have taken her.
Just as Foote’s discovery places her firmly within climate science history, retrospectively, her “place in the scientific community, or lack thereof,” writes Amara Huddleston at Climate.gov, “weaves into the broader story of women’s rights.” Foote attended the first Women’s Rights Convention in Seneca Falls, NY, in 1848, and her name is fifth on the list of signatories to the “Declaration of Sentiments,” a document demanding full equality in social status, legal rights, and educational, economic, and, Foote would have added, scientific opportunities.
My first thought upon seeing the delicate, anatomy-based work of the 23-year-old embroidery artist and medical student Emmi Khan was that the Girl Scouts must have expanded the categories of skills eligible for merit badges.
(If memory serves, there was one for embroidery, but it certainly didn’t look like a cross-sectioned brain, or a sinus cavity.)
Closer inspection revealed that the circular views of Khan’s embroideries are not quite as tiny as the round badges stitched to high-achieving Girl Scouts’ sashes, but rather are still framed in the wooden hoops that are an essential tool of this artist’s trade.
Methods both scientific and artistic are a source of fascination for Khan, who began taking needlework inspiration from anatomy as an undergrad studying biomedical sciences. As she writes on her Moleculart website:
Science has particular methods: it is fundamentally objective, controlled, empirical. Similarly, art has particular methods: there is an emphasis on subjectivity and exploration, but there is also an element of regulation regarding how art is created… e.g. what type of needle to use to embroider or how to prime a canvas.
The procedures and techniques adopted by scientists and artists may be very different. Ultimately, however, they both have a common aim. Artists and scientists both want to 1) make sense of the vastness around them in new ways, and 2) present and communicate it to others through their own vision.
A glimpse at the flowers, intricate stitches, and other dainties that populate her Pinterest boards offers a further peek into Khan’s methods, and might prompt some readers to pick up a needle themselves, even those with no immediate plans to embroider a karyotype or The Circle of Willis, the circular anastomosis of arteries at the base of the brain.
The Cardiff-based medical student delights in embellishing her threaded observations of internal organs with the occasional decorative element—sunflowers, posies, and the like…
She makes herself available on social media to answer questions on subjects ranging from embroidery tips to her relationship to science as a devout Muslim, and to share works in progress, like a set of lungs that embody the Four Seasons, commissioned by a customer in the States.