For her groundbreaking research on radioactivity, Marie Curie won the Nobel Prize. Or rather, she won two, one for physics and another for chemistry, making her the only Nobel Laureate in more than one science. What’s more, her first Nobel came in 1903, the very same year she completed her PhD thesis at the Sorbonne. In Recherches sur les substances radioactives (or Research on Radioactive Substances), Curie “talks about the discovery of the new elements radium and polonium, and also describes how she gained one of the first understandings of the new physical phenomenon of radioactivity.”
So says science YouTuber Toby Hendy in the introduction below to Curie’s thesis, a thesis that made her the first woman in France to receive a doctoral degree in physics. “Following on from the discovery of X-rays by Wilhelm Roentgen in 1895 and Henri Becquerel’s discovery that uranium salts emitted similar penetration properties,” says The Document Centre, Curie “investigated uranium rays as a starting point, but in the process discovered that the air around uranium rays is made to conduct electricity.”
Her deduction that “the process was caused by properties of the atoms themselves” — a revolutionary finding that overturned previously held notions in physics — led her eventually to discover radium and polonium, which would get her that second Nobel in 1911.
Unlike her Nobel Prize in physics, which she shared with her husband Pierre and the physicist Henri Becquerel, Marie Curie won her Nobel Prize in chemistry alone. By 1911 Pierre had been dead for half a decade, but Marie could not be stopped from carrying their pioneering research as far as she could take it in her own lifetime. She clearly knew how vast a field her work, with and without her husband, had opened up: “Our researches upon the new radio-active bodies have given rise to a scientific movement,” she writes at the end of Recherches sur les substances radioactives. That movement continues to make discoveries more than a century later — and her original thesis itself remains radioactive.
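Why the thesis stays radioactive follows from radium’s long half-life — about 1,600 years for radium-226 — so traces left on her papers in 1903 have barely begun to decay. A back-of-the-envelope sketch (the function name is ours):

```python
def fraction_remaining(elapsed_years, half_life_years):
    """Exponential decay: fraction of a radioactive sample left after t years."""
    return 0.5 ** (elapsed_years / half_life_years)

# Radium-226 has a half-life of roughly 1,600 years, so material that
# contaminated a notebook in 1903 is still about 95% intact today.
left = fraction_remaining(120, 1600)
assert 0.94 < left < 0.96
```

On that timescale, Curie’s papers will remain hazardous for well over a millennium.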
Despite having been around for well over a century, the Oreo cookie has managed to retain certain mysteries. Why, for example, does it never come apart evenly? Though different Oreo-eaters prefer different methods of Oreo-eating, an especially popular approach to the world’s most popular cookie involves twisting it open before consumption. That action produces two separate chocolate wafers, but as even kindergarteners know from long and frustrating experience, the crème filling sticks only to one side. It seems that no manual technique, no matter how advanced, can split the contents of an Oreo close to evenly, and only recently has a team of researchers at the Massachusetts Institute of Technology sought an explanation.
This endeavor necessitated an investigation of the Oreo’s rheology — the study of the flow of matter, especially liquids but also “soft solids” like crème filling. Like all scientific research, it involved intensive experimentation, and even the invention of a new measurement device: in this case, a simple 3D-printable “Oreometer” (seen in animated action above) that uses pennies and rubber bands.
With it the researchers applied “varying degrees of torque and angular rotation, noting the values that successfully twisted each cookie apart,” writes MIT News’ Jennifer Chu. “In all, the team went through about 20 boxes of Oreos, including regular, Double Stuf, and Mega Stuf levels of filling, and regular, dark chocolate, and ‘golden’ wafer flavors. Surprisingly, they found that no matter the amount of cream filling or flavor, the cream almost always separated onto one wafer.”
Crystal Owens, a mechanical engineering PhD candidate working on this project, puts this down in large part to how Oreos are made. “Videos of the manufacturing process show that they put the first wafer down, then dispense a ball of cream onto that wafer before putting the second wafer on top. Apparently that little time delay may make the cream stick better to the first wafer.” But other physical factors bear on the phenomenon as well, as documented in the paper Owens and her collaborators published earlier this year in the journal Physics of Fluids. “We introduce Oreology (/ɔriːˈɒlədʒi/), from the Nabisco Oreo for ‘cookie’ and the Greek rheo logia for ‘flow study,’ as the study of the flow and fracture of sandwich cookies,” they write in its abstract. For a scientifically inclined youngster, one could hardly imagine a more compelling field.
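The Oreometer’s principle is simple: hang pennies from an arm clamped to one wafer, and the known weight times the lever arm gives the applied torque. A minimal sketch — the penny mass is the U.S. standard (2.5 g), but the 2 cm arm length is our illustrative assumption, not a figure from the MIT paper:

```python
G = 9.81             # gravitational acceleration, m/s^2
PENNY_MASS = 0.0025  # mass of a modern U.S. penny, kg

def applied_torque(n_pennies, arm_length_m):
    """Torque in newton-metres from n pennies hung at the given lever arm."""
    return n_pennies * PENNY_MASS * G * arm_length_m

# Ten pennies on a hypothetical 2 cm arm apply roughly 5 mN*m of torque:
torque = applied_torque(10, 0.02)
assert abs(torque - 0.0049) < 0.0002
```

Adding pennies one at a time, and noting the count at which the cookie twists apart, turns failure torque into something a classroom can measure.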
We have become quite used to pronouncements of doom, from scientists predicting the sixth mass extinction due to the measurable effects of climate change, and from religionists declaring the apocalypse due to a surfeit of sin. It’s almost impossible to imagine these two groups of people agreeing on anything other than the ominous portent of their respective messages. But in the early days of the scientific revolution—the days of Shakespeare’s contemporary Francis Bacon and, later in the seventeenth century, Descartes—it was not at all unusual to find both kinds of reasoning, or unreasoning, in the same person, along with beliefs in magic, divination, astrology, etc.
Yet even in this maelstrom of heterodox thought and practices, Sir Isaac Newton stood out as a particularly odd co-existence of esoteric biblical prophecy, occult beliefs, and a rigid, formal mathematics that not only adhered to the inductive scientific method, but also expanded its potential by applying general axioms to specific cases.
Yet many of Newton’s general principles would seem totally inimical to the naturalism of most physicists today. As he was formulating the principles of gravity and three laws of motion, for example, Newton also sought the legendary Philosopher’s Stone and attempted to turn metal to gold. Moreover, the devoutly religious Newton wrote theological treatises interpreting Biblical prophecies and predicting the end of the world. The date he arrived at? 2060.
So then the time times & half a time are 42 months or 1260 days or three years & an half, recconing twelve months to a yeare & 30 days to a month as was done in the Calendar of the primitive year. And the days of short lived Beasts being put for the years of lived [sic] kingdoms, the period of 1260 days, if dated from the complete conquest of the three kings A.C. 800, will end A.C. 2060. It may end later, but I see no reason for its ending sooner.
Newton further demonstrates his confidence in the next sentence, writing that his intent, “though not to assert” an answer, should in any event “put a stop the rash conjectures of fancifull men who are frequently predicting the time of the end.” Indeed. So how did he arrive at this number? Newton applied a rigorous method, to be sure.
If you have the patience for an exhaustive description of how he worked out his prediction using the Book of Daniel, you may read one here by historian of science Stephen Snobelen, who also points out how widespread the interest in Newton’s odd beliefs has become, reaching across every continent — though scholars have known about this side of the Enlightenment giant for a long time.
For a sense of the exacting, yet completely bizarre flavor of Newton’s prophetic calculations, see another Newton letter at the end of the post, transcribed below.
Prop. 1. The 2300 prophetick days did not commence before the rise of the little horn of the He Goat.
2 Those day [sic] did not commence a[f]ter the destruction of Jerusalem & ye Temple by the Romans A.[D.] 70.
3 The time times & half a time did not commence before the year 800 in wch the Popes supremacy commenced
4 They did not commence after the re[ig]ne of Gregory the 7th. 1084
5 The 1290 days did not commence b[e]fore the year 842.
6 They did not commence after the reigne of Pope Greg. 7th. 1084
7 The diffence [sic] between the 1290 & 1335 days are a parts of the seven weeks.
Therefore the 2300 years do not end before ye year 2132 nor after 2370.
The time times & half time do n[o]t end before 2060 nor after 
The 1290 days do not begin [this should read: end] before 2090 [Newton might mean: 2132] nor after 1374 [sic; Newton probably means 2374]
The editorial insertions are those of Professor Snobelen, who thinks the letter dates “from after 1705,” and that “the shaky handwriting suggests a date of composition late in Newton’s life.” Whatever the exact date, we see him much less certain here; Newton pushes around some other dates—2344, 2090 (or 2132), 2374. All of them seem arbitrary, but “given the nice roundness of the number,” writes Motherboard, “and the fact that it appears in more than one letter,” 2060 has become his most memorable dating for the apocalypse.
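However strange the premises, the arithmetic is easy to check once Newton’s day-for-a-year convention is granted: each “prophetic day” counts as one calendar year, so a period of N days dated from year S ends in year S + N. A minimal sketch (the function name is ours):

```python
def prophetic_end(start_year, prophetic_days):
    """Newton's day-for-a-year rule: an N-day period starting in year S ends in S + N."""
    return start_year + prophetic_days

# 1260 days ("time, times & half a time") from the papal supremacy of A.D. 800:
assert prophetic_end(800, 1260) == 2060
# Snobelen's corrections to the letter's bounds check out the same way:
assert prophetic_end(842, 1290) == 2132    # 1290 days from its earliest start
assert prophetic_end(1084, 1290) == 2374   # 1290 days from its latest start
assert prophetic_end(70, 2300) == 2370     # 2300 days dated from A.D. 70
```

The same rule applied to 1260 days from 1084 yields 2344, the other date Newton floats in the letter.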
It’s important to note that Newton didn’t believe the world would “end” in the sense of ceasing to exist or burning up in holy flames. His end-times philosophy resembles that of a surprising number of present-day evangelicals: Christ would return and reign for a millennium, and the Jewish diaspora would return to Israel and would, he wrote, set up “a flourishing and everlasting Kingdom.” We hear such statements often from televangelists, school boards, governors, and presidential candidates.
As many people have argued, despite Newton’s conception of his scientific work as a bulwark against other theologies, it ultimately became a foundation for Deism and Naturalism, and has allowed scientists to make accurate predictions for hundreds of years. 20th century physics may have shown us a much more radically unstable universe than Newton ever imagined, but his theories are, as Isaac Asimov would put it, “not so much wrong as incomplete,” and still essential to our understanding of certain fundamental phenomena. But as fascinating and curious as Newton’s other interests may be, there’s no more reason to credit his prophetic calculations than those of the Millerites, Harold Camping, or any other apocalyptic doomsday sect.
Note: An earlier version of this post appeared on our site in 2015.
We hear a great deal today about the potential causes of rising sea levels. At a certain point, natural curiosity brings out the opposite question: what causes sea levels to fall? And for that matter, can a body of water so large simply vanish entirely? Such a thing did happen once, according to the PBS Eons video above. The story begins, from our perspective, with the discovery about a decade ago of a giant rabbit — or rather of the bones of a giant rabbit, one “up to six times heavier than your average cottontail” that “almost certainly couldn’t hop.” This odd, long-gone specimen was dubbed Nuralagus rex: “the rabbit king of Minorca,” the modern-day island it ruled from about five million to three million years ago.
After living for long periods of time on islands without natural predators, certain species take on unusual proportions. “But how did the normal-size ancestor of Nuralagus make it onto a Mediterranean island in the first place?” The answer is that Minorca wasn’t always an island. In fact, “mega-deposits” of salt under the floor of the Mediterranean suggest that, “at one point in history, the Mediterranean Sea must have evaporated.” As often in our investigation of the natural world, one strange big question leads to another even stranger and bigger one. Geologists’ long and complex project of addressing it has led them to posit a forbidding-sounding event called the Messinian Salinity Crisis, or MSC.
MSC-explaining theories include a “global cooling event” six million years ago whose creation of glaciers would have reduced the flow of water into the Mediterranean, and “tectonic events” that could have blocked off what we now know as the Strait of Gibraltar. But the cause now best supported by evidence involves a combination of shifts in the Earth’s crust and changes in its climate — sixteen full cycles of them. “During periods of decreasing sea level, the position and angle of the Earth changed with respect to the Sun, so there were periods of lower solar energy, and others of higher solar energy, which increased evaporation rates in the Mediterranean. At the same time, an actively folding and uplifting tectonic belt caused water input to decrease.”
The MSC seems to have lasted for over 600,000 years. At its driest point, 5.6 million years ago, “external water sources were completely cut off, and most of the water left behind in the Mediterranean basin was evaporating.” For sea creatures, the Mediterranean became uninhabitable, but those that lived on dry land had a bit of a field day. These relatively dry conditions “allowed hippos, elephants, and other megafauna from Africa to walk and swim across the Mediterranean,” constituting a great migration that would have included the ancestor of Nuralagus rex. But when the sea later filled back up — possibly due to a flood, as animated above — the rabbit king of Minorca learned that, even on a geological timescale, you can’t go home again.
The fate of the visionary is to be forever outside of his or her time. Such was the life of Nikola Tesla, who dreamed the future while his opportunistic rival Thomas Edison seized the moment. Even now the name Tesla conjures seemingly wildly impractical ventures, too advanced, too expensive, or far too elegant in design for mass production and consumption. No one better than David Bowie, the pop artist of possibility, could embody Tesla’s air of magisterial high seriousness on the screen. And few were better suited than Tesla himself, perhaps, to extrapolate from his time to ours and see the technological future clearly.
Of course, this image of Tesla as a lone, heroic, and even somewhat tragic figure who fell victim to Edison’s designs is a bit of a romantic exaggeration. As even the editor of a 1935 feature interview piece in the now-defunct Liberty magazine wrote, Tesla and Edison may have been rivals in the “battle between alternating and direct current…. Otherwise the two men were merely opposites. Edison had a genius for practical inventions immediately applicable. Tesla, whose inventions were far ahead of the time, aroused antagonisms which delayed the fruition of his ideas for years.” One can in some respects see why Tesla “aroused antagonisms.” He may have been a genius, but he was not a people person, and some of his views, though maybe characteristic of the times, are downright unsettling.
In the lengthy Liberty essay, “as told to George Sylvester Viereck” (a poet and Nazi sympathizer who also interviewed Hitler), Tesla himself makes the pronouncement, “It seems that I have always been ahead of my time.” He then goes on to enumerate some of the ways he has been proven right, and confidently lists the characteristics of the future as he sees it. No one likes a know-it-all, but Tesla refused to compromise or ingratiate himself, though he suffered for it professionally. And he was, in many cases, right. Many of his 1935 predictions in Liberty are still too far off to measure, and some of them will seem outlandish, or criminal, to us today. But some still seem plausible, and a few advisable if we are to make it another 100 years as a species. Tesla’s predictions include the following, which he introduces with the disclaimer that “forecasting is perilous. No man can look very far into the future.”
“Buddhism and Christianity… will be the religion of the human race in the twenty-first century.”
“The year 2100 will see eugenics universally established.” Tesla went on to comment, “no one who is not a desirable parent should be permitted to produce progeny. A century from now it will no more occur to a normal person to mate with a person eugenically unfit than to marry a habitual criminal.”
“Hygiene, physical culture will be recognized branches of education and government. The Secretary of Hygiene or Physical Culture will be far more important in the cabinet of the President of the United States who holds office in the year 2025 than the Secretary of War.” Along with personal hygiene, Tesla included “pollution” as a social ill in need of regulation.
“I am convinced that within a century coffee, tea, and tobacco will be no longer in vogue. Alcohol, however, will still be used. It is not a stimulant but a veritable elixir of life.”
“There will be enough wheat and wheat products to feed the entire world, including the teeming millions of China and India.” (Tesla did not foresee the anti-gluten mania of the 21st century.)
“Long before the next century dawns, systematic reforestation and the scientific management of natural resources will have made an end of all devastating droughts, forest fires, and floods. The universal utilization of water power and its long-distance transmission will supply every household with cheap power.” Along with this optimistic prediction, Tesla foresaw that “the struggle for existence being lessened, there should be development along ideal rather than material lines.”
Tesla goes on to predict the elimination of war, “by making every nation, weak or strong, able to defend itself,” after which war chests would be diverted to funding education and research. He then describes—in rather fantastical-sounding terms—an apparatus that “projects particles” and transmits energy, enabling not only a revolution in defense technology, but “undreamed of results in television.” Tesla diagnoses his time as one in which “we suffer from the derangement of our civilization because we have not yet completely adjusted ourselves to the machine age.” The solution, he asserts—along with most futurists, then and now—“does not lie in destroying but in mastering the machine.” As an example of such mastery, Tesla describes the future of “automatons” taking over human labor and the creation of “a thinking machine.”
When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is…. We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket.
Tesla also made some odd predictions about fuel-less passenger flying machines “free from any limitations of the present airplanes and dirigibles” and spouted more of the scary stuff about eugenics that had come to obsess him late in life. Additionally, Tesla saw changing gender relations as the precursor of a coming matriarchy. This was not a development he characterized in positive terms. For Tesla, feminism would “end in a new sex order, with the female as superior.” (As Novak notes, Tesla’s misgivings about feminism have made him a hero to the so-called “men’s rights” movement.) While he fully granted that women could and would match and surpass men in every field, he warned that “the acquisition of new fields of endeavor by women, their gradual usurpation of leadership, will dull and finally dissipate feminine sensibilities, will choke the maternal instinct, so that marriage and motherhood may become abhorrent and human civilization draw closer and closer to the perfect civilization of the bee.”
It seems to me that a “bee civilization” would appeal to a eugenicist, except, I suppose, Tesla feared becoming a drone. Although he saw the development as inevitable, he still sounds to me like any number of current politicians who argue that society should continue to suppress and discriminate against women for their own good and the good of “civilization.” Tesla may be an outsider hero for geek culture everywhere, but his social attitudes give me the creeps. While I’ve personally always liked the vision of a world in which robots do most of the work and we spend most of our money on education, when it comes to the elimination of war, I’m less sanguine about particle rays and more sympathetic to the words of Ivor Cutler.
Note: An earlier version of this post appeared on our site in 2015.
Particularly if SpaceX CEO Elon Musk achieves his goal of establishing a permanent human presence on Mars.
Surely at some point in their long travels to and residence on Mars, those pioneers would get down to business in much the same way that rats, fruit flies, parasitic wasps, and Japanese rice fish have while under observation on prior space expeditions.
Meanwhile, we’re seriously lacking in human data.
A pair of human astronauts, Jan Davis and Mark Lee, made history in 1992 as the first married couple to enter space together, but NASA insisted their relations remained strictly professional for the duration, and that a shuttle’s crew compartment is too small for the sort of antics a nasty-minded public kept asking about.
In an interview with Men’s Health, Colonel Mike Mullane, a veteran of three space missions, confirmed that a spacecraft’s layout doesn’t favor romance:
The only privacy would have been in the air lock, but everybody would know what you were doing. You’re not out there doing a spacewalk. There’s no reason to be in there.
Shortly after Davis and Lee returned to earth, NASA formalized an unspoken rule prohibiting husbands and wives from venturing into space together. It did little to squelch public interest in space sex.
One wonders if NASA’s rule has been rewritten in accordance with the times. Air lock aside, might same-sex couples remain free to swing what heteronormative marrieds (arguably) cannot?
This is but one of hundreds of space sex questions begging further consideration.
In Physiology News Magazine, Dr. Adam Watkins, associate professor of Reproductive and Developmental Physiology at the University of Nottingham, suggests that internal and external atmospheric changes would make such things, pardon the pun, hard:
Firstly, just staying in close contact with each other under zero gravity is hard. Secondly, as astronauts experience lower blood pressure while in space, maintaining erections and arousal are more problematic than here on Earth.
The exceptionally forthright Col. Mullane has some contradictory firsthand experience that should come as a relief to all humankind:
A couple of times, I would wake up from sleep periods and I had a boner that I could have drilled through kryptonite.
Watch above a classic movie made by David Rogers at Vanderbilt University in the 1950s. It shows “a neutrophil (a type of white blood cell) chasing a bacterium through a field of red blood cells in a blood smear. After pursuing the bacterium around several red blood cells, the neutrophil finally catches up to and engulfs its prey. In the human body, these cells are an important first line of defense against bacterial infection. The speed of rapid movements such as cell crawling can be most easily measured by the method of direct observation.” This comforting video comes courtesy of the estate of David Rogers, Vanderbilt University.
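Measuring crawling speed by direct observation amounts to tracking the cell’s position frame by frame, summing the step lengths, and dividing by elapsed time. A minimal sketch — the coordinates and frame interval below are hypothetical, for illustration only:

```python
import math

def crawl_speed(positions_um, frame_interval_s):
    """Average crawling speed in micrometres per minute from (x, y) positions."""
    path = sum(math.dist(a, b) for a, b in zip(positions_um, positions_um[1:]))
    elapsed_s = frame_interval_s * (len(positions_um) - 1)
    return path / elapsed_s * 60

# Four hypothetical frames, 10 s apart, each step about 3 micrometres:
speed = crawl_speed([(0, 0), (3, 0), (3, 3), (6, 3)], 10)
assert abs(speed - 18.0) < 1e-9  # 9 um over 30 s -> 18 um/min
```

Rogers’s film made exactly this kind of measurement possible long before automated cell tracking existed.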
“Where’re you from?” one character asks another on the Firesign Theatre’s classic 1969 album How Can You Be in Two Places at Once When You’re Not Anywhere at All. “Nairobi, ma’am,” the other replies. “Isn’t everybody?” Like most of the countless multi-layered gags on their albums, this one makes a cultural reference, presumably to the discoveries made by famed paleoanthropologists Louis and Mary Leakey over the previous 20 years. Their discovery of fossils in Kenya and elsewhere did much to advance the thesis that humankind evolved in Africa, and that the process was underway more than 1.75 million years ago.
Like all scientific breakthroughs, the Leakeys’ work only prompted more questions — or rather, created more opportunities for refining and adding detail to the relevant body of knowledge. Subsequent digs all over Africa have produced further evidence of how far our species and its predecessors go back, and where exactly the evolutionary progress happened.
Just this month, Nature published a new paper on the “age of the oldest known Homo sapiens from eastern Africa.” These new findings about known fossils, originally discovered in southwestern Ethiopia in 1967, suggest that the time has come for another revision of the long pre-history of humanity.
photo by Céline Vidal
The paper’s authors, writes Reuters’ Will Dunham, “used the geochemical fingerprints of a thick layer of ash found above the sediments containing the fossils to ascertain that it resulted from an eruption that spewed volcanic fallout over a wide swathe of Ethiopia roughly 233,000 years ago.” These fossils “include a rather complete cranial vault and lower jaw, some vertebrae and parts of the arms and legs.” After their initial discovery by the late Richard Leakey, son of Louis and Mary (and a man genuinely from Nairobi, born and raised), the fossils buried by this prehistoric Vesuvius were previously believed to be “no more than about 200,000 years old.”
Dunham quotes the paper‘s lead author, University of Cambridge volcanologist Celine Vidal, as saying this discovery aligns with “the most recent scientific models of human evolution placing the emergence of Homo sapiens sometime between 350,000 to 200,000 years ago.” Though Vidal and her team’s analysis of the ash’s geochemical composition has determined the minimum age of Omo I, as these fossils are known, the maximum age remains an open question. Or at least, it awaits the efforts of researchers to date the “ash layer below the sediment containing the fossils” and render a more precise estimate. And when that’s established, it will then, ideally, become material for the next big absurdist comedy troupe.