Quick note: Whenever Apple releases a new version of iOS, Stanford eventually releases a course telling you how to develop apps in that environment. iOS 8 came out last fall, and now the iOS 8 app development course is getting rolled out this quarter. It’s free online, of course, on iTunes.
“Clarke sm” by Amy Marash. Licensed under Public Domain via Wikimedia Commons
When you want a vision of the future, I very much doubt you turn to Reader’s Digest for it. But Arthur C. Clarke did once appear in its small-format pages to provide just that, and when Arthur C. Clarke talks about the future, you’d do well to listen. Last year, we featured a 1964 BBC documentary in which the science-fiction luminary predicted the internet, 3D printers, and trained monkey servants. Today, we’d like to link you up to his Reader’s Digest predictions from the comparatively recent year of 2001 — one in which, for obvious reasons, Clarke made the media rounds — which you can read in full at arthurcclarke.net. Some highlights of his speculative timeline from 2001 to 2100:
By 2010, commercial nuclear devices, household quantum generators, and fully re-engineered automobile engines will have ended the Fossil Fuel Age. We’ll have seen the first acknowledged human clone and seen off the last human criminal.
By 2020, we’ll have discovered a 76-meter octopus, we’ll fly on “aerospace-planes” (one of which will carry Prince Harry) and trade in “mega-watt-hours” instead of any now-known currency, and tsunamis caused by a meteor will have wrecked the coasts of Greenland and Canada (prompting the development of new meteor-detecting technologies).
By 2030, artificial intelligence will have reached human level, we’ll have landed on Mars, computer-generated DNA will make possible a real-life Jurassic Park, and the neurological “braincap” will allow us the direct sensory experience of anything at all.
By 2040, the “universal replicator” will allow us to create any object at all in the comfort of our own homes, resulting in the phase-out of work and a boom in arts, entertainment, and education.
By 2050, Buckminster Fuller-style self-contained mobile homes will have become a reality, and humans scattered as far as “Earth, the Moon, Mars, Europa, Ganymede and Titan, and in orbit around Venus, Neptune and Pluto” will celebrate the centenary of Sputnik 1.
By 2090, Halley’s comet will have returned, and on it we’ll have found life forms that vindicate “Wickramasinghe and Hoyle’s century-old hypothesis that life exists throughout space.” We’ll also start burning fossil fuels again, both as a replacement for the carbon dioxide we’ve “mined” from the air and to forestall the next Ice Age by warming the globe back up a bit.
By 2100, we’ll have replaced rockets with a “space drive” that lets us travel close to the speed of light. And so, Clarke writes, “history begins…”
You’ll notice, of course, that we’re already behind Clarke’s vision, according to which many a still-improbable development also lies ahead in the near future. In any case, though, the end of crime, the beginning of private space travel, and the era of the Dymaxion home must come sooner or later, mustn’t they? And as Clarke himself admits, one plays a mug’s game when one predicts, even when one does it with uncommon astuteness. “In 1971 I predicted the first Mars Landing in 1994,” he remembers in the preamble to his list. “On the other hand, I thought I was being wildly optimistic in 1951 by suggesting a mission to the moon in 1978. Neil Armstrong and Buzz Aldrin beat me by almost a decade.”
But to this day, Clarke’s scorecard looks better than most of ours: “I take pride in the fact that communications satellites are placed exactly where I suggested in 1945, and the name ‘Clarke Orbit’ is often used (if only because it’s easier to say than ‘geostationary orbit’).” Who knows what he could tell us to watch out for now if, as he predicted in 2001, he’d lived to see his hundredth birthday aboard the Hilton Orbiter Hotel?
Founded in 1931, the Woodberry Poetry Room at Harvard University features (among other things) 6,000 recordings of poetry from the 20th and 21st centuries. There you can find some of the earliest recordings of W. H. Auden, Elizabeth Bishop, T. S. Eliot, Denise Levertov, Robert Lowell, Anaïs Nin, Ezra Pound, Robert Penn Warren, Tennessee Williams and many others.
In the “Listening Booth,” a section of the Poetry Room website, you can listen to recordings of classic readings by nearly 200 authors, including John Berryman, Robert Bly, Jorge Luis Borges, Joseph Brodsky, Jorie Graham, Seamus Heaney, Jack Kerouac, Adrienne Rich, Anne Sexton, Wallace Stevens, Dylan Thomas, Anne Waldman, William Carlos Williams and more. The sound files are all free to stream. And if this is your kind of thing, make sure you visit the Penn Sound archive at the University of Pennsylvania, which is an equally rich and amazing audio archive. We previously featured it here.
If you would like to sign up for Open Culture’s free email newsletter, please find it here. It’s a great way to see our new posts, all bundled in one email, each day.
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
Herman Melville’s Moby Dick, the work for which he is now best known, had the effect in his lifetime of ruining his literary reputation and driving him into obscurity. This is but one of many ironies attending the massive novel, first published in Britain in three volumes on October 18, 1851. At that time, it was simply called The Whale, and as Melville.org informs us, was “expurgated to avoid offending delicate political and moral sensibilities.” One month later, the first American edition appeared, now titled Moby Dick; Or, The Whale, compiled into one huge volume, and with its censored passages, including the Epilogue, restored. In both printings, the book sold poorly, and the reviews—save those from a handful of American critics, including Melville’s fellow Great American novelist Nathaniel Hawthorne—were largely negative.
Another irony surrounding the novel is one nearly everyone who’s read it, or tried to read it, will know well. We’re socialized through visual media to approach the story as a great, tragic action/adventure. As Melville’s friend, publisher Evert Augustus Duyckinck, described it, the novel is ostensibly “a romantic, fanciful & literal & most enjoyable presentment of the Whale Fishery,” driven by the revenge plot of mad old Captain Ahab. And yet, it is not that at all, or not simply that. Despite the fact that it lends itself so well to adventurous retelling, the novel itself can seem obscure, ponderous, and digressive to a maddening degree. The so-called “whaling chapters,” notably “Cetology,” delve deeply into the lore and technique of whaling, the anatomy and physiology of various whale species, and the history and politics of the venture.
Throughout the novel, ordinary objects and events—especially, of course, the whale itself—acquire such symbolic weight that they become almost cartoonish talismans and leap bewilderingly out of the narrative, forcing the reader to contemplate their significance—no easy task. Depending on your sensibilities and tolerance for Melville’s labyrinthine prose, these very strange features of the novel are either indispensably fascinating or just plain excess baggage. Since many editions are published with the whaling chapters excised, many readers clearly feel they are the latter. That is unfortunate, I think. It’s one of my favorite novels, in all its baroque overstuffedness and philosophical density. But there’s no denying that it works, as they say, “on many levels.” However you experience it, the book is either an incredibly gripping adventure tale, or a very dense and puzzling work of history, philosophy, politics, and zoology… or both, and more besides….
Recognizing the power of Melville’s arresting imagery, artist and librarian Matt Kish decided that he would illustrate all 552 pages of the Signet Classic paperback edition of Moby Dick, a book he considers “to be the greatest novel ever written.” He began the project in August of 2009 with the first page, illustrating those famous first words—“Call me Ishmael”—above. (At the top, see page 489, below it page 158, and directly below, page 116). Kish completed his epic project at the end of 2010. He used a variety of media—ink, watercolor, acrylic paint—and incorporated a number of different graphic art styles. As he explains in the comments under the first illustration, he chose “drawing and painting over pages from old books and diagrams because the presence of visual information on those pages would in some ways interfere with, and clutter up, my own obsessive control over my marks.” All in all, it’s a very admirable undertaking, and you can see each individual illustration, and many of the stages of drafting and composition, at Kish’s blog or on this list we’ve compiled. (You can also find links to the first 25 pages at the bottom of this post.) The entire project has also been published as a book, Moby-Dick in Pictures: One Drawing for Every Page, a further irony given the obsessive literariness of Melville’s novel, a work as obsessed with language as Captain Ahab is with his great white nemesis.
Nonetheless, what Kish’s project further demonstrates is the seemingly inexhaustible treasure house that is Moby Dick, a book that so richly appeals to all the senses as it also ceaselessly engages the intellect. Kish has gone on to apply his wonderful interpretive technique to other classic literary works, including Joseph Conrad’s Heart of Darkness and Italo Calvino’s Invisible Cities. These projects are equally striking, but it’s Moby Dick, “the great unread American novel,” that most inspired Kish, as it has so many other artists and readers.
The field of psychology is very different from what it used to be. Nowadays, the American Psychological Association has a code of conduct for experiments that ensures a subject’s confidentiality, consent and general mental well-being. In the old days, that wasn’t the case.
Back then, you could, for instance, con subjects into thinking that they were electrocuting a man to death, as researchers did in the infamous 1961 Milgram experiment, which left people traumatized and humbled by the knowledge that, deep down, they are little more than weak-willed puppets in the face of authority. You could also try to turn a group of unsuspecting orphans into stutterers by methodically undermining their self-esteem, as the folks who ran the aptly named Monster Study of 1939 did. But if you really want to wade into the swamp of moral dubiousness, look no further than the Little Albert experiments, which traumatized a baby into fearing dogs, Santa Claus and all things fuzzy.
In 1920, Johns Hopkins professor John B. Watson was fascinated with Ivan Pavlov’s research on conditioned stimulus. Pavlov famously rang a bell every time he fed his dogs. At first the food caused the dogs to salivate, but after a spell of pairing the bell with dinner, the dogs would eventually salivate at just the sound of the bell. That’s called a conditioned response. Watson wanted to see if he could create a conditioned response in a baby.
Enter 9-month-old Albert B., AKA Little Albert. At the beginning of the experiment, Albert was presented with a white rat, a dog, a white rabbit, and a mask of Santa Claus, among other things. The lad was unafraid of everything and was, in fact, really taken with the rat. Then every time the baby touched the animals, scientists struck a metal bar behind him, creating a startlingly loud bang. The sound freaked out the child, and soon, like Pavlov’s dogs, Little Albert grew terrified of the rat, the mask of Santa and even a fur coat. The particularly messed up thing about the experiment was that Watson didn’t even bother to reverse the psychological trauma he inflicted.
What happened to poor baby Albert is hard to say, in part because no one is really sure of the child’s true identity. He might have been Douglas Merritte, as psychologists Hall P. Beck and Sharman Levinson argued in 2009. If that’s the case, then the child died at the age of 6 in 1925 of hydrocephalus. Or he might have been William Albert Barger, as Russ Powell and Nancy Digdon argued in 2012. He passed away in 2007 at the age of 87. He reportedly had a lifelong aversion to dogs, though it cannot be determined if it was a lasting effect of the experiment.
Later in life, Watson left academia for advertising.
Jonathan Crow is a Los Angeles-based writer and filmmaker whose work has appeared in Yahoo!, The Hollywood Reporter, and other publications. You can follow him at @jonccrow. And check out his blog Veeptopus, featuring lots of pictures of badgers and even more pictures of vice presidents with octopuses on their heads. The Veeptopus store is here.
Did the weather have anything to do with those balls deflating in New England during the AFC championship game? It’s unlikely, very unlikely. Bill Nye explains why with science, but not without putting the hyped controversy into perspective first. Take it away, Bill.
Richard Dawkins — some know him as the Oxford evolutionary biologist who coined the term “meme” in his influential 1976 book, The Selfish Gene; others consider him a leading figure in the New Atheism movement, a position he has assumed unapologetically. In recent years, Dawkins has made his case against religion through different forms of media: books, documentaries, college lectures, and public debates. He can be aggressive and snide, to be sure. But he dishes out far less than he receives in return. Just witness him reading the “love letters” (as he euphemistically calls them) that he has received from the general public. They are not safe for work. You can see him reading a previous batch of letters here.
A perfect symbol of the mechanisms of British rule over India, the Salt Acts prohibited Indians from accessing and trading their own resources, forcing them to buy salt from British monopolies, which taxed the mineral heavily. In 1930, in one of the defining acts of his Satyagraha movement, Mohandas Gandhi decided to defy the Salt Act with a grand gesture—a march, with thousands of his supporters, of more than 200 miles to the Arabian Sea. Once there, following Gandhi’s lead, the crowd proceeded to collect sea salt, prompting British colonial police to arrest over 60,000 people, including Gandhi himself.
The 1930 action, the first organized act of civil disobedience after the Indian National Congress’ declaration of independence, got the attention of the Viceroy, Lord Irwin, who had been directing harsh repressive measures against the growing independence movement. In January of 1931, after Gandhi’s release, the two men signed a pact: Gandhi agreed to end the movement; Irwin agreed to allow Indians to make their own salt and to give them an equal role in negotiating India’s future. British officials were outraged and disgusted. Winston Churchill, for example, staunchly opposed to independence, called the meeting of the two leaders a “nauseating and humiliating spectacle,” saying “Gandhi-ism and everything it stands for will have to be grappled with and crushed.” (Churchill favored letting Gandhi die if he went on hunger strike.)
The terms of the pact, of course, did not hold, and the movement would continue until eventual independence in 1947. But Gandhi had not only succeeded in incurring the wrath of the British colonialists; he had also won many supporters in England. One of them, Muriel Lester, invited the Indian leader to stay with her in London at a community center she had founded called Kingsley Hall. “He enjoyed his stay here,” says the current Kingsley Hall manager David Baker, “and it was wise because if he had stayed in the West End the press would have lampooned him. He wouldn’t have had a life, but here he was left alone and walked around in the streets. He wanted to stay with the people that he lived with in India, i.e. the poor.” However, Gandhi wasn’t totally ignored by the press. While at Kingsley Hall, he delivered a short speech, which you can hear above, and the BBC was there to record it.
In the speech, Gandhi says absolutely nothing about Indian independence, British oppression, or the aims and tactics of the movement. He says nothing at all about politics or any worldly affairs whatsoever. Instead, he lectures on the existence of God, “an indefinable mysterious power that pervades everything,” and which “defies all proof.” Gandhi testifies to “an unalterable law governing everything and every being that exists or lives,” though he also confesses “that I have no argument to convince through reason.” Instead, he relies on analogies, on things he “dimly perceives,” on the “marvelous researches of [Indian engineer and scientist] Sir J.C. Bose,” and on “the experiences of an unbroken line of prophets and sages in all countries and climes.” It’s not a speech likely to persuade anyone who isn’t already some sort of believer, I think, but it’s of much interest to anyone curious about the history of Indian independence and about Gandhi’s life and message.