It shouldn’t be especially controversial to point out that we live in a pivotal time in human history—that the actions we collectively take (or that plutocrats and technocrats take) will determine the future of the human species, or whether we even have a future in the coming centuries. The threats posed by climate change and war are exacerbated and accelerated by rapidly worsening economic inequality. And exponential advances in technology threaten to outpace our ability to control machines, leaving us to be controlled, or stamped out, by them.
It’s also the case that our most well-regarded scientists and technological innovators have not remained silent in the face of these crises. Physicist Stephen Hawking has issued some dire warnings lately when it comes to humanity’s future. Several years ago, he predicted that “our only chance of long term survival” may be to “spread out into space,” à la Interstellar. In addition to the worsening climate crisis, the rise of artificial intelligence concerns Hawking. Along with Bill Gates and Elon Musk, he has warned of what futurist Ray Kurzweil has called “the singularity,” the point at which machine intelligence surpasses our own.
Where Kurzweil has seen this event through an optimistic, New Age lens, Hawking’s view seems more in line with dystopian sci-fi visions of robot apocalypse. “Success in AI would be the biggest event in human history,” he wrote in The Independent last year. “Unfortunately it might also be the last.” Given the design of autonomous weapons systems and, as he told the BBC, the fact that “Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded,” the prospect looks chilling, but it isn’t inevitable.
Our tech isn’t actively out to get us. “The real risk with AI isn’t malice but competence,” Hawking clarified in a fascinating Reddit “Ask Me Anything” session last month. Because of the physicist’s physical limitations, the session ran asynchronously: readers posted questions and voted on their favorites, and from these, Hawking selected the “ones he feels he can give answers to.” In response to a top-rated question about the so-called “Terminator Conversation,” he wrote, “A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.”
This problem of misaligned goals is not, of course, limited to our relationship with machines. Our precarious economic relationships with each other pose a separate threat, especially in the face of massive job loss due to future automation. We’d like to imagine a future in which technology frees us of toil and want, the kind of society Buckminster Fuller sought to create. But the truth is that wealth and income inequality, at their highest levels in the U.S. since at least the Gilded Age, may determine a very different path—one we might think of in terms of “The Elysium Conversation.” Asked in the same Reddit AMA, “Do you foresee a world where people work less because so much work is automated? Do you think people will always either find work or manufacture more work to be done?,” Hawking elaborated:
If machines produce everything we need, the outcome will depend on how things are distributed. Everyone can enjoy a life of luxurious leisure if the machine-produced wealth is shared, or most people can end up miserably poor if the machine-owners successfully lobby against wealth redistribution. So far, the trend seems to be toward the second option, with technology driving ever-increasing inequality.
In the decades after the Cold War, capitalism held the status of an unquestionable, sacred doctrine—the end of history and the best of all possible worlds. Now, not only has Hawking identified its excesses as drivers of human decline, but so have other decidedly non-Marxist figures like Bill Gates, who in a recent Atlantic interview described the private sector as “in general inept” and unable to address the climate crisis because of its focus on short-term gains and maximal profits. “There’s no fortune to be made,” he said, from dealing with some of the biggest threats to our survival. But if we don’t deal with them, the losses are incalculable.