Multimedia artist and writer James Bridle has a new book out, and it’s terrifying—appropriately so, I would say—in its analysis of “the dangers of trusting computers to explain (and, increasingly, run) the world,” as Adi Robertson writes at The Verge. Summing up one of his arguments in his New Dark Age: Technology and the End of the Future, Bridle writes, “We know more and more about the world, while being less and less able to do anything about it.” As Bridle tells Robertson in a short interview, he doesn’t see the problems as irremediable, provided we gain “some kind of agency within these systems.” But he insists that we must face head-on certain facts about our dystopian, sci-fi-like reality.
In the brief TED talk above, you can see Bridle do just that, beginning with an analysis of the millions of proliferating videos for children, with billions of views, on YouTube, a case study that quickly goes to some disturbing places. Videos showing a pair of hands unwrapping chocolate eggs to reveal a toy within “are like crack for little kids,” says Bridle, and kids watch them over and over. Autoplay ferries them on to weirder and weirder iterations, which eventually end up with dancing Hitlers and their favorite cartoon characters performing lewd and violent acts. Some of the videos seem to be made by professional animators and “wholesome kids’ entertainers,” some seem assembled by software, some by “people who clearly shouldn’t be around children at all.”
The algorithms that drive the bizarre universe of these videos are used to “hack the brains of very small children in return for advertising revenue,” says Bridle. “At least that’s what I hope they’re doing it for.” Bridle soon bridges the machinery of kids’ YouTube with the adult version. “It’s impossible to know,” he says, who’s posting these millions of videos, “or what their motives might be…. Really it’s exactly the same mechanism that’s happening across most of our digital services, where it’s impossible to know where this information is coming from.” The children’s videos are “basically fake news for kids. We’re training them from birth to click on the very first link that comes along, regardless of what the source is.”
High school and college teachers already deal with the problem of students who cannot distinguish good information from bad—and who cannot really be blamed for it, since millions of adults seem unable to do so as well. In surveying YouTube children’s videos, Bridle finds himself asking the same questions that arise in response to so much online content: “Is this a bot? Is this a person? Is this a troll? What does it mean that we can’t tell the difference between these things anymore?” The language of online content is a hash of popular tags meant to be read by machine algorithms, not humans. But real people performing in an “algorithmically optimized system” seem forced to “act out these increasingly bizarre combinations of words.”
Within this culture, he says, “even if you’re human, you have to end up behaving like a machine just to survive.” What makes the scenario even darker is that machines replicate the worst aspects of human behavior, not because they’re evil but because that’s what they’re taught to do. To think that technology is neutral is a dangerously naïve view, Bridle argues. Humans encode their historical biases into the data, then entrust to A.I. such critical functions as not only children’s entertainment, but also predictive policing and recommending criminal sentences. As Bridle notes in the short video above, A.I. inherits the racism of its creators, rather than acting as a “leveling force.”
As we’ve seen the CEOs of tech companies taken to task for the use of their platforms for propaganda, disinformation, hate speech, and wild conspiracy theories, we’ve also seen them respond to the problem by promising to solve it with more automated machine learning algorithms. In other words, to address the issues with the same technology that created them—technology that no one really seems to understand. Letting “unaccountable systems” driven almost solely by ads control global networks with ever-increasing influence over world affairs seems wildly irresponsible, and has already created a situation, Bridle argues in his book, in which imperialism has “moved up to infrastructure level” and conspiracy theories are the most “powerful narratives of our time,” as he says below.
Bridle’s claims might themselves sound like alarmist conspiracies if they weren’t so alarmingly obvious to most anyone paying attention. In an essay on Medium, he offers a much more in-depth analysis of YouTube kids’ content, developing one of the arguments in his book. Bridle is one of many writers and researchers covering this terrain. Other good popular books on the subject come from scholars and technologists like Tim Wu and Jaron Lanier. They are well worth reading and paying attention to, even if we might disagree with some of their arguments and prescriptions.
As Bridle himself argues in his interview at The Verge, the best approach to dealing with what seems like a nightmarish situation is to develop a “systemic literacy,” learning “to think clearly about subjects that seem difficult and complex,” but which nonetheless, as we can clearly see, have tremendous impact on our everyday lives and the society our kids will inherit.