This Is Your Kids’ Brains on Internet Algorithms: A Chilling Case Study Shows What’s Wrong with the Internet Today

Multimedia artist and writer James Bridle has a new book out, and it's terrifying—appropriately so, I would say—in its analysis of "the dangers of trusting computers to explain (and, increasingly, run) the world," as Adi Robertson writes at The Verge. Summing up one of his arguments in his New Dark Age: Technology and the End of the Future, Bridle writes, "We know more and more about the world, while being less and less able to do anything about it." As Bridle tells Robertson in a short interview, he doesn't see the problems as irremediable, provided we gain "some kind of agency within these systems." But he insists that we must face head-on certain facts about our dystopian, sci-fi-like reality.

In the brief TED talk above, you can see Bridle do just that, beginning with an analysis of the millions of proliferating videos for children, with billions of views, on YouTube, a case study that quickly goes to some disturbing places. Videos showing a pair of hands unwrapping chocolate eggs to reveal a toy within "are like crack for little kids," says Bridle, and children watch them over and over. Autoplay ferries them on to weirder and weirder iterations, which eventually end up with dancing Hitlers and their favorite cartoon characters performing lewd and violent acts. Some of the videos seem to be made by professional animators and "wholesome kid's entertainers," some seem assembled by software, some by "people who clearly shouldn't be around children at all."

The algorithms that drive the bizarre universe of these videos are used to "hack the brains of very small children in return for advertising revenue," says Bridle. "At least that's what I hope they're doing it for." Bridle soon bridges the machinery of kids' YouTube with the adult version. "It's impossible to know," he says, who's posting these millions of videos, "or what their motives might be…. Really it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from." The children's videos are "basically fake news for kids. We're training them from birth to click on the very first link that comes along, regardless of what the source is."

High school and college teachers already deal with the problem of students who cannot judge good information from bad—and who cannot really be blamed for it, since millions of adults seem unable to do so as well. In surveying YouTube children's videos, Bridle finds himself asking the same questions that arise in response to so much online content: "Is this a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore?" The language of online content is a hash of popular tags meant to be read by machine algorithms, not humans. But real people performing in an "algorithmically optimized system" seem forced to "act out these increasingly bizarre combinations of words."

Within this culture, he says, "even if you're human, you have to end up behaving like a machine just to survive." What makes the scenario even darker is that machines replicate the worst aspects of human behavior, not because they're evil but because that's what they're taught to do. To think that technology is neutral is a dangerously naïve view, Bridle argues. Humans encode their historical biases into the data, then entrust critical functions to A.I.: not only children's entertainment, but also predictive policing and the recommendation of criminal sentences. As Bridle notes in the short video above, A.I. inherits the racism of its creators, rather than acting as a "leveling force."

As we've seen the CEOs of tech companies taken to task for the use of their platforms for propaganda, disinformation, hate speech, and wild conspiracy theories, we've also seen them respond to the problem by promising to solve it with more automated machine learning algorithms. In other words, to address the issues with the same technology that created them—technology that no one really seems to understand. Letting "unaccountable systems" driven almost solely by ads control global networks with ever-increasing influence over world affairs seems wildly irresponsible, and has already created a situation, Bridle argues in his book, in which imperialism has "moved up to infrastructure level" and conspiracy theories are the most "powerful narratives of our time," as he says below.

Bridle's claims might themselves sound like alarmist conspiracies if they weren't so alarmingly obvious to most anyone paying attention. In an essay on Medium, he offers a much more in-depth analysis of YouTube kids' content, developing one of the arguments in his book. Bridle is one of many writers and researchers covering this terrain. Other good popular books on the subject come from scholars and technologists like Tim Wu and Jaron Lanier. They are well worth reading and paying attention to, even if we might disagree with some of their arguments and prescriptions.

As Bridle himself argues in his interview at The Verge, the best approach to dealing with what seems like a nightmarish situation is to develop a "systemic literacy," learning "to think clearly about subjects that seem difficult and complex," but which nonetheless, as we can clearly see, have tremendous impact on our everyday lives and the society our kids will inherit.

Related Content:

How Information Overload Robs Us of Our Creativity: What the Scientific Research Shows

The Case for Deleting Your Social Media Accounts & Doing Valuable "Deep Work" Instead, According to Prof. Cal Newport

The Diderot Effect: Enlightenment Philosopher Denis Diderot Explains the Psychology of Consumerism & Our Wasteful Spending

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness






Open Culture was founded by Dan Colman.