Dear Facebook, This is How You’re Breaking Democracy: A Former Facebook Insider Explains How the Platform’s Algorithms Polarize Our Society

Is this what we want? A post-truth world where toxicity and tribalism trump bridge building and consensus seeking? —Yaël Eisenstat

It’s an increasingly familiar occurrence.

A friend you’ve enjoyed reconnecting with in the digital realm makes a dramatic announcement on their social media page. They’re deleting their Facebook account within the next 24 hours, so shoot them a PM with your email if you’d like to stay in touch.

Such decisions used to be spurred by the desire to get more done or return to neglected pastimes such as reading, painting, and going for long unconnected nature walks.

These announcements could induce equal parts guilt and anxiety in those of us who depend on social media to get the word out about our low-budget creative projects, though, being prone to Internet addiction, we were nearly as likely to be the ones making the announcement.

For many, the break was temporary. More of a social media fast, a chance to reevaluate, rest, recharge, and ultimately return.

Legitimate concerns were also raised with regard to privacy. Who’s on the receiving end of all the sensitive information we’re offering up? What are they doing with it? Is someone listening in?

But in this election year, the decision to quit Facebook is apt to be driven by the very real fear that democracy as we know it is at stake.

Former CIA analyst, foreign service officer, and, for six months, Facebook’s Global Head of Elections Integrity Ops for political advertising, Yaël Eisenstat addresses these preoccupations in her TED Talk, “Dear Facebook, This is How You’re Breaking Democracy,” above.

Eisenstat contrasts the civility of her past face-to-face “hearts and minds”-based engagements with suspected terrorists and anti-Western clerics to the polarization and culture of hatred that Facebook’s algorithms foment.

As many users have come to suspect, Facebook rewards inflammatory content with amplification. Truth does not factor into the equation, nor does sincerity of message or messenger.

Lies are more engaging online than truth. As long as [social media] algorithms’ goals are to keep us engaged, they will feed us the poison that plays to our worst instincts and human weaknesses.

Eisenstat, who has valued the ease with which Facebook allows her to maintain relationships with far-flung friends, found herself effectively demoted on her second day at the social media giant, her title revised, and her access to high-level meetings revoked. Her hiring appears to have been purely ornamental, a palliative ruse in response to mounting public concern.

As she remarked in an interview with The Guardian’s Ian Tucker earlier this summer:

They are making all sorts of reactive changes around the margins of the issues, [to suggest] that they are taking things seriously – such as building an ad library or verifying that political advertisers reside in the country in which they are advertising – things they should have been doing already. But they were never going to make the fundamental changes that address the key systemic issues that make Facebook ripe for manipulation, viral misinformation and other ways that the platform can be used to affect democracy.

In the same interview she asserted that Facebook’s recently implemented oversight board is little more than an interesting theory that will never result in the total overhaul of its business model:

First of all, it’s another example of Facebook putting responsibility on someone else. The oversight board does not have any authority to actually address any of the policies that Facebook writes and enforces, or the underlying systemic issues that make the platform absolutely ripe for disinformation and all sorts of bad behaviour and manipulation.

The second issue is: it’s basically an appeal process for content that was already taken down. The bigger question is the content that remains up. Third, they are not even going to be operational until late fall and, for a company that claims to move fast and break things, that’s absurd.

Nine minutes into her TED Talk, she offers concrete suggestions for what the Facebook brass could do if it were truly serious about implementing reform:

  • Stop amplifying and recommending disinformation and bias-based hatred, no matter who is behind it, from conspiracy theorists to our current president.
  • Discontinue personalization techniques that don’t differentiate between targeted political content and targeted ads for athletic footwear.
  • Retrain algorithms to focus on metrics beyond what users click or linger on.
  • Implement safety features that would ensure that sensitive content is reviewed before it is allowed to go viral.

Hopefully viewers are not feeling maxed out on contacting their representatives, as government enforcement is Eisenstat’s only prescription for getting Facebook to alter its product and profit model. And that will require sustained civic engagement.

She supplements her TED Talk with two reading recommendations: artificial intelligence engineer Guillaume Chaslot’s insider-perspective op-ed “The Toxic Potential of YouTube’s Feedback Loop” and The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think by MoveOn.org’s former Executive Director, Eli Pariser.

Your clued-in Facebook friends have no doubt already pointed you to the documentary The Social Dilemma, which is now available on Netflix. Or perhaps to Jaron Lanier’s Ten Arguments for Deleting Your Social Media Accounts Right Now.

Read the transcript of Yaël Eisenstat’s TED Talk here.

Related Content:

The Problem with Facebook: “It’s Keeping Things From You”

The Case for Deleting Your Social Media Accounts & Doing Valuable “Deep Work” Instead, According to Computer Scientist Cal Newport

This Is Your Kids’ Brains on Internet Algorithms: A Chilling Case Study Shows What’s Wrong with the Internet Today

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine. Follow her @AyunHalliday.


