What Happens When Blade Runner & A Scanner Darkly Get Remade with an Artificial Neural Network

Philip K. Dick, titling the 1968 novel that would provide the basis for Blade Runner, asked whether androids dream of electric sheep. But what goes on in the “mind” of an artificial intelligence designed specifically to watch movies? Terence Broad, a computing researcher at Goldsmiths, University of London, took on a form of that question for his master’s dissertation, using “artificial neural networks to reconstruct films — by training them to reconstruct individual frames from films, and then getting them to reconstruct every frame in a given film and resequencing it.”
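The technique Broad describes can be sketched in miniature. The toy example below is an assumption-laden illustration, not Broad’s actual model: it trains a tiny NumPy autoencoder (one hidden “bottleneck” layer; the frame data, layer sizes, and learning rate are all invented for the demo) to reconstruct a batch of stand-in frames, then runs every frame back through the trained network in its original order, as if remaking the film.

```python
import numpy as np

# Toy sketch of "train to reconstruct frames, then resequence" (hypothetical
# setup, not Broad's architecture). Stand-in "film": 50 tiny 8x8 grayscale
# frames, flattened into 64-dimensional vectors.
rng = np.random.default_rng(0)
frames = rng.random((50, 64))

# A 64 -> 16 -> 64 autoencoder: the 16-unit bottleneck forces the network
# to learn a compressed representation of each frame.
W1 = rng.normal(0, 0.1, (64, 16))  # encoder weights
W2 = rng.normal(0, 0.1, (16, 64))  # decoder weights

def forward(x, W1, W2):
    h = np.tanh(x @ W1)   # compress the frame
    return h, h @ W2      # attempt to reconstruct it

# Train with plain gradient descent on mean squared reconstruction error.
lr = 0.05
for _ in range(2000):
    h, out = forward(frames, W1, W2)
    err = out - frames
    gW2 = h.T @ err / len(frames)
    gW1 = frames.T @ ((err @ W2.T) * (1 - h ** 2)) / len(frames)
    W2 -= lr * gW2
    W1 -= lr * gW1

# "Remake" the film: reconstruct every frame, keeping the original order.
remade = forward(frames, W1, W2)[1]
mse = float(np.mean((remade - frames) ** 2))
# mse should fall well below ~0.33, the error of predicting all zeros
# for uniform [0, 1) pixels.
```

The reconstruction is lossy by design, since every frame must pass through the bottleneck; at film scale, with real footage and far larger networks, that loss is part of what gives reconstructions like Broad’s their strange, smeared character.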

“Neural networks” sounds like a term straight out of one of Dick’s influential science-fiction novels, but in recent years you’ve almost certainly heard quite a bit about them in real life. A neural network, in the words of neurocomputer pioneer Dr. Robert Hecht-Nielsen, “is a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.” These systems, in other words, imitate the problem-solving methods of the human brain as we currently understand them, and can, when provided with suitable data, “learn” from it.

One thinks less of the Replicants, Blade Runner’s lethally engineered superhumans, than of Number 5, the artificially intelligent robot star of Short Circuit (co-designed, incidentally, by Blade Runner’s “visual futurist” Syd Mead), with his constant demands for “input.” When it came out in the mid-1980s, that goofy comedy looked like by far the more successful film, but over the intervening three decades Ridley Scott’s one-time bomb has become perhaps the most respected work of its kind. “The first ever film remade by a neural network had to be Blade Runner,” Terence Broad told Vox, pointing in his explanation of his project to the movie’s prescient treatment of the theme “that the task of determining what is and isn’t human is becoming increasingly difficult, with the ever-increasing technological developments.”

Dick, as generations of his readers know, had deep concerns about the difference between the real and the unreal, and how human beings can ever tell one from the other. He tackled that issue again, from a very different angle, in his 1977 novel A Scanner Darkly. Richard Linklater turned that book into a movie almost thirty years later, one which Broad also fed into his neural network, which then attempted to reconstruct it. Though still thematically appropriate, its colorful rotoscoped animation posed more of a challenge, and “the results are less temporally coherent than the Blade Runner model.” But “on the other hand, the images are incredibly unusual and complex, once again producing video with a rich unpredictability.”

At the top of the post, you can watch Broad’s Blade Runner-trained neural network reconstruct Blade Runner’s trailer, and below that his A Scanner Darkly-trained neural network reconstruct A Scanner Darkly’s trailer. Curiosity demanded, of course, that Broad let a neural network trained to watch one film have a go at reconstructing the other, and just above we have the A Scanner Darkly-trained neural network’s reconstruction of Blade Runner. He’s also given Scott’s famous 1984-themed Super Bowl Apple ad and Godfrey Reggio’s Koyaanisqatsi the neural-network treatment. We read so often, these days, about artificial intelligence’s growing ability to out-think, out-work, and one day even out-create us. What on Earth, the Philip K. Dicks of our day must wonder, will the neural networks come up with when they can finally out-watch us?

Related Content:

Watch an Animated Version of Ridley Scott’s Blade Runner Made of 12,597 Watercolor Paintings

Philip K. Dick Previews Blade Runner: “The Impact of the Film is Going to be Overwhelming” (1981)

Ridley Scott Talks About Making Apple’s Landmark “1984” Commercial, Aired 30 Years Ago on Super Bowl Sunday

Watch Sunspring, the Sci-Fi Film Written with Artificial Intelligence, Starring Thomas Middleditch (Silicon Valley)

Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car”

Two Artificial Intelligence Chatbots Talk to Each Other & Get Into a Deep Philosophical Conversation

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on a book about Los Angeles, A Los Angeles Primer, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.




