Philip K. Dick, titling the 1968 novel that would provide the basis for Blade Runner, asked whether androids dream of electric sheep. But what goes on in the “mind” of an artificial intelligence designed specifically to watch movies? Terence Broad, a computing researcher at Goldsmiths, University of London, took on a form of that question for his master’s dissertation, using “artificial neural networks to reconstruct films — by training them to reconstruct individual frames from films, and then getting them to reconstruct every frame in a given film and resequencing it.”
“Neural networks” sounds like a term straight out of one of Dick’s influential science-fiction novels, but you’ve almost certainly heard quite a bit about them in real life in recent years. A neural network, in the words of neurocomputing pioneer Dr. Robert Hecht-Nielsen, “is a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.” These systems, in other words, imitate the problem-solving methods of the human brain as we currently understand them, and can, when provided with suitable data, “learn” from that data.
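For the technically inclined, here is a rough sketch of the idea in code — not Broad’s actual model, which was more sophisticated than this, but a minimal convolutional autoencoder in PyTorch that learns to compress and reconstruct individual frames and then runs every frame of the film through itself in order. The folder name, frame size, and architecture are all illustrative assumptions.

```python
# A minimal sketch (not Broad's actual model) of the idea behind the project:
# train an autoencoder on individual film frames, then reconstruct every frame
# in order and resequence the outputs into a "remade" film.
# Assumes frames have already been extracted to ./frames as same-size PNGs.

import glob
from PIL import Image
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms

class FrameDataset(Dataset):
    """Loads extracted film frames as RGB tensors in [0, 1]."""
    def __init__(self, folder):
        self.paths = sorted(glob.glob(f"{folder}/*.png"))
        self.to_tensor = transforms.Compose([
            transforms.Resize((144, 256)),  # illustrative frame size
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, i):
        return self.to_tensor(Image.open(self.paths[i]).convert("RGB"))

class FrameAutoencoder(nn.Module):
    """Convolutional encoder/decoder: frame -> compact latent code -> reconstructed frame."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_and_reconstruct(frame_folder="frames", epochs=10):
    data = DataLoader(FrameDataset(frame_folder), batch_size=16, shuffle=True)
    model = FrameAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()  # plain pixel-wise loss here; a real model might use something richer

    # Training: the network only ever sees individual frames, never their order.
    for _ in range(epochs):
        for batch in data:
            optimizer.zero_grad()
            loss_fn(model(batch), batch).backward()
            optimizer.step()

    # Reconstruction: run every frame through the model in its original order,
    # so resequencing the outputs yields the reconstructed film.
    ordered = DataLoader(FrameDataset(frame_folder), batch_size=16, shuffle=False)
    with torch.no_grad():
        return torch.cat([model(batch) for batch in ordered])
```

Note that a network trained this way never sees the frames in sequence; whatever temporal coherence appears in the output comes only from consecutive frames resembling one another, which goes some way toward explaining why a restlessly animated film would reconstruct less smoothly than a conventionally shot one.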
One thinks less of the Replicants, Blade Runner’s lethally engineered superhumans, than of Number 5, the artificially intelligent robot star of Short Circuit (co-designed, incidentally, by Blade Runner’s “visual futurist” Syd Mead), with his constant demands for “input.” When it came out in the mid-1980s, that goofy comedy looked like by far the more successful film, but over the intervening three decades Ridley Scott’s one-time bomb has become perhaps the most respected work of its kind. “The first ever film remade by a neural network had to be Blade Runner,” Terence Broad told Vox, pointing in his explanation of his project to the movie’s prescient treatment of the theme “that the task of determining what is and isn’t human is becoming increasingly difficult, with the ever-increasing technological developments.”
Dick, as generations of his readers know, had deep concerns about the difference between the real and the unreal, and about how human beings can ever tell one from the other. He tackled that issue again, from a very different angle, in his 1977 novel A Scanner Darkly. Richard Linklater turned that book into a movie almost thirty years later, and Broad fed that film, too, into a neural network, which then attempted to reconstruct it. Though the film remains thematically appropriate, its colorful rotoscoped animation posed more of a challenge, and “the results are less temporally coherent than the Blade Runner model.” But “on the other hand, the images are incredibly unusual and complex, once again producing video with a rich unpredictability.”
At the top of the post, you can watch Broad’s Blade Runner-trained neural network reconstruct Blade Runner’s trailer, and below that his A Scanner Darkly-trained neural network reconstruct A Scanner Darkly’s trailer. Curiosity demanded, of course, that Broad let a neural network trained to watch one film have a go at reconstructing the other, and just above we have the A Scanner Darkly-trained neural network’s reconstruction of Blade Runner. He’s also given Scott’s famous 1984-themed Super Bowl Apple ad and Godfrey Reggio’s Koyaanisqatsi the neural-network treatment. We read so often, these days, about artificial intelligence’s growing ability to out-think, out-work, and one day even out-create us. What on Earth, the Philip K. Dicks of our day must wonder, will the neural networks come up with when they can finally out-watch us?
Related Content:
Watch an Animated Version of Ridley Scott’s Blade Runner Made of 12,597 Watercolor Paintings
Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car”
Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on a book about Los Angeles, A Los Angeles Primer, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.