What Happens When Blade Runner & A Scanner Darkly Get Remade with an Artificial Neural Network

Philip K. Dick, titling the 1968 novel that would provide the basis for Blade Runner, asked whether androids dream of electric sheep. But what goes on in the “mind” of an artificial intelligence designed specifically to watch movies? Terence Broad, a computing researcher at Goldsmiths, University of London, took on a form of that question for his master’s dissertation, using “artificial neural networks to reconstruct films — by training them to reconstruct individual frames from films, and then getting them to reconstruct every frame in a given film and resequencing it.”
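
How would one even begin to "remake" a film this way? Broad's own model is considerably more sophisticated than anything we could reproduce here, but the general recipe he describes (train a network to reconstruct individual frames, then run every frame through the trained model in order) can be sketched with a plain convolutional autoencoder. The PyTorch sketch below is an illustrative assumption, not Broad's actual architecture; the tiny 64-by-64 frames and the random "frames" tensor stand in for real video data.

```python
# A minimal sketch of the reconstruct-and-resequence idea, assuming a
# plain convolutional autoencoder. Broad's real model differs; every
# name and dimension here is illustrative.
import torch
import torch.nn as nn

class FrameAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Compress each 3x64x64 frame down to a small latent code...
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
        )
        # ...then decode that code back into a full frame.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = FrameAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in for frames extracted from the film:
# shape (num_frames, 3, 64, 64), pixel values in [0, 1].
frames = torch.rand(16, 3, 64, 64)

# Training: learn to reconstruct individual frames.
for epoch in range(10):
    recon = model(frames)
    loss = loss_fn(recon, frames)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# "Remaking" the film: run every frame through the trained model in
# its original order; the resequenced reconstructions are the new video.
with torch.no_grad():
    remade = model(frames)
```

Because the network can only express what it has learned to compress, the output is not a copy but a dreamlike approximation, which is precisely what gives Broad's videos their smeared, hallucinatory look.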

“Neural networks” sounds like a term straight out of one of Dick’s influential science-fiction novels, but you’ve almost certainly heard quite a bit about them in real life in recent years. A neural network, in the words of neurocomputer pioneer Dr. Robert Hecht-Nielsen, “is a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.” These systems, in other words, imitate the problem-solving methods of the human brain as we currently understand them and, when provided with suitable data, can “learn” from it.
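
To make that definition concrete, here is a toy example: a handful of those “simple, highly interconnected processing elements” learning the XOR function, a problem a single element cannot solve alone. Everything in this sketch (the network shape, the learning rate, the use of PyTorch) is an illustrative choice of ours, not anything drawn from Broad's project.

```python
# A toy network "learning" from data, in the Hecht-Nielsen sense:
# connection weights are nudged, step by step, to reduce error.
import torch
import torch.nn as nn

# The XOR problem: four inputs, four target outputs.
inputs  = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
targets = torch.tensor([[0.], [1.], [1.], [0.]])

# Two inputs feed a small hidden layer, which feeds one output:
# simple processing elements, highly interconnected.
net = nn.Sequential(nn.Linear(2, 4), nn.Tanh(),
                    nn.Linear(4, 1), nn.Sigmoid())
optimizer = torch.optim.SGD(net.parameters(), lr=0.5)

for step in range(5000):
    loss = nn.functional.binary_cross_entropy(net(inputs), targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(net(inputs).round())  # approaches [[0], [1], [1], [0]]
```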

One thinks less of the Replicants, Blade Runner‘s lethally engineered superhumans, than of Number 5, the artificially intelligent robot star of Short Circuit (co-designed, incidentally, by Blade Runner‘s “visual futurist” Syd Mead), with his constant demands for “input.” When it came out in the mid-1980s, that goofy comedy looked like by far the more successful film, but over the intervening three decades Ridley Scott’s one-time bomb has become perhaps the most respected work of its kind. “The first ever film remade by a neural network had to be Blade Runner,” Broad told Vox, pointing, in his explanation of the project, to the movie’s prescient treatment of the theme “that the task of determining what is and isn’t human is becoming increasingly difficult, with the ever-increasing technological developments.”

Dick, as his generations of readers know, had deep concerns about the difference between the real and the unreal, and how human beings can ever tell one from the other. He tackled that issue again, from a very different angle, in his 1977 novel A Scanner Darkly. Richard Linklater turned that book into a movie almost thirty years later, and Broad fed it into his neural network as well. Though the film is just as thematically appropriate, its colorful rotoscoped animation posed more of a challenge, and “the results are less temporally coherent than the Blade Runner model.” But “on the other hand, the images are incredibly unusual and complex, once again producing video with a rich unpredictability.”

At the top of the post, you can watch Broad’s Blade Runner-trained neural network reconstruct Blade Runner‘s trailer, and below that his A Scanner Darkly-trained neural network reconstruct A Scanner Darkly‘s trailer. Curiosity demanded, of course, that Broad let a neural network trained to watch one film have a go at reconstructing the other, and just above we have the A Scanner Darkly-trained neural network’s reconstruction of Blade Runner. He’s also given Scott’s famous 1984-themed Super Bowl Apple ad and Godfrey Reggio’s Koyaanisqatsi the neural-network treatment. We read so often, these days, about artificial intelligence’s growing ability to out-think, out-work, and one day even out-create us. What on Earth, the Philip K. Dicks of our day must wonder, will the neural networks come up with when they can finally out-watch us?

Related Content:

Watch an Animated Version of Ridley Scott’s Blade Runner Made of 12,597 Watercolor Paintings

Philip K. Dick Previews Blade Runner: “The Impact of the Film is Going to be Overwhelming” (1981)

Ridley Scott Talks About Making Apple’s Landmark “1984” Commercial, Aired 30 Years Ago on Super Bowl Sunday

Watch Sunspring, the Sci-Fi Film Written with Artificial Intelligence, Starring Thomas Middleditch (Silicon Valley)

Artificial Intelligence Program Tries to Write a Beatles Song: Listen to “Daddy’s Car”

Two Artificial Intelligence Chatbots Talk to Each Other & Get Into a Deep Philosophical Conversation

Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on a book about Los Angeles, A Los Angeles Primer, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.




