Do you feel like humanity’s had a decent run? Are you ready for all of our works to be lost, like tears in rain? Because a programmer named Terence Broad created an AI, and then decided the best possible thing to do was to make it watch Blade Runner. Because that should start our relationship with our soon-to-be-overlords on the right foot. Why not show it Terminator next? Or The Matrix?
Seriously, though, this is a fascinating project, and the results are extraordinary. Broad has taught an AI how to watch movies, and, in a rudimentary way, interpret and reconstruct them. The resulting film is either a great moment in computing, or an eerie harbinger of humanity’s doom. Either way, this is a historic moment!
As Aja Romano, writing for Vox, explains:
Broad’s goal was to apply “deep learning” — a fundamental piece of artificial intelligence that uses algorithmic machine learning — to video; he wanted to discover what kinds of creations a rudimentary form of AI might be able to generate when it was “taught” to understand real video data.
Basically, Broad needed to train the AI to recognize a film. Using a "learned similarity metric," he introduced the encoder to data from Blade Runner, as well as to non-Blade Runner data, so the encoder could learn to tell them apart. Once the encoder recognized Blade Runner, it compressed each frame down into a 200-number representation, and then reconstructed the frame from that code. The result is a blurry interpretation of the film. As a follow-up, Broad introduced the encoder to A Scanner Darkly, because he thought it only appropriate to make sure the AI was well-versed in the works of Philip K. Dick:
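To make the encode-compress-reconstruct loop concrete, here is a minimal sketch in plain NumPy. Broad's actual system is a learned convolutional autoencoder trained on the film; in this toy version the encoder and decoder are just random linear maps (the frame size and weight matrices are placeholders I've invented for illustration), so only the shapes of the pipeline match the description above, with each frame squeezed through a 200-number code:

```python
import numpy as np

# Toy sketch of the pipeline described above: flatten a frame, compress it
# to a 200-number latent code, then reconstruct a (lossy) frame from it.
# NOTE: the weights here are random placeholders, not trained parameters,
# and the frame size is a hypothetical downscaled resolution.

rng = np.random.default_rng(0)

FRAME_H, FRAME_W = 64, 96          # hypothetical downscaled frame size
LATENT_DIM = 200                   # the "200-number representation"

n_pixels = FRAME_H * FRAME_W
W_enc = rng.normal(size=(LATENT_DIM, n_pixels)) / np.sqrt(n_pixels)
W_dec = rng.normal(size=(n_pixels, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def encode(frame):
    """Compress a grayscale frame into a 200-number latent code."""
    return W_enc @ frame.ravel()

def decode(code):
    """Reconstruct a (blurry) frame from the latent code."""
    return (W_dec @ code).reshape(FRAME_H, FRAME_W)

frame = rng.random((FRAME_H, FRAME_W))   # stand-in for one movie frame
code = encode(frame)
reconstruction = decode(code)

print(code.shape)            # (200,)
print(reconstruction.shape)  # (64, 96)
```

The blur in Broad's output comes from exactly this bottleneck: 200 numbers cannot hold every pixel, so the decoder can only produce the network's best guess at what the frame looked like.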
[T]here could not be a more apt film to explore these themes [of subjective rationality] with than Blade Runner (1982)… which was one of the first novels to explore the themes of artificial subjectivity, and which repeatedly depicts eyes, photographs and other symbols alluding to perception. The other film chosen to model for this project is A Scanner Darkly (2006), another adaption of a Philip K. Dick novel (2011). This story also explores themes of the nature of reality, and is particularly interesting for being reconstructed with a neural network as every frame of the film has already been reconstructed (hand traced over the original film) by an animator.
Here’s a side-by-side comparison between the original trailer for A Scanner Darkly and the autoencoded version:
And here’s the full, autoencoded Blade Runner:
As Broad told Vox: “In essence, you are seeing the film through the neural network. So [the reconstruction] is the system’s interpretation of the film (and the other films I put through the models), based on its limited representational ‘understanding.'”
This story has a twist ending: when Warner Brothers issued its usual DMCA takedown warning to Vimeo, asking the platform to pull all the uploads of Warner films, it included the AI’s recreation of Blade Runner. Of course, technically this isn’t Blade Runner, but an uncanny recreation of it, so we now live in a world where “Warner had just DMCA’d an artificial reconstruction of a film about artificial intelligence being indistinguishable from humans, because it couldn’t distinguish between the simulation and the real thing.”
After Vox contacted Warner and explained the project, the company rescinded the notice, so we’ll be able to watch the complete project (and, perhaps, our doom) as it unfolds! In the meantime, be sure to read more about the project over at Vox, and check out Broad’s posts on Medium!