Iconic Film from 1896 Restored with Artificial Intelligence: Watch an AI-Upscaled Version of the Lumière Brothers’ The Arrival of a Train at La Ciotat Station

Machine learning keeps, well, learning in leaps and bounds, and at Open Culture we have watched developments with a fascinated, sometimes wary eye. This latest advance checks off a lot of Open Culture boxes: traveling back in time through the power of film, homegrown ingenuity, and film history.

YouTuber Denis Shiryaev took the latest advances in AI tech and turned them on one of the earliest works of film: The Arrival of a Train at La Ciotat Station, shot by the Lumière Brothers in 1896. There are plenty of urban legends around this 50-second short: that it was the first-ever Lumière film (it wasn’t; they had made a selection of shorts before it), and that audiences were terrified, thinking the train would hit them (they were amazed, no doubt, but they weren’t that naive).

You might want to watch the original below before watching Shiryaev’s 4K-upscaled and AI-“smoothed” version at the top of the post, to get a sense of the marvel.

What we are seeing is not a traditional “restoration,” however. Instead, Shiryaev used commercial image-editing software called Gigapixel AI. (If you have the processing power, you can try it out.) The original film was not shot at 60 frames per second. Instead, neural networks look at the original frames and “fill in” the data in between, creating a more naturalistic effect. People on and off the train move like they do in real life. It looks like it was shot yesterday.
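To get a feel for what “filling in” frames means, here is a minimal sketch of frame interpolation using simple linear cross-fading between two real frames. This is only a toy baseline, not what the neural network actually does: the AI predicts plausible motion, while this just blends pixel values. The function name and frame counts are illustrative assumptions, not anything from Shiryaev’s pipeline.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_between=3):
    """Generate n_between in-between frames by linear pixel blending.

    A crude stand-in for neural interpolation: each synthetic frame is
    a weighted average of the two real frames on either side of it.
    """
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # fractional position between the two frames
        blended = (1 - t) * frame_a.astype(np.float64) + t * frame_b.astype(np.float64)
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Example: two tiny grayscale "frames", black fading to white
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 255, dtype=np.uint8)
mid = interpolate_frames(a, b, n_between=3)  # three synthetic in-betweens
```

Turning a 16-fps original into 60 fps this way would smear moving objects into ghosts; the neural approach instead estimates where things are moving and synthesizes sharp intermediate frames.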

Now, this isn’t perfect. There are a lot of artifacts, squooshy, morphing moments where the neural network can’t figure things out. But hey, this is just one guy on his computer. It’s an experiment. The computer code will get better.

Gigapixel AI was developed by Topaz Labs, originally to help photographers upscale their images by 600 percent without losing detail. It didn’t take long for people to apply it to video, but be warned: it can take hours of processing to render a couple of seconds of footage. Still, that hasn’t stopped people from experimenting, even with similar neural network programs:

Here’s a clip from Nirvana’s “Heart-Shaped Box” video upscaled to 4K with Gigapixel AI:

User AkN upscaled A-Bomb footage from the 1950s:

Some clips from Home Alone:

You get the idea. As with any technology, there are some horrific examples out there where it just does not work. But I have a feeling that Shiryaev’s first dive into film history is not going to be his, or the internet’s, last.

Related Content:

Dramatic Color Footage Shows a Bombed-Out Berlin a Month After Germany’s WWII Defeat (1945)

Pristine Footage Lets You Revisit Life in Paris in the 1890s: Watch Footage Shot by the Lumière Brothers

Immaculately Restored Film Lets You Revisit Life in New York City in 1911

Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.


Support Open Culture

We’re hoping to rely on our loyal readers rather than erratic ads. To support Open Culture’s educational mission, please consider making a donation. We accept PayPal, Venmo (@openculture), Patreon and Crypto! Please find all options here. We thank you!

Comments (1)
  • Nikolas James says:

    Use a satellite telescope as a reference for an earthbound device with rotating currents of warm air, then ai can increase the resolution of the image found through those currents by interpolation…use the atmosphere’s distortion to our advantage. Everyone can then see from earth like the Hubble. This process can be regarded as an analogy.


Open Culture was founded by Dan Colman.