
Neural networks have been used to upscale footage of the Hindenburg disaster to 1080p

The blow-up’s had a glow-up.

Screenshot: Neural Networks and Deep Learning / KnowTechie


Video editing has improved by leaps and bounds in recent years. Using different methods and systems, it’s possible to see famous moments in a whole new light. Neural networks are one of these tools, and with them, we can gain a fresh understanding of past events, or, as in today’s case, see disaster strike through a modern lens.

The Hindenburg was an incredible feat of engineering when it first flew in 1936. Measuring 245 meters (804 feet) long and powered by four diesel engines, it’s fair to say she was an absolute unit, and her tragic final flight is an extremely important moment in aviation history.

Video cameras captured her fiery 1937 demise at Lakehurst, New Jersey, and now YouTube channel Neural Networks and Deep Learning has upscaled the footage to 1080p.

Smoothing out the wrinkles

They did this by using Gigapixel AI, an almost magical program that uses deep learning technology to accurately recreate missing details in images. Trained on millions of photo pairs, the insanely clever network learns where detail is usually lost, then applies that knowledge to each individual frame. The result is enhanced detail, so the grainy footage now looks much clearer and far smoother.

This process is completely different from traditional photo editing programs, which usually take advantage of regular interpolation to upscale images. In some cases this is fine, but in others, the end result can just be bigger and blurrier. Think of it as looking at something under a microscope, rather than just squinting at it for a better view.
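To see why interpolation alone can’t add detail, here’s a minimal pure-Python sketch of bilinear upscaling, the kind of resampling traditional editors use. Every new pixel is just a blend of its existing neighbors, so the output gets bigger without gaining any information. (This is an illustrative toy, not the code Gigapixel AI or any editor actually runs.)

```python
def bilinear_upscale(img, scale):
    """Upscale a 2D list of pixel values by an integer factor
    using bilinear interpolation -- blending, never inventing, detail."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * scale):
        # Map the output coordinate back into the source image.
        src_y = min(y / scale, h - 1)
        y0 = int(src_y)
        y1 = min(y0 + 1, h - 1)
        fy = src_y - y0
        row = []
        for x in range(w * scale):
            src_x = min(x / scale, w - 1)
            x0 = int(src_x)
            x1 = min(x0 + 1, w - 1)
            fx = src_x - x0
            # Each new pixel is a weighted mix of its four neighbors.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bottom = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bottom * fy)
        out.append(row)
    return out

# A tiny 2x2 grayscale "frame", doubled to 4x4.
tiny = [[0, 100],
        [100, 200]]
big = bilinear_upscale(tiny, 2)
```

Note that no value in the upscaled image can exceed what was already in the original; a neural upscaler, by contrast, hallucinates plausible new detail learned from its training data.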

Bringing color back to the Hindenburg

Going one step further, the open-source program DeOldify was used to bring color to the black-and-white film of the Hindenburg. Combined, the two processes give the tragic footage a whole new dimension. It resonates in a completely different way to see details in the airship’s duralumin structure that weren’t visible before. Watching the hydrogen-filled airship become engulfed in orange and red flames makes something that happened before any of us were born feel far more real.

This isn’t the only time Neural Networks and Deep Learning has done something like this. The channel has been exploring the possibilities of AI with everything from Marilyn Monroe screen tests to the atom bomb test at SpongeBob’s home, Bikini Bottom. The whole channel is an incredible look at pushing the limits of technology by casting new light on moments from the past.

What do you think? Surprised that technology allows people to do things like this? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.


A gamer since I've been able to hold a controller, I can usually be found with a PlayStation, Xbox or Switch controller in my hand.
