Disney’s latest AI tool de-ages actors in seconds

Remember when making actors look older or younger in movies was a huge deal? The amount of postproduction work required to achieve realistic results used to be immense, but now researchers from Disney have revealed FRAN, a new artificial intelligence tool that can convincingly age or de-age an actor in a fraction of the time.

In an academic paper, Disney Research Studios explains that FRAN (which stands for face re-aging network) is a neural network that was trained using a large database containing pairs of randomly generated synthetic faces at varying ages, which bypasses the need to otherwise find thousands of images of real people at different (documented) ages that depict the same facial expression, pose, lighting, and background.

FRAN uses this information to come up with a prediction about which areas of a real person’s face would age and how and then overlays the new details — such as adding or erasing wrinkles and jowls — onto video footage. The result is what Disney Research Studios claims is “the first practical, fully-automatic and production-ready method for re-aging faces in video images.” Looking at video examples provided by Disney, the tech definitely blows Snapchat’s aging filter out of the water.
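The overlay step described above can be pictured as a simple compositing operation: the network predicts per-pixel color changes (the wrinkles and jowls to add or erase) along with a mask marking where those changes apply, and the result is blended onto the original frame. Below is a minimal NumPy sketch of that compositing idea; the function and array names are hypothetical illustrations, not Disney's actual implementation:

```python
import numpy as np

def apply_reaging(frame, delta, mask):
    """Blend predicted re-aging details into a video frame.

    frame: (H, W, 3) float array in [0, 1], the original footage
    delta: (H, W, 3) float array, predicted per-pixel color change
           (e.g. darkening where wrinkles are added)
    mask:  (H, W, 1) float array in [0, 1], where the edit applies
    """
    out = frame + mask * delta          # apply changes only inside the mask
    return np.clip(out, 0.0, 1.0)       # keep values in valid image range

# Toy example: darken a small "wrinkle" region of a flat gray face.
frame = np.full((4, 4, 3), 0.5)
delta = np.full((4, 4, 3), -0.2)        # predicted darkening everywhere
mask = np.zeros((4, 4, 1))
mask[1:3, 1:3] = 1.0                    # but only this region is edited

aged = apply_reaging(frame, delta, mask)
```

Because the edit is expressed as a localized change on top of the original footage rather than a wholesale regeneration of the face, the untouched regions of the frame pass through unaltered, which is part of what makes this kind of approach practical for production video.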

There are a few limitations, however, and this sort of research isn't unique. Disney noted in its research that FRAN may be unsuitable for significant alterations such as re-aging to and from very young ages and that the graying of scalp hair isn't reflected when aging up an actor, as this wasn't present in the dataset used to train the tool. Given that manual VFX work and even practical prosthetic makeup don't have these restrictions, FRAN is unlikely to replace many industry jobs for a good while.

Still, the results look just as good as, if not better than, real on-screen examples from just a few years back. After all, we all felt the uncanny valley looking at a de-aged Robert Downey Jr. in 2016's Captain America: Civil War, right? Right.

It’s little wonder that Disney has been working on automating visual effects given it’s one of the biggest names recreating or de-aging actors on the big screen. Characters in the Marvel Cinematic Universe such as Nick Fury (Samuel L. Jackson), Hank Pym (Michael Douglas), and Ego the Living Planet (Kurt Russell) have all been visually adjusted in recent years, in addition to Star Wars characters like Leia Organa (Carrie Fisher) and Wilhuff Tarkin (Peter Cushing).

This also isn't the first time Disney has trained an AI to alter someone's appearance in video footage; its research arm previously released a "photo-realistic" deepfake tool in 2020. Industrial Light & Magic (Disney's visual effects company) has also built systems already in production use that reduce postproduction VFX work, such as the giant 20-foot-tall LED video screens used on The Mandalorian.

Despite its potential benefits for filmmaking, it isn't clear whether Disney intends to make this technology available to the public. There's still certainly room for improvement, too, so it could be a while before we see this level of intricate visual effects work practically automated within the industry.