De-aging and Deepfake Technology: Future of Film-making


Characters in movies and TV shows follow certain rules of space and time. When the audience is shown a scene from the hero’s childhood, the director usually casts a child who looks like the hero. But what if an adult actor needs to appear young again? Let’s examine how screen heroes are “reverted” to their youthful looks and what challenges filmmakers face while shooting even a brief scene.

 

De-aging Technology and Its Limitations

Until technology advanced far enough, makeup artists handled even the most dramatic actor transformations. Makeup was one of the earliest techniques that allowed an actor’s apparent age to be changed, and the fact that movies were once shot on grainy film stock helped conceal cosmetic tricks: because the resolution of vintage films is significantly lower, the audience could not perceive the fine details of an actor’s face.

Makeup alone is no longer an option in modern digital, high-resolution film production. Viewers expect an authentic visual without any sacrifice in quality. This is where de-aging technology comes into play. In a nutshell, de-aging is a 3D effects technique that makes an actor look younger. To accomplish this, post-production studios edit the digital picture and add computer-generated imagery (CGI) overlays or touch-ups to the required sequences.

 

CGI Animations and VFX

When you watch actors who have been de-aged, you are almost always looking at CGI animation built on visual effects (VFX). Actors are filmed with small markers affixed to their faces so that the motion of 3D elements can be synchronized with the actor’s head movements. An invisible 3D model of the head is then created and matched to those movements. An editor places “patches” derived from the captured cloud of points on top of this model, which act as filters over the skin in the source footage.

These “patches” are often obtained by filming a younger stunt double who mimics the original actor’s movements while walking through the same scene in the same setting and lighting. An editor then performs a digital “facelift”: tightening the chin, sketching the cheekbones more sharply, and shrinking the actor’s ears and nose, which have kept growing throughout his life.

De-aging technology is still very expensive, and only major Hollywood studios can afford to use it. A computer model of a head costs roughly a million dollars, and each animated frame costs between $30,000 and $100,000 depending on its complexity. These technologies may become cheaper over time, but for now they remain prohibitively expensive for mainstream film production.

 

Examples of VFX De-aging

A Kurt Russell sequence from the second Guardians of the Galaxy film is currently regarded as the gold standard for de-aging an actor. It was perhaps the first major motion picture in which the rejuvenating effect on an actor was so expertly done that an unwary audience didn’t notice it at all. All of this was made possible by combining professional CGI with makeup.

Few people know that de-aging technology is often used in conjunction with a body double whose performance stands in for the de-aged version of the actor. Kurt Russell was a prime example. First, Russell, in makeup, played the part. Aaron Schwartz then mimicked his movements and facial expressions, and Russell’s young skin was digitally “applied” on top. Developing all of the CGI for the sequence took many months.


 

Deepfake and the Future of Film-Making

Deepfake is an image-synthesis technique based on artificial intelligence. It is used to merge and overlay existing photos and videos onto new images and videos. In most cases, such videos are created using generative adversarial networks (GANs). One part of the algorithm, the discriminator, learns from real images of a given subject; the other part, the generator, produces images, literally “competing” with the discriminator until the fakes can no longer be distinguished from the originals.
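The adversarial “competition” described above can be illustrated with a deliberately tiny sketch. This is not deepfake code: instead of images, the “real data” here are just numbers drawn from a Gaussian, the generator is a two-parameter affine map, and the discriminator is a logistic classifier. All names, sizes, and learning rates are illustrative assumptions; the point is only the alternating update loop in which the discriminator learns to tell real from fake while the generator learns to fool it.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real data": samples from N(4, 1). A real GAN would use images instead.
def real_batch(n):
    return rng.normal(4.0, 1.0, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: noise z -> a*z + b. Discriminator: x -> sigmoid(w*x + c).
g = {"a": np.array(1.0), "b": np.array(0.0)}
d = {"w": np.array(0.1), "c": np.array(0.0)}
lr = 0.05

for step in range(2000):
    z = rng.normal(size=(32, 1))
    fake = g["a"] * z + g["b"]
    real = real_batch(32)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    # (gradients of binary cross-entropy with respect to w and c).
    pr = sigmoid(d["w"] * real + d["c"])
    pf = sigmoid(d["w"] * fake + d["c"])
    d["w"] -= lr * (np.mean((pr - 1) * real) + np.mean(pf * fake))
    d["c"] -= lr * (np.mean(pr - 1) + np.mean(pf))

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    pf = sigmoid(d["w"] * fake + d["c"])
    dg = (pf - 1) * d["w"]          # gradient of -log D(fake) w.r.t. fake
    g["a"] -= lr * np.mean(dg * z)
    g["b"] -= lr * np.mean(dg)

# After training, generated samples should cluster near the real mean (4.0).
samples = g["a"] * rng.normal(size=(1000, 1)) + g["b"]
print(float(samples.mean()))
```

In a real deepfake pipeline the same two-player dynamic holds, but both networks are deep convolutional models and the generator outputs full video frames rather than scalars.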

In 2016, techniques were presented for detecting facial motion in one video and transferring it onto a target video, enabling fake facial expressions to be created in real time within an existing 2D video. There are several instances of this technology being used for good. Deepfakes have been used in museum settings: for example, to commemorate the 115th birthday of the great artist Salvador Dalí, the exhibition Dalí Lives was staged at the Salvador Dalí Museum in Florida.

The museum created a digital recreation of the artist that communicated with visitors, told them stories about his paintings and life, took selfies, and emailed the photos to visitors. Deepfake technology also opens up possibilities in education: AI-generated multimedia can bring historical figures into the classroom for a more engaging lecture.

Deepfake solutions tackle several real challenges for film production companies. Among them is the ability to generate material using the faces of renowned actors as they looked in their youth, or to recreate stars who have died. For example, an actor’s original appearance as a young person might be used for a role long after they have grown into adulthood.
