Jo Plaete
AI Technology & VFX Supervision
About Jo Plaete
Jo Plaete is an innovation leader and AI/VFX supervisor pushing the boundaries of visual effects and AI in the media & entertainment industry. With over a decade of experience on Hollywood blockbusters, Jo has helped create some of the most iconic and mind-bending visual effects in recent cinema history.
In his role as Chief Innovation Officer and Visual Effects Supervisor at Metaphysic.ai, Jo oversees the implementation of cutting-edge AI technologies that form the foundation of AI-driven visual effects for motion pictures and live experiences.
He most recently worked on generative AI implementations for directors such as Robert Zemeckis (HERE), George Miller (Furiosa: A Mad Max Saga), Fede Alvarez (Alien: Romulus), and with talent like Tom Hanks, Robin Wright, Eminem (Houdini), Drake (US Tour) and Cai Guo-Qiang (TED2024).
Jo joined Metaphysic in 2022 as Chief Innovation Officer, bringing nearly two decades of experience in VFX and post-production within the entertainment industry. In previous roles at Disney's Industrial Light & Magic and MPC, he supervised VFX for productions such as the ABBA: Voyage concert experience, Ready Player One, Star Wars: The Last Jedi, The Legend of Tarzan, Guardians of the Galaxy, and World War Z.
Jo continues to pioneer new techniques that blur the line between reality and digital artistry. His work not only enhances storytelling but also opens up new possibilities for creators in film, television and live entertainment. Jo has been a speaker at conferences such as SIGGRAPH, FMX, COGX, VIEW, RTES, BUMP and PIDS.
In his spare time, you can find Jo running (ultra)marathons or producing electronic music.
News
This is how Tom Hanks was so convincingly de-aged in Here
As Here moved into production, Metaphysic trained the initial neural network models to generate photoreal younger versions of the actors. Metaphysic VFX supervisor Jo Plaete and his team ran the workflow in real time during the shoot, providing a visual reference of what the performers would look like.
AI for Live Performance: Eminem at the VMAs
The process of bringing Slim Shady to life was complex and required careful planning, according to Jo Plaete, VFX Supervisor at Metaphysic. “The basic workflow involves several key stages, each tailored to address the unique challenges of the specific show,” Plaete explains.
Flemish-born Jo Plaete de-ages Hollywood actors with AI
A few years ago, the Flemish-born Jo Plaete brought the legendary pop band ABBA back to life; now he is giving top actors Tom Hanks and Robin Wright a digital rejuvenation treatment. ‘Fear of AI runs high in Hollywood. But for me, it is an instrument for making the films we couldn’t make before.’
A deep dive into the filmmaking of Here
With Kevin Baillie. The film has over 53 minutes of complete face replacement work, done primarily by Metaphysic and led by the company's VFX Supervisor, Jo Plaete. Metaphysic's proprietary process involves training a neural network model on a reference input, in this case footage and images of a younger Hanks, with artists refining the results until the model is ready for production.
What if A.I. Is Actually Good for Hollywood?
“You can’t expect James Cameron to prompt an ‘Avatar’ scene,” says Jo Plaete, Metaphysic’s chief innovation officer and the lead architect of the A.I. tools used in “Here.” “It’s just not going to work. Or with Bob Zemeckis or Steven Spielberg — if you’ve ever made a movie with one of these guys, you know that they will want to change every pixel if they can.”
CG Garage Podcast with Chris Nichols
Jo Plaete, a pioneer in digital humans and chief innovation officer at Metaphysic, joins the podcast to discuss his company’s groundbreaking work on the newly released film Here, now in theaters.
HERE
VFX/AI Supervisor
For HERE (2024), my teams and I created 60 minutes of AI-driven, photoreal de-aging of iconic actors like Tom Hanks and Robin Wright, preserving their full performances while bridging age gaps of more than 40 years. As Chief Innovation Officer at Metaphysic.ai and Visual Effects Supervisor for HERE, I led the design and development of our neural performance toolset, creating the workflows and pipelines that enabled the seamless de-aging and aging of actors like Tom Hanks, Robin Wright, Paul Bettany, and Kelly Reilly across multiple age ranges. Our AI-driven solution avoided the uncanny pitfalls of traditional methods such as prosthetics and CGI by generating photorealistic imagery while preserving the emotional depth and nuance of each performance. As the principal architect behind the technology, I guided the creative and technical teams to come together and deliver an unprecedented result: seamless synthetic renditions of the actors' performances.
FURIOSA: A Mad Max Saga
VFX/AI Supervisor
In Furiosa: A Mad Max Saga, directed by George Miller, my teams at Metaphysic and I played a pivotal role, leveraging our advanced AI-generated VFX technology to bring a fan-favorite character from the franchise, the Bullet Farmer, back to life. Originally portrayed by Richard Carter and now played by stand-in actor Lee Perry, the character was recreated with our Metaphysic neural performance toolset, trained on licensed footage from Mad Max: Fury Road. Perry's performance was meticulously transferred into Richard Carter's Bullet Farmer likeness and expression space. Despite the limited dataset, which made training the AI models demanding, our team achieved remarkable results. Meticulous compositing seamlessly integrated the outputs of our neural nets into the photography, further enhanced with AI-generated, identity-specific detailing. The result is a flawless portrayal of the Bullet Farmer in his supporting role, capturing the essence of the character just as fans remember him.
EMINEM: HOUDINI
VFX/AI Supervisor
For Eminem: Houdini (2024), my teams and I recreated Slim Shady's youthful look, blending AI-driven enhancements and compositing to seamlessly unite Eminem with his alter ego. The music video garnered millions of views, and we won the first-ever VMA awarded for a digital character created by artists using an AI-powered pipeline.
EMINEM: VMA
Live AI Supervisor
At the 2024 MTV Video Music Awards, audiences were treated to a groundbreaking fusion of technology and performance. Eminem, also known as Marshall Mathers, opened the show with a riveting performance of his latest release, “Houdini.” In a stunning twist, he brought back to life his iconic alter ego, Slim Shady, not through traditional makeup or pre-recorded visuals, but via live generative AI technology developed by Metaphysic. This innovative collaboration not only captivated viewers worldwide but also earned the team the VMA for Best Visual Effects in a Music Video.
ABBA: Voyage
Computer Graphics Supervisor: Digital ABBATARS
For the ABBA: Voyage concert experience, Industrial Light & Magic (ILM) was tasked with digitally time-traveling the iconic band's members Agnetha, Anni-Frid, Björn and Benny back to their prime. For the duration of this fully computer-generated concert, four continuous photoreal digital human facial performances had to be synthesized, driven by their present-day counterparts and young stand-in actors.
ALIEN: Romulus
VFX/AI Supervisor
As Visual Effects Supervisor for Alien: Romulus, I led the continued development of our neural performance toolset to bring Rook, a humanoid robot, to life by recreating the likeness of Ian Holm's iconic character, Ash, from 1979's Alien. Our AI-driven facial transfer technology preserved the emotional depth and nuance of the new actor's performance. The filmmakers captured the scenes using animatronics, but limitations in the physical performance required augmentation: we overlaid our AI-powered facial replacements to ensure seamless integration with the photography.
Cai Guo-Qiang: TED2024
Realtime AI Voice Project Supervisor
For TED 2024 in Vancouver, we were tasked with assisting Cai Guo-Qiang, a New York-based Chinese artist, in delivering his TED Talk. The goal was to translate his speech from Mandarin to English as he delivered it on stage, with the translation coming through the PA system in real time. Importantly, the translation maintained the tonality, timbre, cadence, and accent of Cai's own voice, giving the audience as authentic an experience as possible. A first on the TED stage!
Aladdin
Computer Graphics Supervisor
As Computer Graphics Supervisor on Aladdin, I led the creation of the Genie, a fully computer-generated character driven by Will Smith’s performance capture. This project marked the first deployment of Disney’s ANYMA facial performance capture technology at ILM, and I guided its implementation to ensure every nuance of Smith’s expressions translated seamlessly into the Genie’s whimsical and energetic persona. I oversaw the entire pipeline—from performance capture to final rendering—ensuring the character not only embodied Smith’s unique style but also aligned with the film’s magical tone.
Ready Player One
Computer Graphics Supervisor: Final Battle
As Computer Graphics Supervisor on Ready Player One, I oversaw the CG execution of the film’s epic final battle scene—a massive, high-stakes sequence featuring hundreds of iconic characters interacting within a fully computer-generated environment. This project marked the culmination of a new crowd system developed for ILM, specifically designed to manage the complexity and scale required for this work. The system facilitated seamless integration of vast background crowds with hero characters, ensuring a perfect balance of scale, detail, and narrative focus.
Guardians of the Galaxy Vol. 1
Final Act Space Battle R&D and Execution
As Lead Technical Director on Guardians of the Galaxy Vol. 1, I spearheaded the design and implementation of a high-scale space battle system for the film’s climactic final act. This sequence involved coordinating thousands of ships, including starfighters, massive cruisers, and enemy fleets, in a fast-paced, visually stunning battle above Xandar. The challenge was to create a dynamic and immersive battlefield, blending large-scale procedural simulations with artistically controlled key moments to maintain narrative impact and visual coherence.
Bond Spectre
Supervisor: Opening Scene Crowd Simulation
As my first project after joining the new ILM London studio, I set up and supervised a crowd simulation department for the studio, assessing the existing tooling. I also built a new crowd simulation solution that was adopted by the studio worldwide.
World War Z
Technical Director: Complex Crowd Simulation
As a Senior Technical Director on World War Z, I led the R&D and shot production for some of the most intricate and complex crowd simulations of the last decade. The project involved designing and executing large-scale simulations of thousands of zombies, pushing the boundaries of what crowd simulation technology could achieve at the time. Our goal was to create vast swarms of zombies that moved with a fluid, organic quality—flowing like liquid through urban environments, cascading over walls, and building human pyramids to breach defenses.
The Lone Ranger
American Indian Attack Sequence
For the Comanche vs. Cavalry sequence in Disney's The Lone Ranger, my team at MPC and I were tasked with creating complex horse-and-rider simulations for scenes in which a massive crowd of stampeding Comanche warriors is mowed down by the cavalry's Gatling guns. This sequence required an intricate blend of dynamic crowd behavior and physically accurate ragdoll simulations, ensuring both the riders and horses reacted believably to the chaos on the battlefield.