We all know she put a curse on Princess Aurora, but do we know why? Maleficent comes to us 55 years after the Disney animated classic Sleeping Beauty, and it explores the origins of the Mistress of All Evil and why she came to put a curse on a little blonde baby.
Making this film about one of the most iconic storybook villains involved not just a cast of talented actors and visual effects wizards from Digital Domain and Moving Picture Company (MPC), but also USC Viterbi Professor Paul Debevec of the USC Institute for Creative Technologies (ICT). Debevec and colleagues Tim Hawkins, John Monos and Mark Sagar have been recognized with Scientific & Engineering Awards from the Academy of Motion Picture Arts and Sciences for the digital actor technology used on Spider-Man 2, Superman Returns, The Curious Case of Benjamin Button and Hancock, and Debevec’s lab scanned numerous actors to help Weta Digital create the Na’vi characters featured in Avatar. For Maleficent, he and his team used the Light Stage housed at ICT to take high-resolution scans of the cast, helping create the eponymous character as well as the pixies Knotgrass, Thistlewit and Flittle.
For the computer-generated pixies, Debevec and his team scanned actresses Lesley Manville, Imelda Staunton and Juno Temple on the Light Stage with their hair pulled back in a headband and little to no makeup so that the 3-meter-wide sphere of 4,000 computer-controlled LEDs and seven cameras could capture every detail of their faces—every pore, freckle and wrinkle.
But the Light Stage data wasn’t used just for facial skin detail; it also provided the raw material for blood flow maps and light fields. When you squeeze your eyes tightly shut, purse your lips and wrinkle your nose for a few seconds, blood is pushed out of the area around your lips and pools in your cheeks, making them flush. The Light Stage captured these blood flow effects for each actress, adding yet another level of realism to their computer-generated characters. To record the data, the actresses held various extreme expressions in the Light Stage for approximately five seconds, then relaxed their faces.
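One way to picture how such blood flow data might be used downstream is as a blend between two captured skin-color maps: one of the relaxed face and one of the held extreme expression. The sketch below is purely illustrative — the map names, shapes and blend weight are assumptions, not the studios' actual pipeline — but it shows the basic idea of driving per-pixel skin color from an animated expression:

```python
import numpy as np

# Hypothetical sketch: two per-pixel skin-color maps captured in the
# Light Stage, one with the face relaxed and one mid-expression
# (flushed cheeks, blanched lips). Tiny 2x2 RGB maps stand in for
# real high-resolution textures.
relaxed_map = np.full((2, 2, 3), 0.6)   # baseline skin color
flushed_map = np.full((2, 2, 3), 0.8)   # color under the held expression

def blood_flow_blend(weight: float) -> np.ndarray:
    """Linear blend: 0.0 = fully relaxed, 1.0 = fully flushed.

    A renderer could drive `weight` per frame from how strongly the
    animated character is holding an expression.
    """
    w = np.clip(weight, 0.0, 1.0)
    return (1.0 - w) * relaxed_map + w * flushed_map

frame = blood_flow_blend(0.5)   # halfway between relaxed and flushed
assert frame.shape == (2, 2, 3)
```

In a production renderer the blend would be more sophisticated (per-region weights, nonlinear response), but the captured relaxed/flushed pair is what makes any such blend possible.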
To create light fields for each character, the actresses were filmed while the LEDs flashed one at a time across their faces, recording the effect of light striking the face from different angles. That data can then be used to relight the digital character so convincingly that she appears to be really in the room, with the light falling on her face naturally.
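The reason one-light-at-a-time capture enables this is that light is additive: an image of the face under any new lighting environment can be approximated as a weighted sum of the single-LED images. The following minimal sketch assumes a tiny stack of such images (the variable names and sizes are illustrative, not the real pipeline's):

```python
import numpy as np

# Illustrative "one-light-at-a-time" relighting sketch.
# The real Light Stage has ~4,000 LEDs; we use 4 tiny images here.
num_leds = 4
height, width = 2, 2

# Simulated capture: one RGB image per LED, shape (num_leds, H, W, 3)
rng = np.random.default_rng(0)
olat_images = rng.random((num_leds, height, width, 3))

# Intensity of each LED's direction in the target lighting environment
env_weights = np.array([0.5, 0.0, 1.0, 0.25])

# Because light transport is linear, the relit face is just
# sum_i weight_i * image_i — one multiply-add per captured light.
relit = np.tensordot(env_weights, olat_images, axes=1)

assert relit.shape == (height, width, 3)
```

Swapping in the weights of a new environment (say, sampled from a photograph of the set's lighting) relights the same captured face without any new photography — which is what lets the digital character sit believably in any shot.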
And then, of course, there is Maleficent herself. Played by Angelina Jolie, Maleficent isn’t a computer-generated character like the flower pixies. Jolie did the vast majority of her own stunts, flying through the air on rigs in full costume. But there were still points in the film where the team from visual effects company Digital Domain needed what they call a digital double, or digidouble, for the actress.
The scans Debevec did on the Light Stage at ICT for Jolie were notably different from the scans for the pixies. Jolie was in full costume, makeup and prosthetics.
“Working with Angelina Jolie on Maleficent was exciting; she’s one of the most prominent actors we’ve worked with, alongside Tom Cruise, Will Smith and Charlize Theron,” Debevec said. “When it was time for Angelina to come to the lab where we have the Light Stage, she looked so impressive that people were speechless. She walked in the room, past eight or 10 people frozen in their tracks, directly up to me, without anyone taking action to make introductions. ‘So, you must be Angelina?’ I managed, looking into the green eyes of one of the best-known faces on earth.”
The team ushered Jolie to the stage and closed the door to seal her into the geodesic sphere of LEDs. “Watching the light play off of an actor’s face in the Light Stage is always an interesting visual, and with Angelina Jolie in makeup as Maleficent, it was a remarkable sight,” Debevec said.
The visual effects team at Digital Domain used the Light Stage data to create a realistic digital double for stunts that were physically impossible or too dangerous to attempt.
“What we’re trying to do here is bring the essence of the actor’s performance to the screen,” said Darren Hendler, the digital effects supervisor at Digital Domain. “Not to invent, not to create something new, but to re-create the actor’s performance capturing all those subtleties and nuances.”
Hendler and his team think of these new technological capabilities as digital prosthetics, not so much as a tool to create characters from nothing. At every step in the production process—animating, rendering and touching up the characters and special effects—it all goes back to the performance of the cast.
“We’re getting to the stage where we can mimic and create 100 percent photo-real versions of the actor,” said Hendler, “but without that actor’s actual performance, body, face, expressions, it’s nothing.”