
AI gives Thanos a soul in 'Avengers: Infinity War'

Marvel and Digital Domain are banking more on machine learning.

Spoilers ahead for Avengers: Infinity War.

Thanos isn't your usual Marvel nemesis. Then again, even after 19 films in Disney's superhero universe, it's not as if he's had much in the way of strong competition. Aside from the puckish Loki and tragic Killmonger, most Marvel villains have been pretty forgettable. Now, after years of buildup (we first caught a glimpse of Thanos in 2012's The Avengers), he finally took center stage in this summer's Avengers: Infinity War, which is now available on Blu-ray and digital.

But what's most intriguing about Thanos isn't that he wants to wipe out half of life across the universe — instead, it's that he's a big purple alien who feels genuine emotion. He cries when he's forced to sacrifice Gamora, his adopted daughter. He feels pain and anguish. But like many memorable bad guys, he believes it's all for the greater good.

Sharp-eyed viewers will notice Thanos looks very different in Infinity War than he did in the Avengers post-credits scene. That's not just due to advances in CG technology. "We all came to the conclusion that the performance would actually come through a little bit better if we introduced a little bit more of some of the recognizable features of Josh Brolin," said Kelly Port from the VFX company Digital Domain, one of many effects firms working on the film.

Digital Domain also used a piece of custom machine learning software called Masquerade to make the motion capture performance seem more natural. The process starts by placing 100 to 150 tracking dots on Brolin's face, which are captured by two vertically oriented HD cameras. The result isn't meant to be a high-resolution scan; it's a fairly low-resolution mesh. That's fed into a machine learning algorithm trained on a library of high-res face scans covering a wide variety of expressions.
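Digital Domain hasn't published Masquerade's internals, but the core idea Port describes, learning a mapping from sparse tracking dots to a dense face shape using a library of high-res scans, can be sketched in a few lines. Everything below (the dimensions, the ridge-regression model, the function names) is an illustrative assumption, not the studio's actual pipeline:

```python
import numpy as np

# Illustrative sizes only (assumptions, not Digital Domain's numbers):
# ~150 tracking dots -> 450 input values; a high-res scan -> ~30k vertices.
N_DOTS, N_VERTS = 150, 30_000

def fit_low_to_high(lib_dots, lib_scans, alpha=1e-3):
    """Learn a linear map from flattened dot positions to flattened
    high-res vertex positions via ridge regression over the scan library.

    lib_dots:  (num_scans, N_DOTS * 3)
    lib_scans: (num_scans, N_VERTS * 3)
    """
    X, Y = lib_dots, lib_scans
    # Closed-form ridge solution: W = (X^T X + alpha*I)^-1 X^T Y
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

def solve_frame(W, dots):
    """Predict a dense face shape for one captured frame of dots."""
    return dots.reshape(1, -1) @ W
```

A production system would be nonlinear and far more sophisticated, but the shape of the problem is the same: sparse input in, plausible dense face out.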

"[Masquerade] takes that low resolution mesh and it figures out what high resolution shape face would be the best solution for that," Port said. "Then it gives you a solution, and then we would look at that result. If it didn't feel quite right, we would make a little tweak in modeling to adjust ... let's say this has more lip compression or the brows need to be higher, we feed that back into the system and it would learn from that via a machine learning algorithm."

The next time Digital Domain puts a low-res mesh through its system, it should produce a better result than before. But that's just step one. Next up is a process called direct drive, which takes that high-resolution facial performance and maps it onto Thanos's character model.

"And then we kind of go through a similar process in that we look at them side by side," Port said. "We look at Josh's performance and it's like, 'He's more surprised,' or 'He's more sad here," and there's something in Josh's performance that's not coming across in Thanos that we'd make a little adjustment. We'd feed that back through the system, and then hopefully the next time that same input gets fed into the direct drive system, the result would be more accurate or more what we desired."

Without a machine learning system like Masquerade, VFX artists have to manually adjust facial performances through animation, a more time-consuming process. Still, there are other modern techniques, like Weta Digital's Oscar-winning FACETS, which was used for facial tracking on Avatar and the recent Planet of the Apes trilogy.

"We knew going in Thanos has to work, or the movie doesn't work," said Dan Deleeuw, Marvel Studio's VFX supervisor. So from the start, his team was focused on understanding him as a character. Based on the glimpses of him we've seen before in Marvel films, they knew he'd be a large and angry character — a giant who's literally railing against the universe. But Marvel also wanted to capture subtle aspects of Josh Brolin's performance, especially his face.

The first day on set, directors Joe and Anthony Russo wasted no time getting Brolin in a motion capture helmet and suit to test out some of his lines. But they also went a step further. "Instead of cutting when they stop doing the lines, we just kept the motion capture going," Deleeuw said. "We kept [recording] when he was just experimenting with the different lines and how he would approach Thanos."

Using those off-the-cuff line takes, Marvel Studios was able to capture nuances that Deleeuw didn't originally plan for. "Just being able to read almost imperceptible movements in his face... movements in his eyes and his cheeks, and then you know later on to show his frustration or sadness with Gamora, or his anger with Tony... just really bring a character like that to the screen, I think was one of the biggest challenges," he said.

Thanos (Josh Brolin) in Marvel Studios' Avengers: Infinity War. Photo: Film Frame © Marvel Studios 2018

"Doug Roble, the guy that's working on that [Digital Domain] software said something along the lines of, 'If you're not using machine learning in your software, you're doing it wrong,'" Deleeuw said, recounting a recent visit to the VFX company. Looking ahead, the technology will be used for more than just faces -- it could help with things like water simulations. Eventually, you can expect machine learning to have a role just about everywhere when it comes to visual effects.