Already deep into awards season, BEHIND THE LENS turns its attention to the artisans and craftsmen “behind the lens and below the line,” each an integral part of the collaborative effort of making a film over the past year. One would be hard-pressed to determine who among them will win awards (we’ll leave that to the Guilds and the Academy), but one thing is certain – each is award-worthy in his or her own right.
First up is WETA Digital’s DAN LEMMON, VFX Supervisor on WAR FOR THE PLANET OF THE APES. Technology, craftsmanship, and storytelling are so intertwined that it’s virtually impossible to distinguish between live action and the work of Lemmon and his team at WETA. From doubling the number of “apes” in WAR, to technological advancements that now allow for the sensory realism of snow and rain on different furs, to real-time cinematographic lighting, to motion capture, to the empathy realized through the emotionality of the apes in the eyes of the real-life actors portraying them, there is an unprecedented depth to WAR that is due in large part to Lemmon and the magicians of WETA.
Giving detailed insight into the VFX process on the APES trilogy and more particularly, WAR, film critic debbie elias recently spoke with DAN LEMMON in this exclusive interview. . .
I can’t believe what has transpired, just technologically, over the course of the past three films – RISE, DAWN and WAR – and to such a degree that we can see the visual differences in the motion capture and in the animation as the films have progressed.
Yeah. It’s something that we’ve worked really hard to push the envelope on. At the end of every film, you’ve got a list of things that you’ve learned. You discover things, and sometimes you’re able to make changes as you go, and sometimes it’s just too late to do all the things you want to do. The march of technology takes a lot of effort and time as well, so it’s been a really great opportunity to return to these characters that we love and be able to make them better and better.
I’m curious, was there any one thing in particular, or a sequence of things, technologically, that has occurred since you did DAWN that allowed you, now with WAR, to double the number of new characters? Anybody that knows anything about motion capture, VFX, digital – you’re dealing with fur. Fur has been the bane of existence for Pixar and for everybody else, and it’s only been, I’d say, probably the past seven or eight years where we’ve really seen a leap in technology that allows for the individual fiber texture, so to speak.
Yeah, that’s definitely true. That’s something that we’ve been actively pushing on since “King Kong”, really. I mean, when we made “King Kong”, we wrote a completely new fur system from scratch, and it’s been a sort of continuous rewriting of that system and that software ever since. Every time we scrap the code and start over, we’ve learned so many things and we’re tackling it in better and better ways. But there’s another thing that happens too, and that’s that the artists doing the work have been getting better and better as well. Our observation of nature, and picking out the details that are important to making things look real, has gotten more sophisticated, as have the skills of the artists as they’ve gone from project to project. I mean, there are guys here that have been basically doing nothing but grooming fur for 10 years, and they’ve just gotten really good at it. At the same time, they’ve been able to make suggestions to the people writing the software about different ways to approach the problem and what they wish they could do.
The other thing that’s happened is that each film has its own unique challenges. On WAR, one of the big challenges was snow. We had to let snow fall and land on the fur; we had to let the apes roll through the snow and pick up snow onto their fur. The clumping and the wetness of the fur had to change as the snow would adhere, then melt a little bit and kind of turn into ice. These are all things that we hadn’t done before, and that’s one of the things that’s really exciting about the job – being able to take all the tools you’ve developed over the years and then add to them as new stories present problems they haven’t yet solved.
And all of that comes into play with this particular film, where you’ve pretty much doubled the number of new characters introduced in WAR compared to what we saw between RISE and DAWN. And each one of those new characters is so distinctive. Does it ever reach a point where you really want to do honor to Mother Nature, but you say, “Oh God, we don’t have any more ways to make this ape look different from the other one”?
You’re right. I mean, that’s always a challenge with this franchise – trying to figure out ways to make each ape distinct and unique so that the audience, who are humans, obviously, are able to pick out the differences and keep track of the characters. You know, it’s a critical film storytelling thing where you have to be able to tell the difference between the characters, or else people start to get lost in the story.
And you see it even with human actors: two actors that look too much like one another, and you don’t spend enough time with them to understand the differences, and it can be quite confusing in a movie. So casting is really important. Casting’s obviously important for real actors, but it’s also important for digital characters, making sure that you’ve got a cast that’s visually distinct and unique. That’s something that Matt Reeves, the director, was certainly tuned into, and we were as well.
But there’s another thing as well, which is one of the things we discovered as we made the first two films and certainly into the third: you really can’t improve on nature. The more we deviated from real chimpanzees, the more we tried to humanize the apes or do different things to adjust their design, the more we’d sort of lose our way, and the design and character of the characters would start to suffer a little bit. So we tried to adhere as closely as we could to real chimpanzees and orangutans and gorillas, pick out all those little fine details that are unique and specific to those species, and introduce those into our digital characters.
Now, there’s one exception to that, and that is that these characters also have to be able to perform in a way that a human audience understands their emotional arcs and instantly gets the facial expressions they’re making. And so that is the area where we would cheat a little bit. We would introduce a little bit of the human actors onto the digital apes in areas like the eyes, in particular the folds above the eyes and the eyebrows and around the muzzle area, trying to get more human signature wrinkles and folds onto the digital apes.
Something that you never cheat on, and it’s a testament to the technology and to the skill of you and your team, is the emotional complexity that still comes through in each one of the performances with each of the characters and that’s through the eyes. You never cheat on the eyes.
Yeah, we spend a lot of time trying to get the eyes to look just right. I will say, we do take a little bit of liberty with them, in that most apes have very dark scleras. The area that we call the white of the eye in humans actually tends to be more of a coffee color, or even black sometimes, on apes, and that can present problems when we’re trying to make facial expressions and trying to see exactly where the character’s looking. So on not all, but many of our characters, we’ve lightened their scleras up a little bit in order to be able to tell a little bit more what they’re thinking, where they’re looking and how squinted down their eyes are, that sort of thing.
Before I get into collaboration, I’ve got to ask you – Talk to me about “ray tracing”. I know that’s relatively new.
Yeah, it’s interesting. Ray tracing is a technology that’s been in computer graphics since the ’70s, but it’s something that has been inordinately expensive in terms of just processor time and also in terms of the amount of memory that a machine needs to have in order to ray trace efficiently. It’s only in the last, I’d say probably six years or so that computers have gotten fast enough and have been able to hold enough memory to be able to ray trace big scenes like the ones you see in WAR efficiently.
And the ability to go down that ray tracing route has actually made a phenomenal difference in the realism of the characters. We explored a few different ray tracers. RenderMan introduced ray tracing, this probably would have been about eight or nine years ago, and it’s something that we looked at, but it was not as robust as we needed it to be, so at a certain point we made a decision to write our own rendering software. We road-tested it a little bit on DAWN, but really only on distant shots with lots of characters far away.
It was promising, but it wasn’t yet developed to the point where we could use it on hero, close-up characters. When we made “Jungle Book”, with the King Louie sequence with all the apes and monkeys, that was a scene where we were really able to get ape characters, heavily furred characters, into a complex environment and ray trace it all, and really prove that the software and the hardware were up to the task. That’s allowed us to spend a lot more of our time doing creative work in our lighting and our shading and the processes that we use to make the pictures, rather than technical work.
In the past, we’d had to do a lot of massaging and cheating, frankly, in order to simulate the way that light would transport through a scene. But now, with the ray tracer, we can actually just let the physics do its thing and the results we get are much more realistic.
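For readers curious what “letting the physics do its thing” looks like in practice, here is a minimal, purely illustrative sketch (not WETA’s actual proprietary renderer) of the core ray tracing idea: a ray is fired into the scene, tested against geometry (here, a single sphere), and the brightness at the hit point follows the cosine between the surface normal and the light direction – the basic Lambertian shading model that physically based renderers build on.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance of a ray with a sphere, or None.
    Assumes `direction` is a normalized 3-vector."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant; a == 1 for a unit direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Lambertian shading at the first hit point: brightness is the
    cosine between the surface normal and the direction to the light."""
    t = ray_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: no geometry hit
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

Firing a ray straight at a sphere lit head-on, `shade((0, 0, 0), (0, 0, 1), (0, 0, 5), 1, (0, 0, -1))`, returns full brightness, 1.0. A production renderer repeats this kind of intersection and shading query billions of times per frame, against millions of fur fibers rather than one sphere, which is why the processor speed and memory Lemmon describes were the bottleneck for so long.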
So, I’m curious, you’ve got 1,450 shots in this film, or thereabouts. What is the actual process that you as VFX supervisor and your team go through for creating the apes in terms of the cinematographic elements of lighting and framing? Is that collaborative with the cinematographer? Is that something that you work on with Bill [Hoy] through the editing process? Take me through that.
So we really follow the lead of the cinematographer. When we’re shooting the scene on set, he and the director will talk through what their creative visual goals are for the scene, and from shot to shot they’ll adjust the lighting, and we take very precise measurements of what the lighting is physically doing on the set at that moment. And then when we get back into our world, the post-production world, we’ll call up those measurements of the scene lighting and reconstruct what was on set, digitally.
That doesn’t mean that we don’t change a few things. One of the issues with the characters is that our apes tend to be darker colored than the humans playing them, so sometimes we need to increase the light that hits them a little bit. But those measurements give us a really great starting point, and we’ve got an excellent visual reference for the intent of the cinematographer, Michael Seresin, and the director, Matt Reeves, right in front of us, in the lighting that’s on Andy Serkis, or on Terry Notary, or any of the other actors playing the apes. That tends to be our guide, and the same way we stick very closely to the performance of the actors, we try very hard to stick closely to the lighting of the cinematographer.
What is the process at hand? How does that work most effectively between you and editor Bill Hoy? Is Bill sitting there with Matt, and they’re editing stuff, but something just isn’t quite right and it’s, “Okay, we need a little more here, we need to tweak something there”? Do they send that to you, or is it more often that you see something and kick it back to them?
Usually the way the process works is Matt [Reeves] and Bill [Hoy] or Stan [Salfas], his other editor, depending on what scene they’re working on, will sit down and cut together a version of the movie that’s dramatically accurate. Everything is basically as it should appear in the final movie, except that there are no apes. There are only the actors wearing their funny gray suits, and I think, to someone who’s only seen the finished movie, it’s strange to see what we call ‘the template’, the assembly of the actors cut together into a scene.
The first couple of shots, you’re like, “Oh, that’s kind of funny, I see Andy Serkis wearing his gray suit, and I have to imagine that he’s going to be a monkey.” But it’s amazing, within about 30 seconds, you get totally sucked into it and your suspension of disbelief kicks in and you just accept that Andy Serkis is Caesar, he’s playing a character and dramatically you can watch the scene and it totally will work. You’re able to make decisions beat for beat what should be happening, when to cut, when the music should come up and then once Matt and the editors have done that, they’ll hand that over to us in what we call a turnover.
At the turnover, basically we’ve got a map to follow. We’ve got the scene all cut together, the shot lengths, everything edited together as it should be, with the performances chosen, and then our job is simply to replace the actors with apes and to match, dramatically, beat for beat, frame for frame, each emotional beat that the actors are hitting in that collection of shots in the sequence.
Typically what will happen is I’ll work on the apes with my team, with Dan Barrett, the animation supervisor here, and with the various visual effects supervisors that are doing the lighting and the compositing and putting the rest of the shot together. We work on it for quite a while in order to get the acting of our apes to match the acting of the actors, and the lighting of our apes to match the lighting of the scene, and we show it to Matt a few times along the way; the editors will also be part of that review process.
For example, we’ll show them animation blocking first, which is a rough version of the animation where the beats are mostly there but it typically wouldn’t have any facial expressions yet. Then we’ll show them a version that’s the complete animation, which has facial animation but is unlit, on a sort of video game version of the characters. Once they’ve approved that, we’ll go into the lighting, rendering and compositing part of the process, which is where we make everything look visually completely real, put it back together, and then send it through again to review. And often, we’ll go a couple of rounds at each stage in order to make sure that we’ve hit all the right beats.
Is there ever a time where you all collectively, finally say “Okay, no that’s it, that’s it. We’re done.” Even though in your hearts you know, “Well, we could tweak this a little more, we could do this.”?
Yeah, I think anytime you’re making a big movie like this, there’s a certain amount of pragmatism that has to happen, where you choose your battles and you make sure that the most important things are 100% correct. The least important things, you don’t want them to bump, you don’t want them to pull people out of the scene, but you also don’t want to waste your precious time and energy chasing things that aren’t actually that important. So it’s a constant judgment call about what is acceptable and good enough, and what has to be flawless.
With virtual reality techniques popping up everywhere, has virtual reality come into play at all with the visual effects in APES?
As part of publicity campaigns and as sort of the stewards of the assets of the characters, we’ll sometimes get involved in creating virtual reality pieces for different shows. We did one for “Jungle Book” that was pretty cool and we’ve done “Pete’s Dragon” and several others. We’ve looked at a few things for APES. I don’t think anything’s actually been released yet and I don’t know, Fox may or may not choose to pursue something in that direction, but it’s definitely an arena that we’ve been playing in and it’s something that’s exciting to be able to take characters that you’ve put so much energy into in making the movie and be able to experience them in a different medium.
With all the new characters that were introduced in WAR, is it more fun for you to have an established character like Caesar and age him up as the storyline progresses through the films, or to start with brand new, fresh characters? Do you have a preference?
They’re both great for different reasons. There’s something great about having an established character and being able to continue to develop and evolve and change things on them and with them, and it’s fascinating to watch what the actors do with their characters as well. You look at the way Andy played Caesar in DAWN compared to RISE, and in WAR compared to DAWN; that evolution of the actor in their authorship of the character is amazing. So that’s a really cool thing to be a part of.
But it’s great to make new characters, too. I mean, many of us got into the business because we were so enamored with the creature and character films of the era when we grew up, and it’s just a fantastic thing to get to play God a little bit and bring these characters to life. There’s a magic moment where you’ve done all this research and design and development, you’ve put all this effort into creating rigs and puppets and animating, and at a certain point the animation starts to work, you start to get the lighting together, you throw a composite together, and suddenly a character that was an idea has come to life. That’s an amazing moment, and it never gets old.
You’ve been on this digital journey for quite some time. With WETA Digital, you did VFX on “Man of Steel” and “Avatar”, but really started getting into the whole MoCap thing with the VFX probably around the time of “King Kong”. So it has to be really exciting for you to have come through this whole journey.
Yeah. Yeah, it is. And I think one of the things that was great was that in this franchise, we did something that hadn’t really been done before. We took a set of tools that were great for capturing the performances of actors, but that were limited to being used in kind of specialized, dedicated spaces, and we took those tools out onto location, onto the live-action film set. That was a pretty incredible thing, because it was bleeding edge and there were a lot of things that went wrong along the way, but we were able to get it to work.
Being able to offer that tool to filmmakers and to actors, to be able to say, “We can capture your performance wherever you decide to do it, whether it’s on a stage, on location or back in a performance capture volume. We can record what you do and then put that into the movie exactly as you’ve done it, but on a character that looks nothing like you.” I think that’s pretty amazing, and it’s been a real honor to be part of that process.
Before I let you go, Dan, I quickly want to ask you: what did you personally take away from the experience of making this film that you can now carry forward into your next projects?
Oh jeez, that’s such a good question. There are so many things. In the process of making a movie like this, you’re learning every day; you’re trying things out and failing and fixing things, and you’re discovering and having epiphanies. It’s hard to point to a single thing, but, you know, I guess … And it’s true of really all the films, but every day I come to work, I’m just amazed at the caliber of the people that I work with, and one thing that really hit home on this movie was how dependent I am, and how dependent the movie is, on the raw talent and dedication of the people that work here. I just feel incredibly lucky to be able to work with a team that comes from literally every corner of the world and has gathered in this tiny island country at the bottom end of the planet to make movies like this. It’s a pretty special thing.
by debbie elias, interview 10/19/2017