Warner Bros.’ latest sci-fi action/adventure film, Jupiter Ascending (starring Channing Tatum and Mila Kunis), from Andy and Lana Wachowski, tells the futuristic story of Jupiter, a young, destitute caretaker who goes on an unexpected journey to a world outside our universe. Accompanied by Caine, a genetically engineered soldier, she encounters a tyrannical planetary ruler in need of a new heir, whom she helps stop.
With Framestore’s London and Montreal teams partnering to deliver more than 500 final VFX shots for the film, lead motion capture TD Gary Marshall, who also served as realtime operator for on-set compositing (NCAM), talks with Post in this online exclusive about how Vicon’s Blade and Cara motion capture systems, along with Autodesk MotionBuilder, played significant roles in helping the studio pull off a number of critical VFX shots.
Can you describe the type of VFX you completed for Jupiter Ascending?
“The VFX was split between Framestore's London and Montreal teams, where the work varied from hero characters — the Sargorns and Keepers — to the environments of Balem’s boardroom and lab, the armory, and the Titus Clipper dock, as well as digital doubles, space ships, planets, and all manner of explosions, fireballs, shattering glass and liquid-like nano-technology. It was a huge undertaking for our VFX teams.”
How many VFX shots did you complete for the film?
“Over 500 finals were delivered for the film.”
Did you work with any other VFX houses or were you the sole VFX contributor?
“We worked closely with Double Negative, who were responsible for the Chicago environment chase scene, among others, as well as the Halon and The Third Floor previs teams. We also worked with the NVizible guys and their on-set compositing tools (NCAM).”
Can you go into some detail on a few of your key VFX scenes?
“From a performance capture perspective, a key scene was the clinic, where Jupiter (Kunis) is under sedation; unbeknownst to her, the medical staff are alien creatures (Keepers) intent on killing her. Just as they are about to carry out the execution, Caine (Tatum) bursts into the clinic firing, killing all but one creature, who escapes. Our work here ranged from CG takeovers that transitioned the live-action actors into fully-CG alien characters, to digital set extension, to more traditional paint and wire removal. Tatum was suspended on wire rigs as he skated into the set and up a physical ramp, simulating his character's futuristic rocket boots, which were later added in CG.
“In another sequence, an impressive fight takes place between Caine and one of the Sargorns, Greeghan, in Balem’s boardroom, a huge digital environment built by our teams. Below the cavernous, gilded hall of the boardroom is another Framestore environment: a gruesome DNA laboratory revealed through the floor, which can be made transparent in an instant. The fight that rages through these environments is a combination of practical and visual effects, transitioning seamlessly between real-life stunts and fully-CG shots. Again, as in the clinic scene, ramps were built in the partial Balem’s Lab set, which allowed Tatum to skate as if he were using the jet boots. We then removed the ramps, added Caine’s boots and animated the pursuing CG Greeghan. With the environment built in CG, we were able to replace camera moves when necessary and create fully-CG shots with a Caine digi-double for anything that was impossible to film practically, such as when the pair crash through a pane of glass.”
What did the studio ask for in terms of a certain VFX style or look?
“This show felt like an amalgamation of several visual styles; however, there was strong direction when it came to creature look development and motion style. It was important for the Keeper creatures to take on an insect-like aesthetic, and we spent several days in performance capture workshops, with Lana and Andy directing gymnasts in mocap suits through a variety of poses and movements to really flesh that out.
“Our realtime retargeting and rendering of these performances from Vicon Blade into MotionBuilder gave the directors instant feedback on those character models. We could experiment with inverting limbs and heads, for example, all in realtime to really create some nightmarish looks. These sessions proved extremely useful for driving the final look of these creatures.
“The winged reptile Sargorn characters provided an animation challenge as they flicked between calm conversations with humans, and moments of bestial rage. In those fight scene moments the emphasis was on making them as animalistic as possible — moving on all fours and using their tails, or beating their huge leathery wings. Again, using our realtime mocap techniques we were able to set up 'virtual mirrors' where a mocap performer could see himself as his creature avatar to really get a feel for how the character should move. This was done by projecting a render from a virtual camera in our realtime engine on a wall, allowing the actor to see himself reflected back.”
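Framestore's realtime engine isn't public, but as a rough, tool-agnostic sketch of the 'virtual mirror' idea Marshall describes, the snippet below simply keeps a wall-mounted render camera aimed at the streamed avatar's root each frame. The wall position, hip values and angle convention are made-up for illustration, and a true mirror feel would usually also need the projected image flipped horizontally, which isn't shown.

    # Illustrative sketch only: re-aim a wall-mounted 'mirror' camera at the
    # avatar's hips every mocap frame. All positions are made-up values.
    import math

    WALL_CAMERA = (3.0, 1.6, 0.0)   # where the projector's virtual camera sits (Y-up)

    def mirror_aim(hips, camera=WALL_CAMERA):
        """Return (yaw_deg, pitch_deg) so the camera looks at the hips.

        Convention (our own, for this sketch): Y-up, zero yaw looks down +Z,
        yaw turns toward +X, pitch is elevation above the horizontal.
        """
        dx, dy, dz = (h - c for h, c in zip(hips, camera))
        length = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        yaw = math.degrees(math.atan2(dx, dz))
        pitch = math.degrees(math.asin(dy / length))
        return yaw, pitch

    # Each frame, feed the solved hip position from the mocap stream.
    print(mirror_aim(hips=(1.0, 1.0, 0.5)))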
Any particular challenges you encountered?
“Framing and eyelines in scenes with numerous CG characters and sprawling environments are always a challenge; however, we worked with NCAM on-set, who provide realtime compositing of live-action and CG elements, to try to address those issues. The challenge there was that animation elements had to be dynamically triggered to respond to certain events on set, in the context of the shot.
“I wrote a suite of plug-ins in Autodesk MotionBuilder to cue, trigger and warp animation events — be it creature motion, spaceships taking off or actors on motion paths. All this could be controlled from a MIDI input device with events mapped to buttons, and timewarping mapped to a jog wheel. Overall VFX supervisor Dan Glass could then radio to us when he wanted the events to fire as the shot was rolling. This worked out very well for us.
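Marshall's plug-in suite isn't public, but as a minimal sketch of the kind of cue-and-scrub control he describes, the snippet below maps MIDI note-ons to transport cues and a jog-wheel controller to frame nudges via MotionBuilder's Python API (pyfbsdk). It assumes the python-rtmidi package is importable inside MotionBuilder's Python environment; the note numbers, controller number and cue frames are purely illustrative.

    # Minimal sketch, not Framestore's actual tools: fire cues and scrub the
    # MotionBuilder transport from a MIDI controller.
    import rtmidi
    from pyfbsdk import FBPlayerControl, FBSystem, FBTime

    player = FBPlayerControl()
    system = FBSystem()

    # Hypothetical cue sheet: MIDI note -> frame where a pre-animated event
    # starts (creature motion, a ship taking off, an actor's motion path).
    CUES = {60: 1001, 62: 1250}

    JOG_CC = 0x10     # controller number of the jog wheel (illustrative)
    JOG_STEP = 2      # frames nudged per jog tick

    def on_midi(event, data=None):
        """Note-on fires a cue; the jog wheel scrubs the current time."""
        message, _delta = event
        if len(message) < 3:
            return
        status, num, value = message[0], message[1], message[2]
        if (status & 0xF0) == 0x90 and value > 0 and num in CUES:   # note-on
            player.Goto(FBTime(0, 0, 0, CUES[num]))                 # jump to the cue
            player.Play()                                           # and roll
        elif (status & 0xF0) == 0xB0 and num == JOG_CC:             # control change
            frame = system.LocalTime.GetFrame() + (value - 64) * JOG_STEP
            player.Goto(FBTime(0, 0, 0, int(frame)))
        # A production tool would marshal these callbacks onto the main thread
        # and remap time on the clips themselves rather than scrubbing the transport.

    midi_in = rtmidi.MidiIn()
    if midi_in.get_ports():
        midi_in.open_port(0)            # first available MIDI device
        midi_in.set_callback(on_midi)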
“Whenever the hero Sargorn character (Greeghan) was on-set, our supervising mocap TD, Matt Rank, captured facial animation with four 720p cameras mounted to a helmet rig (the Vicon Cara system). Although the final facial motion was hand-crafted, this data was great reference for seeing exactly what the actor had done at the time. The Sargorns' long, lizard-like skulls made dialogue interesting, especially as their teeth are fused to their lips, so it was great to have that reference.
“When capturing the Sargorn creatures, we had mocap actors on stilts (heavy, sprung lower-leg attachments), which forced us to really think about how the leg should solve. It was more of a 'dog-leg' kind of anatomy, so we had to get creative with our marker sets and make sure the resulting motion was faithful to how Lana and Andy had envisaged it. Fortunately, with Vicon's Blade software, we weren't locked into a rigid template for marker solving, so we could experiment with how we captured that motion.”
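The Blade marker-set and skeleton-template work itself isn't shown anywhere public, but the toy planar solve below illustrates the geometry a leg solver has to resolve: given hip and ankle positions and segment lengths, the law of cosines fixes the knee flexion, and a separate hint decides which of the two mirror solutions the joint takes. Segment lengths and positions are made-up values; the stilted 'dog-leg' adds a long metatarsus as a third segment, with extra markers disambiguating the bend.

    # Toy illustration only: a planar two-bone leg solve via the law of cosines.
    import math

    def solve_leg_2d(hip, ankle, l_femur, l_tibia, bend_forward=True):
        """Return (hip_angle, knee_flexion) in radians for a planar two-bone chain."""
        dx, dy = ankle[0] - hip[0], ankle[1] - hip[1]
        d = min(math.hypot(dx, dy), l_femur + l_tibia - 1e-6)   # clamp unreachable targets
        # Knee flexion (0 = straight leg) from the law of cosines.
        cos_knee = (l_femur**2 + l_tibia**2 - d**2) / (2.0 * l_femur * l_tibia)
        knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
        # Femur direction: aim at the ankle, offset by the interior hip angle,
        # choosing one of the two mirror solutions (the bend-direction hint).
        cos_hip = (l_femur**2 + d**2 - l_tibia**2) / (2.0 * l_femur * d)
        offset = math.acos(max(-1.0, min(1.0, cos_hip)))
        hip_angle = math.atan2(dy, dx) + (offset if bend_forward else -offset)
        return hip_angle, knee

    # Made-up example: hip 1m above a low ankle target, half-metre segments.
    print(solve_leg_2d(hip=(0.0, 1.0), ankle=(0.2, 0.1), l_femur=0.5, l_tibia=0.5))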
What were some of the key tools you used for the VFX?
“Autodesk Maya and The Foundry Nuke for the more traditional CG pipeline. For the motion capture and realtime element, we used Vicon Blade and Cara, and Autodesk MotionBuilder. Maya and Nuke were obvious choices — our CG pipeline is built around those tools, and for mocap we have found that Vicon's tools get us great solves in realtime, with stability even when capturing numerous performers simultaneously.
“MotionBuilder has been the go-to application for virtual production ever since Avatar really brought it back from the chopping block, and I don't see that changing just yet. Its rich feature set and editing tools make it perfect for this kind of work.”
What are some of your favorite scenes that you created for the film?
“For me, operating realtime on the Chicago chase sequences alongside the NCAM crew was most fun — trying to dynamically warp motion of the digital set flying past Tatum and Kunis while they skated on giant green treadmills suspended 20 feet over the set was a great challenge, and not your everyday VFX task. In terms of pure CG creature work, I think the clinic sequence/Keeper shootout showcased some great work across the gamut of disciplines here at Framestore.”
What was the timeframe like on the film?
“Framestore first got involved in pre-prod sometime around Q4 2012, with a team of animators spending six months carrying out animation tests at Leavesden to help the directors plan the sequences. Delivery of shots continued until early 2014. I was personally involved from January 2013 to July 2013.”
Any special techniques used for shooting or in post that affected your work with visual effects?
“Working with NCAM meant we had to rethink some of our toolsets where virtual production was concerned. We were also getting rough camera tracking data out of the NCAM pipeline, which meant postvis could happen on-set in parallel with principal photography. Back at base, our tracking teams were unfamiliar with this kind of delivery, and re-integrating those cameras into plates where there was sometimes no world-space reference was a unique challenge.”
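The delivery pipeline itself isn't documented here, but as an illustration of the no-world-space-reference problem Marshall describes, the sketch below estimates the similarity transform (rotation, uniform scale and translation) that best registers one set of corresponding 3D points, such as per-frame camera positions from an on-set solve, onto another, using the standard Kabsch/Umeyama method with NumPy. The data at the end is a synthetic self-check, not production data.

    # Illustrative sketch: register one camera path into another coordinate frame.
    import numpy as np

    def register_similarity(src, dst):
        """Least-squares R, s, t such that s * R @ src[i] + t ~= dst[i]."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        U, S, Vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
        flip = np.array([1.0, 1.0, d])
        R = (Vt.T * flip) @ U.T                       # optimal rotation (Kabsch)
        s = (S * flip).sum() / (src_c ** 2).sum()     # uniform scale (Umeyama)
        t = dst.mean(axis=0) - s * R @ src.mean(axis=0)
        return R, s, t

    # Self-check with synthetic data: a known rotation, scale and offset is recovered.
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1                                 # keep it a proper rotation
    src = rng.normal(size=(50, 3))                    # e.g. on-set camera positions
    dst = 2.0 * src @ Q.T + np.array([1.0, -3.0, 0.5])  # same path in plate space
    R, s, t = register_similarity(src, dst)
    print(np.abs((s * src @ R.T + t) - dst).max())    # ~1e-15: registration recovered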
How many people from your studio worked on the VFX?
“150 in London and 50 in Montreal.”
How important were the visual effects to the overall storytelling of the film?
“Like any show of this genre, the VFX are a huge component. Physical set construction, especially for sequences like Balem's Boardroom, was minimal, so the VFX really brought a sense of grandeur and scale to the narrative.”
How different was this film from other films you worked on?
“This show felt like a real effort to further blur the lines between production and post, with realtime compositing, on-set postvis and motion capture all enabling the directors to tell the story they want to tell with more realism and more efficiency than we have seen in the past.”