SAN FRANCISCO — After more than 30 years of Star Trek on the big screen, it must be a challenge to keep boldly going where no man has gone before, but director J.J. Abrams is up for the task.
He returns with the 12th installment in the franchise, Star Trek Into Darkness, the sequel to his 2009 Star Trek film and an imaginative new look at the young officers on board the USS Enterprise, who are called home to find a powerful villain who has attacked Starfleet and left Earth in chaos.
Industrial Light & Magic (www.ilm.com), which has a number of Star Trek features to its credit, led the VFX team, with Roger Guyett as VFX supervisor and second unit director, the same roles he filled for Abrams four years ago. Guyett and his team at ILM earned a visual effects Oscar nomination for Star Trek.
Four years can seem like a lifetime in terms of advances in technology, so what’s different about the VFX in this iteration of Star Trek? “A lot has changed in four years,” says Guyett. “The processes have become exponentially faster. But it’s sort of like your closet: No matter how big you make your closet, you somehow manage to fill it. The same thing with technology — no matter how quick the process, you invent new ways of testing and stressing it out so you end up back where you were. It’s quite a predicament!”
ILM modified its pipeline somewhat for Star Trek Into Darkness, using Solid Angle's Arnold renderer instead of RenderMan. Relatively new to ILM's software arsenal, Arnold is a ray tracer that enables artists to more accurately calculate how light from a single source behaves, an important capability since single-source lighting "is very much the style of Star Trek," Guyett says. "Audiences may not necessarily notice the difference, but the process is more scientifically accurate."
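The single-source look Guyett describes comes down to ray tracing direct illumination: firing a shadow ray from each surface point toward the one light, and applying inverse-square falloff and Lambert's cosine term when the path is clear. The sketch below is a minimal, generic illustration of that idea (occluders reduced to spheres), not anything from ILM's or Arnold's actual code:

```python
import math

def direct_light(point, normal, light_pos, light_intensity, occluders):
    """Direct illumination at a surface point from a single point light.

    Traces a shadow ray toward the light; if any occluding sphere
    (center, radius) blocks it, the point is in shadow. Otherwise
    intensity falls off with the inverse square of distance, scaled
    by Lambert's cosine term.
    """
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(c * c for c in to_light))
    direction = [c / dist for c in to_light]

    # Shadow ray: ray-sphere intersection against each occluder.
    for center, radius in occluders:
        oc = [p - c for p, c in zip(point, center)]
        b = 2.0 * sum(d * o for d, o in zip(direction, oc))
        c_term = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * c_term
        if disc >= 0:
            t = (-b - math.sqrt(disc)) / 2.0
            if 1e-6 < t < dist:  # hit lies between the point and the light
                return 0.0       # fully shadowed

    cos_theta = max(0.0, sum(n * d for n, d in zip(normal, direction)))
    return light_intensity * cos_theta / (dist * dist)
```

A production ray tracer adds area lights, soft shadows, and global illumination on top of this, but the single hard key light falling off by distance is what gives the "one bright source in darkness" look the film favors.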
STEREO & IMAX
Star Trek Into Darkness is also a stereo 3D film, which added to the complexities of the VFX. “In traditional 2D filmmaking you can composite an element from another scene, but in 3D, with the big moving camera shots that J.J. loves, you expose the real parallax and depth of the scene, so everything has to fit in the scene in terms of dimension and depth — you have to have a spatial match. Any mismatch will literally give viewers a headache.”
The film’s live-action was shot in standard 2D and converted to 3D by Stereo D. But, since “a tremendous amount of the movie is invented with VFX, we could generate everything to match the depth correctly,” Guyett explains. “We rendered with two cameras, with two eyes for fully-dimensional 3D.”
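The "two cameras, two eyes" rendering Guyett mentions works because each point's horizontal offset between the left and right images, its parallax, is a function of its depth. A minimal sketch of the standard model (my simplification, not ILM's stereo pipeline) for parallel cameras converged by an image shift:

```python
def screen_parallax(depth, interaxial, convergence):
    """Horizontal parallax for a point at `depth`, rendered by two
    parallel cameras separated by `interaxial` and converged (via
    image shift) on a plane at `convergence`.

    Zero at the convergence plane (the point sits "on the screen"),
    positive for deeper points (behind the screen), negative for
    closer ones (in front of the screen).
    """
    return interaxial * (depth - convergence) / depth
```

This is why a composited element with the wrong depth stands out in stereo: its parallax disagrees with everything around it, producing the spatial mismatch (and the headaches) Guyett warns about.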
Close to half the film was shot with IMAX cameras, he adds. "We decided that any exterior shot or any shot in space would be an IMAX moment. J.J. was a producer on the last Mission: Impossible movie and he came up with the idea to open out a scene from anamorphic to IMAX, and that format change was very successful. Now, he wanted to go a step further, so any exterior or major action shot is IMAX, which lends itself to showing off the impressiveness of big vistas."
THE VISUAL EFFECTS
Guyett’s involvement with Star Trek Into Darkness began in summer 2011, and shooting commenced the following January at Sony Pictures, which housed the bridge of the Enterprise and more ship interiors than ever before, and at Raleigh Studios at Playa Vista, where all the big greenscreen and exterior partial sets were built.
There were about 1,700 VFX shots in the finished movie, and VFX producer Ron Ames organized the workload. ILM handled the bulk of the complex work, with some 500 shots divided among teams working at the Presidio in San Francisco and in Singapore. Luke O’Byrne took on production responsibilities for ILM.
Pixomondo’s Ben Grossmann and Atomic Fiction’s Kevin Baillie were the VFX supervisors. Another VFX team, called the Kelvin Optical Company, was formed for the production and worked out of Bad Robot’s offices on 600 to 700 shots. “They were a mini-VFX company — J.J. has done that on just about every production,” says Guyett. “It’s a very effective way to get a lot of work done, from clean-ups to quite complicated shots, including matte paintings and set extensions, and to get more bang for your buck.”
Halon, under Brad Alexander’s supervision, created the previs, then segued to postvis duties, which helped determine how to fill any VFX gaps once editorial began stitching together scenes.
The ILM modeling team that had worked on Abrams’ Star Trek “was able to go back and leverage designs, like the Enterprise, and upgrade them,” says Guyett. Bruce Holcomb served as digital model supervisor.
Some of the most interesting shots to create were the scenes on Earth, reports Guyett. “The previous movie established the Earth aesthetic. We assume that we’re not so far in the future that everything has disappeared; aspects of London and San Francisco are recognizable. We built out cities and created a vision of the future to a certain extent in the last movie, but for this one we expanded those ideas, shooting at as many real locations as possible, which of course, had to be heavily augmented.”
Production designer Scott Chambliss set the tone for the look of London and San Francisco, and co-supervisor Pat Tubach and ILM art director Yanick Dusseault planned the cityscapes and filled out the streets. Guyett tried to shoot some aerial plates of San Francisco and LA but found that post-9/11 air space restrictions made it “difficult to capture any footage that was useful. We couldn’t feel as if we were in amongst the buildings.” Although some reference photographs were used, “ultimately we had to create those photoreal cities — a testament to the skill of our artists.”
3D projection mapping sometimes served as a starting point for the cityscapes, but extensive camera moves “made it impossible not to build a bit more dimensionally,” Guyett explains. Different lighting set-ups were required for almost every shot to match the light of foreground elements.
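Projection mapping of the kind Guyett describes textures geometry by casting a plate back through the camera that shot it: each surface point picks up the pixel it would land on in the original photograph. A minimal pinhole-camera sketch of that lookup (an illustrative simplification, not ILM's tooling):

```python
def project_uv(world_point, cam_pos, focal_px, width, height):
    """UV coordinate a world-space point receives when a plate is
    projected from a pinhole camera at `cam_pos`, looking down -Z,
    with focal length `focal_px` expressed in pixels.

    Returns None for points behind the projector camera.
    """
    x = world_point[0] - cam_pos[0]
    y = world_point[1] - cam_pos[1]
    z = world_point[2] - cam_pos[2]
    if z >= 0:
        return None  # behind the camera; plate cannot reach it
    # Perspective divide, then normalize pixel offsets to 0..1 UV space.
    u = 0.5 + (focal_px * x / -z) / width
    v = 0.5 + (focal_px * y / -z) / height
    return (u, v)
```

The limitation the article notes falls straight out of this model: the projection is only valid from viewpoints near the original camera, so once the shot camera moves far enough to see around the photographed surfaces, the team had to build real dimensional geometry instead.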
Guyett went on location at an old Budweiser factory in LA to borrow the mechanical look of pipes and other infrastructure for some engineering aspects of the Enterprise. The National Ignition Facility at Livermore provided more intricate high-end technology backdrops. “We augmented these locations digitally, but they provided a tremendous amount of visible technology and production value,” he points out.
He also shot some downtown LA streets as a basis for shots augmented in CG. “J.J. believes in photographing what you can, then it’s up to us to futurize the footage without losing the human aspects of it.”
The exciting sequence that takes place in a red jungle posed a number of challenges. Initial discussions about shooting in a real jungle then digitally manipulating the footage segued to building a small practical jungle, extending it digitally and adding a volcano and lava. “The lava was one of the most complex simulations I’ve ever seen done at ILM,” says Guyett. “Dan Pearson, simulation and FX supervisor at ILM, created some amazing processes to control the flow of the lava and how it reacts to the environment. Digital environment supervisor Barry Williams took a tiny set and built a digital extension of it for huge shots of the tribesmen chasing our heroes through the jungle. The scene ends with the Enterprise coming out of the water in a massive CG water simulation — it took weeks to run all the elements in that shot.
“What I love about Star Trek is that besides the cool space shots, you also get to create amazing planets that behave in seemingly unnatural ways. It’s a lot of fun!”
The on-set VFX team deployed a computer-controlled NavCam wire rig, loaded with lighting equipment, to mimic the motion of a small hovering ship attacking a building, so that the spotlights from the ship itself would match up when the CG ship was added in post. “This was a great collaboration between the various departments involved: electric, grip, and, of course, the VFX team. The final result is well worth the effort and you never doubt that the lighting is coming from our CG ship,” says Guyett.
ILM occasionally augmented the prosthetic make-up on creatures and aliens, and did a considerable amount of digital double work when even stunt performers couldn’t safely execute certain moves. Animation supervisor Paul Kavanagh sometimes stepped in and performed the roles of key characters.
Although most of the animation was done with Autodesk’s Maya, the ILM R&D team heavily supplemented the software toolset with custom add-ons, proprietary shaders and texture mapping systems. ILM’s Plume system generated smoke, atmospheric effects and realistic pyro. The Foundry’s Katana was the lighting tool.
Looking back at the enviable “predicament” ILM faced with expanded technical capabilities and boundless imagination, Guyett says the experience of Star Trek Into Darkness was “an interesting dynamic: art meets science.” Everyone involved “was driven by the desire to make spectacular images” and capitalize on “new fire power to do it.”