Apple TV+’s Masters of the Air is based on Donald L. Miller’s book of the same name, and follows the men of the 100th Bomb Group (a.k.a. “Bloody Hundredth”) as they conduct dozens of bombing raids over Nazi Germany. The nine-episode series looks at the struggles the crews face at 25,000 feet, including freezing temperatures and lack of oxygen, along with the psychological and emotional tolls that wear on the young men as they help to destroy Hitler’s Third Reich.
The series was executive produced by Steven Spielberg, Tom Hanks and Gary Goetzman, and was split up among a team of directors that included Anna Boden, Ryan Fleck, Cary Joji Fukunaga, Dee Rees and Tim Van Patten.
Austin Butler and Callum Turner star in the series, which also includes Anthony Boyle, Nate Mann, Barry Keoghan, Rafferty Law, Edward Ashley, Jonas Moore, Elliot Warren, Matt Gavan, Branden Cook, Josiah Cross and Ncuti Gatwa.
Stephen Rosenbaum (www.stephenrosenbaum.net) served as the show’s visual effects supervisor, coordinating work among a number of studios, including The Third Floor, Dneg, Whiskeytree, Rodeo FX and Weta Digital. Here, he reflects on the VFX needs of the show and its challenges.
Hi Stephen! Tell us about your background.
"I’ve been an independent supe for years. I will occasionally associate with a specific VFX vendor, but most of the time, I’m freelance, hired by the studios.”
How did you get involved in Masters of the Air?
“I got a call from somebody who was working on the show. Early on, they were deciding what direction to go — both creatively and technically. They were looking around and saw my credentials, and they called me up.”
You’ve worked on big features, such as Avatar and Kong: Skull Island. How does this streaming series compare?
“It was like doing three movies, back-to-back. Sometimes not even back-to-back — sometimes on top of each other. Most visual effects movies are about a half to a third the size. In terms of scope and complexity, this was definitely a beast. And it grew exponentially as we were prepping and then shooting.”
Can you talk about the different contributors and their responsibilities?
“I hired (The) Third Floor to do previs, so they were on from the early days of development and then into prep. They helped me build out the concepts of mostly the mission sequences — the battles. Sometimes they would help me visualize sequences that were not battles, somewhat abstract in nature and in terms of the narrative, and we needed to visualize what that would be before we got into shooting. So they were my vendor for previs. The other primary vendors were — aside from Dneg — a company called Whiskeytree, Weta Digital and Rodeo FX. I had some secondary vendors as well. One of note was The Distillery. But the principal four vendors in post-production were Dneg, Whiskeytree, Weta Digital and Rodeo FX.”
Did they focus on specific episodes or scenes, or across the entire series?
“They worked pretty much across the board. Weta was the outlier in that respect; they only worked on Episode 9. But the other three principal vendors worked across all the episodes, and we cast them into certain roles. For example, Rodeo FX principally did environment extension work, but they also did some in-air battle sequences. The same with Whiskeytree, for that matter. They did some plane sequences and then also did some environment extensions, whereas Dneg principally just stuck to air-battle sequences.”
You’ve embraced virtual production in your feature film workflows in the past. What was the extent of using LED volumes in this case?
“The virtual production work we had done specifically in this case was building LED walls. We had three-and-a-half volumes. I’d say 90 to 95 percent of that work was plane work; it was mostly interior plane shooting. For example, our primary volume was a large, horseshoe-shaped configuration. It stood about nine meters high by about 15 meters wide. At the opening of the horseshoe, we had inserted a large motion base on which we rested plane set pieces, primarily the nose and cockpit interior set pieces. The actors would be inside of these sets, and we would have cameras inside, and sometimes strapped to the outside of the sets, shooting the actors. If you want to break it down, anything that was facing inward, looking at the actors, or interior, looking out over the actors, was a set, usually shot up on the volume. Anything wider than that was computer generated.
“We would intercut with interior shots of the actors reacting, and then reverse cameras over their shoulders looking out at the action. In that case, they were able to respond to the content that we played back on the walls, and that content was previs. When people are doing LED wall work, they’re trying to play back what we call ‘final-pixel’ content. In this case, the schedule didn’t warrant us preparing final-pixel quality. It was just too much content, nearly five-and-a-half hours of it…So I made the decision to just play back previs content, which was the right decision in the sense that it gave the actors an idea of what the action was out the windows. Their eye lines were accurate. The lighting was accurate. You could get interactive explosions, flak blasts and so forth. The camera (operators) knew how to respond as well, so as a German fighter flew past the windows, they could whip pan with it. There was a total understanding with just enough playback from the previs.
“From that, we then had to replace most of the content that was out the windows using rotoscoping techniques.”
How many visual effects shots are there across the nine episodes?
“It ended up growing, basically double in size. We ended up completing around 3,400 visual effects shots. Originally it was about half that.”
There are some great shots, but one that stands out appears at the beginning of Episode 9.
“The Berlin mission at the top of Episode 9! I would agree with that. That’s going to go down as one of the great openings of TV streaming history, if there is a history in TV streaming (laughs). There was so much sophistication in terms of the choreography, the action and the performances in that scene. We go from 20,000 feet in the midst of one of the largest air armadas…When [we] were reading this from the original script, it was, 'OK, how do we play this out in the five-minute sequence?’ It’s so complex and so sophisticated. It was with the help of just plugging through it, first with some storyboards and then with some previs, to choreograph the action and try to uphold the intensity and the cinematic qualities of the action. I think we succeeded. I’m very proud of that sequence. We probably had about 800 planes — computer generated planes — in the sky, plus all the clouds, plus all the flak. And that included B-17s and P-51s, so it was quite a feat.”
What was the timeframe for this to all come together?
"I started in January of 2021 — right at the height of COVID, I might add — and moved to England for a year, where there was no vaccine. We managed to shoot, which was amazing. It was quite a journey — quite intensive COVID protocols. But that was January 2021, and then I wrapped in October of 2023. It was about 33 months.”
Where did they have the LED volumes set up?
“We converted a warehouse space, north of London. They took a lease on big, multiple warehouses and essentially converted them to stage space.”
What are your thoughts on the state of virtual production?
“(We) introduced virtual production on Avatar. I started on Avatar in 2007, and it came out in 2009, so it has been ten-plus years, and this really is maturing in terms of its use. People are figuring out how to leverage it effectively and efficiently in production. I think, up until a few years ago, it was kind of a nice-to-have or a novelty experience. A lot of directors were uncertain about it. They were curious and uncertain about how to best use it. Now, I think there’s a better understanding, particularly on a show like this, where the directors didn’t have to necessarily commit to the content that was being played back on the walls.
"I would go through and block out the action with Third Floor via the directors’ storyboards. Then I would take it to the director and they would give me notes, and then we would address those notes with The Third Floor, so when we got into shooting, it was effectively content that the directors had contributed to and were satisfied with as a proxy to the action that they ultimately wanted to see in post.”
Are you on to your next job at this point?
“Not yet. There’s a few things that are percolating out there, but nothing yet.”
WHISKEYTREE
San Rafael, CA’s Whiskeytree (https://whiskytree.com) was brought in to work on 19 shots for Masters of the Air, but that order grew over the series’ long schedule, ultimately rising to a total of 283.
According to Whiskeytree VFX supervisor Aidan Fraser, the studio’s contributions spanned the entire series, and began with basic, out-the-window shots that replaced previs elements shot on an LED volume.
“A lot of the scenes we were working on were kind of where they’re on the ground,” Fraser explains. “(They were) less of the big action scenes, but required creating that photoreal environment: slower-paced scenes. We’re used to working on big action scenes, where shots are less than three seconds long. Here, we were looking at a lot of really long takes. So that was interesting. We really got to stare at the work for a while.”
He points to a scene where the crew is grounded due to intense fog. The scene was shot with a gray-screen backdrop and then the Whiskeytree team created the low-visibility environment.
One of Whiskeytree’s more intricate contributions was a long establishing shot of Nazi-occupied Paris. The visuals are part of a three-minute scene that includes a view from a train and then expands to show period-accurate architecture that’s been adorned with German flags.
“All of that detail was really fun to kind of research and get into,” says Fraser of the city’s architecture.
Fellow VFX supervisor Brian Meanley agrees.
"There was a lot of research that went into it. Aidan had done a lot of good work to upfront, like trying to find the right location to base our shots in. And then from there, we could look at the actual map data to make sure we got historically-accurate buildings, streets and the trains that were running through the city. We wanted to make sure we started with that data upfront. And then things had to be art directed a bit too, so we had to remain flexible. But because it was such a large, vast shot, a lot of it [had] to be done procedurally. So that’s where (lead digital artist) Pierre (Nahoum) came in.”
The studio’s VFX skills landed them another complicated shot — a scene set just after the Allied forces take the coast of France.
“(VFX supervisor Stephen Rosenbaum) liked working with us,” recalls Fraser. “Our work was going over well, and he said, ‘How about we give you D-Day?’ It became one of the shots that’s in the opening sequence — that’s quite an undertaking. We’re up in the air, from the pilot’s perspective, looking down, seeing not only squadrons of B-17s and P-51s, but basically everything that has happened on Omaha Beach after the battle, and seeing all of the organizing of troops, vehicles and tents. Just an enormous amount of detail to put into those shots.”
That particular shot went through several iterations, notes CG supervisor J.P. Monroy.
"We kind of used a bit of that early part of the shot creation to define the amount of things happening in the scene,” Monroy recalls. “We had a lot of flexibility in moving things around and getting sort of the right framing, the right composition of boats…You kind of build the assets and then you start to lay them out and get a good read first. Then, you go from there with iterations and an art direction.”
Whiskeytree uses Autodesk Maya as its asset creation tool and Houdini for procedural aspects, while Gaffer is used more for layout and scene assembly.
They also developed a system to procedurally add bullet holes and flak tears to plate footage of actual bombers to represent the damage they received from their respective missions. Fraser describes them as extremely detailed effects that were added at render time.
“(It) also gave us creative flexibility, so we [could] move that damage around the plane and kind of see it as we’re working on it,” he explains. “We didn’t have to be locked down to one kind of look, and we could work with continuity.”
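Fraser doesn't detail the implementation, but the core idea (deterministic, seed-driven scattering of damage decals that the renderer applies on top of the clean asset) can be sketched in a few lines. Everything here is hypothetical: the names, the UV parameterization and the size ranges are illustrative, not Whiskeytree's actual system.

```python
import random
from dataclasses import dataclass

@dataclass
class DamageDecal:
    u: float       # position on the fuselage UV map (0..1)
    v: float
    radius: float  # decal size in UV units
    kind: str      # "bullet" or "flak"

def generate_damage(seed: int, n_bullets: int, n_flak: int) -> list[DamageDecal]:
    """Deterministically scatter damage decals: the same seed always yields
    the same pattern, so a shot's damage can be regenerated, shifted or
    dialed up without re-authoring the underlying plane asset."""
    rng = random.Random(seed)
    decals = [DamageDecal(rng.random(), rng.random(),
                          rng.uniform(0.002, 0.01), "bullet")
              for _ in range(n_bullets)]
    decals += [DamageDecal(rng.random(), rng.random(),
                           rng.uniform(0.02, 0.08), "flak")
               for _ in range(n_flak)]
    return decals
```

At render time a shader would cut opacity or reveal a torn-metal texture inside each decal; changing the seed, or nudging the (u, v) values, is what lets artists "move that damage around" while keeping continuity from mission to mission.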
DNEG
Dneg (www.dneg.com) drew upon its global network of studios to complete its visual effects contributions to Masters of the Air. According to VFX supervisor Xavier Bernasconi, as many as 400 artists across studio locations in Sydney, India, Prague, London, Montreal and Vancouver had a hand in delivering nearly 2,000 shots for the series, spanning Episodes 1 through 8.
Bernasconi (pictured) says he joined the show back in April of 2021 and spent close to six months with the production while it was shooting in London. Dneg’s work focused on the series’ aerial sequences, which feature B-17 bombers, P-51 Mustangs and the German Luftwaffe. The studio also contributed to the show’s virtual production, recreating previs content at a high level of detail for partner Lux Machina (www.luxmc.com) to display on the production’s LED volumes.
“Basically, The Third Floor did the previs, and then gave it to Dneg Dimension to recreate to make it as high quality as possible, considering the limitation of the hardware,” notes Bernasconi. “And then, Lux Machina was responsible for putting [the content] on the screen…The Dneg Dimension team was responsible for the motion capture of the gimbal, as well as preparing any scene that was coming (up) next.”
The studio’s aerial work spanned takeoffs and landings at the airfields, as well as in-air battle sequences.
“It was a big undertaking,” says Bernasconi. “As soon as we finished the shoot, we went into asset production, starting to work on the B-17s, all the planes, all the environments. That was until 2023, when we delivered the last few.”
The virtual production shoot served as a starting point for the visual effects process, acting as a low-resolution reference from which to gain direction.
“At some points we had like five hundred B-17s, or three hundred German fighters,” says Bernasconi. “Not all of them were super high resolution because the highest resolution was millions and millions of polygons. We went down to the single rivet in terms of (the) model, so it was super, super heavy.”
Bernasconi continues, noting that there is a ‘Holy Grail’ concept, where the same asset can make its way through previs, all the way to final picture, but that often doesn’t work out.
“The needs for any stage of the production are slightly different. Previs is about being faster (and) able to iterate. You don’t want to have very heavy scenes, so your assets are a little bit lower resolution…I think that what was very good about the previs in the [virtual production] is that it informed us so much about what was coming. We knew the extensive amount of work.”
Bernasconi adds that they rarely had a clean shot, with so much camera gear and lighting appearing in the frame. In addition, the cloudscapes were all designed and directed later, and in much greater detail.
“Everything was CG,” he says of the aerial sequences. “The production team, the art department and the production design team did an amazing historical research job. For each mission, we had what we call a ‘mission book.’ Every mission book had visibility parameters throughout the mission. So at every moment in the mission, we knew what was the visibility distance. Because of that, we knew we had to reproduce it very accurately. We couldn’t just improvise or make it up. So what we did was, create what I called a ‘cloud atlas.’”
The cloud atlas had reference images that he would show to VFX supervisor Stephen Rosenbaum, who would then pick the atmospheric needs for each mission.
“Then, what we did was create those images,” says Bernasconi. “We sculpted very simplistic 3D representations of the clouds. Once we had that, we’d run simulations inside each of these little volumes: how gas and water act and change.”
When they achieved a desired result, they would freeze it in time, and then create clusters of clouds, and ultimately mega clusters that the planes would fly through. They also employed proprietary tools that were developed for Dune, which could recreate the scattering of light at certain angles based on the horizon.
“You could say it’s 3 o’clock, and it’s this particular latitude and longitude, and you get the right (lighting),” he explains.
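Bernasconi's "3 o'clock at this latitude and longitude" lighting rests on standard solar-position geometry. Here is a rough, self-contained approximation (not Dneg's proprietary Dune tooling) of sun elevation from day of year, local solar time and latitude:

```python
import math

def solar_elevation(day_of_year: int, solar_hour: float, latitude_deg: float) -> float:
    """Approximate sun elevation in degrees for a given day, local solar time
    and latitude -- the kind of inputs a physically-based sky rig is driven by."""
    # Solar declination approximation: +/-23.44 degrees at the solstices.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: the sun moves 15 degrees per hour away from solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = (math.radians(x) for x in (latitude_deg, decl, hour_angle))
    sin_el = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_el))
```

Over England in midsummer (day 172, latitude ~52°N) this puts the noon sun roughly 61° above the horizon, versus about 15° in late December, exactly the kind of difference a mission's lighting has to respect.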
The planes themselves were based on LIDAR scans taken of actual World War II aircraft on display at museums.
“We had very high resolution reference models,” he says, “(but) unfortunately, when you use LIDAR scans, the way that the model comes in is not really friendly for final images, so we needed to basically recreate them and then texture them, and then do the shading.”
Bernasconi says the B-17s were particularly challenging because of their camouflage matte paint.
“The B-17 was so difficult to recreate because of the specularity,” he recalls. “It was very matte…but a very broad specular, and it was almost like a very fine line between looking plastic (and not) exactly the right metal. Stephen is extremely detail oriented and really pushed us to be as accurate as possible.”
In addition to its proprietary tools, Dneg relied on Autodesk Maya for modeling, Adobe Substance for texturing, SideFX Houdini for effects, Clarisse for rendering and Foundry’s Nuke for compositing.
Bernasconi points to the mission in Episode 103 as a highlight of the studio’s work. In it, the 100th Bomb Group is called on to destroy aircraft manufacturing plants deep within Germany. The mission covers such a great distance that the bombers wouldn’t have enough fuel to return to their base. Instead, they fly on to an Allied base in North Africa.
“It was so huge,” he recalls of the sequence. “We traveled from England (to) Germany, Italy and then Algeria. We covered thousands of kilometers — big air battles. (It was) very complex and we had to choreograph it with Stephen. I think Episode 103 was the biggest one from my point of view.”
RODEO FX
Headquartered in Montreal, Rodeo FX (www.rodeofx.com) completed work on 404 shots across eight of the series’ nine episodes, with Episode 9 being its largest contribution. Patrick David (pictured, right) is a VFX supervisor at the studio and recently shared insight into the studio’s work on Masters of the Air.
Patrick, can you give us some insight into Rodeo FX’s VFX contributions?
“Our main body of work was on the three different POW camps our pilots encountered in the story: Stalag Luft III, Stalag Luft XIII and Stalag Luft VII. We were responsible for all those environment extensions, as well as the digital crowd to populate them. In Episode 109, Stalag Luft VII is liberated by an advance of American tanks, assisted by a P-51 strafing the camp. We handled all of the action in that sequence, including firing tanks, advancing American infantry, P-51 planes, tracer fire, explosions and destruction. In addition, we created the opening sequence in Episode 108 involving the renowned Red Tails bombing an Italian outpost while flying their four P-40 aircraft at night. We also contributed to some sequences at Thorpe Abbotts airbase involving taxiing B-17 aircraft preparing for takeoff, and many smaller one-off environment and FX shots throughout the show, such as an extension of a heavily-bombarded Nuremberg, snowy environments during the Long March sequence, and a line of flak guns firing at approaching B-17s in Holland.”
What were some of the VFX techniques used to complete the shots?
“For the POW camps, production had built around five real buildings with some fencing in an airfield in England. As these camps were all real places that we needed to extend and recreate in VFX, we were provided extensive reference and storytelling cues by the show’s overall VFX supervisor, Stephen Rosenbaum. He provided detailed ‘bird’s-eye-view’ plans and photographic period references of each of the real camps that our team used to accurately depict the size and conditions there. Although production probably had around 100 extras on-set, we needed to create digital crowds of POWs and German guards to populate the background environments with hundreds to thousands of prisoners, depending on the camp and the progression of WWII. Digital assets were created of each of the different POW costumes and accessories, such as coats, shirts, pants, boots (and) hats, which our crowd system, driven by Golaem, could mix and match to create unique individuals for shots that required thousands of digital extras. Bespoke mocap performances were acquired in our studios using two Xsens suits.
“We captured performances…of POWs shivering, walking exhausted, warming their hands. We also took care to capture the performance of two characters interacting with each other so that we could have more realistic and natural behavior in the background…For a night scene where German guards search for a radio, we attached a digital flashlight to our mocap guards so even our [background] guards were searching as they walked.
“Our CG camps were also built to match summer conditions as well as snowy winter state, complete with icicles and snow drifts on the barracks' roofs. In Episode 9, Stalag III is set ablaze by retreating German guards as the POWs begin the ‘Long March’ towards another camp. Production shot many elements of some of the real structures burning, which we combined in comp with FX simulated fire and thick smoke plumes emitting from the camp's barracks.”
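The mix-and-match approach David describes is, at its core, simple combinatorics. A toy sketch (hypothetical wardrobe slots and names, not Golaem's actual API) of how a handful of garment variants multiplies into many distinct background agents:

```python
import random

# Hypothetical wardrobe slots; a real library would hold modeled, textured assets.
WARDROBE = {
    "coat":  ["greatcoat", "field_jacket", "none"],
    "shirt": ["wool_shirt", "undershirt"],
    "pants": ["trousers", "patched_trousers"],
    "boots": ["jackboots", "worn_boots"],
    "hat":   ["side_cap", "knit_cap", "none"],
}

def dress_crowd(n_agents: int, seed: int = 0) -> list[dict]:
    """Give each background agent a randomly drawn outfit. Even this small
    library yields 3 * 2 * 2 * 2 * 3 = 72 combinations before texture and
    shader variation multiplies that further."""
    rng = random.Random(seed)
    return [{slot: rng.choice(options) for slot, options in WARDROBE.items()}
            for _ in range(n_agents)]
```

Pair each outfit with one of the captured mocap clips (shivering, walking exhausted, warming hands) and a camp of thousands reads as individuals rather than clones.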
How about the Red Tails sequence that opens Episode 8?
“We were provided excellent air-to-air reference plates of real P-40s and P-51s to study lighting and airplane dynamics. Great care was taken to use the correct plane serial numbers, nose art and markings that the real Red Tails flying this mission would have used. Digital matte sky and environment paintings, and a digital version of an Italian outpost were built and used as backgrounds. Our animation team carefully recreated the air flight characteristics spotted in the reference, such as buffeting, banking and how pilots used different flaps to pilot the aircraft. We also paid particular attention to respecting the fighter flight formation for the approach and to show how the fighters would ‘fold’ together to form a single line for the bombing of the outpost. To clarify the speed and distance of the planes in our full CG shots, our FX team laid out several layers of wispy clouds in each shot in order to create a parallax effect. Buildings were built by our assets team to have wooden support beams, rock walls and stucco layers to provide different layers and scales for our FX team to destroy.”
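The wispy-cloud trick works because a layer's apparent angular speed falls off with its distance from the camera. A back-of-the-envelope illustration (the speeds and distances are round numbers of my own, not the production's):

```python
import math

def apparent_angular_speed(camera_speed_mps: float, layer_distance_m: float) -> float:
    """Degrees per second at which a cloud layer directly abeam of a moving
    camera appears to sweep past (small-angle approximation: omega = v / d)."""
    return math.degrees(camera_speed_mps / layer_distance_m)

# At a cruise speed of around 80 m/s, a wisp 200 m off the wing streaks by at
# over 20 deg/s, while one 5 km away crawls past at under 1 deg/s. Stacking
# layers at different depths makes speed readable in an otherwise empty sky.
```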
What tools does Rodeo FX use for its VFX work?
“We use a lot of industry standard tools, such as Maya for animation, Katana for lighting, Arnold for rendering, ZBrush, Substance Painter/Designer, Golaem for crowds, Houdini for our FX simulations, Clarisse for our CG environments and set dressing, Photoshop for digital matte-painting work and Nuke for compositing. For our mocap we used Xsens suits, which capture the performer's motion without needing a large optical mocap stage, since sensors are placed at different joints in the suit to capture motion.”
Which shot would you say was the most challenging?
“The last shot of Episode 6 is a long, craning shot that pans off a group of POWs arriving at Stalag Luft III and reveals a very wide angle of the greater camp for the first time. This shot was challenging for its length — several seconds — and the amount of digital crowd in-frame that we needed to blend in side-by-side with real extras. Since the shot rises up so slowly and lasts so long, you have a really good look at everything in the frame. Great care was taken to vary the performances to not spot any two POWs doing the same action.”