Those About to Die: Virtual production supervisor James Franklin
October 8, 2024


Prime Video's Those About to Die is set in Rome, 79 AD, where the population has become bored, restless and increasingly violent. The masses are kept in line by two things: free food and spectacular entertainment. As the taste for entertainment becomes more jaded and bloodthirsty, a stadium designed for gladiatorial combat is needed: the Colosseum. Underground, thousands of people work and live among those who will die for the games.

Roland Emmerich and Marco Kreuzpaintner directed the program, which features Anthony Hopkins, Iwan Rheon, Sara Martins, Tom Hughes, Jojo Macari, Dimitri Leonidas, Moe Hashim, Jóhannes Haukur Jóhannesson, Gabriella Pession, Rupert Penry-Jones, Emilio Sakraya and Pepe Barroso. The show streams in the US on Peacock (www.peacocktv.com).



Here, virtual production supervisor James Franklin shares insight into the show's production, which spanned 10 episodes in its 2024 first season.

Can you describe the virtual production setup that was used on this project?
 
"Those About to Die was filmed using a combination of location shooting and filming with an LED volume setup curated and operated by Dimension and Dneg 360. The LED volume setup used was located at Cinecittà Studios in Italy, and was chosen due to its extensive history with large-scale productions, including HBO’s Rome, and the recent Ben-Hur movie. Plus, Cinecittà World — an adjacent amusement park — already had a chariot racetrack built for Ben-Hur, which provided an ideal location for the chariot racing scenes in Those About to Die.
 
"The set up at Cinecittà Studios included a 24-meter diameter revolving stage and an LED volume stage, which was eight meters tall and had a circumference of 51 meters in a U-shape, making it exactly what the production needed for large, immersive sets like the Colosseum and Circus Maximus. The revolving set allowed for an incredibly rapid turnaround for shooting new scenes and reverse shots. 
 


"We worked to upgrade the internal systems of the LED wall to support the needs of previs, virtual production and visual effects. We upgraded the hardware, which drives the LED wall, with new render nodes and faster connectivity, and also installed a new camera-tracking system, which reliably tracked the camera into Unreal. With months of shooting ahead of us and a very busy schedule, everything had to be robust and efficient.
  
"An onsite content-production studio was set up, and artists were flown in to work locally. This required a render farm and local storage, which was connected to our base back in London, enabling on-site artists to work in sync with our remote artists around the world. Working with the production designer, art director and set designer, our team worked in Unreal Engine to recreate both the Colosseum and Circus Maximus, as well as other expansive Roman environments and internal sets, with precision. The use of an LED volume also allowed the production and director the added benefit of flexibility to adapt backgrounds as needed and at speed."
  


What kind of timeframe were you up against?
 
"The production had to move quickly due to actor availability. This led to the team building an additional smaller LED wall for some scenes before the main LED wall was fully ready to be able to continue filming throughout summer."
 
What kind of efficiencies did the virtual production stage provide?

"As well as efficiencies when it came to re-dressing sets, or adapting backgrounds, the use of virtual production also allowed the production to adapt to the needs of actors too. Building an LED wall on demand to accommodate for actor availability is a great example of the efficiency that VP enables — it effectively avoided a total re-do of the production schedule. 
 
"The director is known for his shots at golden hour. The use of virtual production meant golden hour was available all day long. The revolving stage also meant that production could quickly switch between shooting directions and sets. This also meant that reverse shots were much faster than with traditional techniques."
 


How did shooting on a virtual production stage affect the need for further VFX in post production?
 
"Director Roland Emmerich’s approach was key to making virtual production work for the show. He and his team embraced the technology fully, which enabled the production to rely heavily on the virtual-production stage. As such, the production ended up with 1,800 shots accomplished in-camera on the VP stage, and only 800 shots then that required post production VFX, making for a 2-to-1 ratio of VP versus VFX – a highly unusual ratio, especially for large-scale productions like Those About to Die.
 
"When fixes to what was shot in-camera were necessary, they were intentional and minimal. For example, there were scenes where physical sand was used in front of the LED volume on-set, and this often required edits in post to blend the physical and digital sand for a cohesive image. Most adaptations were also completed during the shoot itself, minimizing the need for post fixes. In some cases, we had to shoot quickly on-set to save time and stay on-schedule, as such adjustments, such as color correction, were left for post production. But these adjustments were very manageable in post, as it turned out that we had less than 10 shots that needed those quick post production fixes."
 


Can you share some details on how volumetric capture enabled the 80,000-seat Colosseum to be filled with a much smaller number of people?
 
"The team used the Polymotion Stage, a collaboration between Dimension Studio and MRMC, to capture 500 individual volumetric performances from 90 actors, both in London and in Rome. These performances were used to create 32,000 animated characters by customizing the looks and creating variations - which then filled the 80,000-seat Colosseum. The use of volumetric capture also allowed for realtime crowd rendering in Unreal Engine, and the team could relight these digital characters in realtime, making it possible to simulate different lighting conditions and time of day, further enhancing the realism."
 


What can you share regarding the revolving stage at Cinecittà Studios?
 
"The revolving stage was highly efficient, as it eliminated the need to physically redress or reposition set. Using the rotation of the stage, the crew could quickly change scenes and camera angles. They could also quickly rotate the content on the LED wall, making it easy to shoot in any direction without delays.
 
"The revolving stage was a 24-meter-diameter platform, essentially a giant Lazy Susan, which enabled the team to shoot complex scenes with animals, like horses and giraffes, without having to shuffle the set around and redress. On average, it took roughly five minutes to do a complete 360. This feature also allowed for seamless transitions, faster reverse shots and continuous shooting, significantly speeding up production.
 
"The use of the rotating stage was particularly effective in scenes like the Ludus stage, where gladiators trained. Although the physical set only included a small amount of sand and a fence, Unreal Engine was used to extend the environment digitally and add details such as the stands and surrounding buildings. There were also digital characters walking back and forth in the background, enhancing the realism by adding further depth to what was being shown on the volume."