The term ‘virtual production’ means different things to different people. In fact, the Visual Effects Society (VES) and the American Society of Cinematographers (ASC) recently partnered to create a new resource that’s designed to help establish a common vocabulary for professionals working in the virtual production space. VPGlossary.com defines virtual production as using “technology to join the digital world with the physical world in realtime. It enables filmmakers to interact with the digital process in the same ways they interact with live-action production.”
Photo (top): Dimension
Nic Hatch, the CEO of Ncam (www.ncam-tech.com), and a former CG artist for MPC and Mill Film, says, without question, that 2022 will be the year of virtual production. The company’s Ncam Reality, which is designed specifically for realtime camera tracking, complements studios’ existing motion capture setups, making it well suited for in-camera VFX (ICVFX), where shots are finished in realtime. He details the pros and cons of ICVFX in an article on postmagazine.com.
Sony (pro.sony) has also been working on virtual production solutions and is pairing its Venice and Venice 2 cameras with the company’s Crystal LED wall technology to create cinematic images while reducing the need for additional post production. Head of new media sales and business development Kevin O’Connor shares his thoughts on the Post website, including insight into tests Sony conducted with Epic Games to fine-tune its ICVFX toolsets.
Here, Post talks with Steve Jelley of Dimension in the UK about their move into the virtual production business, and with Carlos Fueyo, who recently joined Eyeline Studio as their virtual art department’s art director/virtual production supervisor, following his own success creating the animated sci-fi short Replica, which received an Epic MegaGrant. And later in this feature, Ian Milham, who serves as virtual production supervisor at ILM, talks about his role at the studio and how their StageCraft virtual production stages are enabling a new way of creating high-end content, including The Mandalorian and The Book of Boba Fett.
Embracing the possibilities
Dimension, with studios in London and Newcastle, UK, considers itself a new breed of production studio — one that creates content using a combination of volumetric capture and realtime engines.
Steve Jelley (pictured), the studio’s co-founder and joint managing director, says Dimension has more than seven years of experience with leading-edge virtual production techniques, having gotten its start in immersive content production using Unreal Engine.
“We’ve built what you would now call a ‘virtual art department,’ though we didn’t know that’s what we were going to call it,” he explains. “We were an early pioneer of volumetric capture array-based photography and virtual humans/avatars. We also have a lot of skill set in terms of large-scale production, so we own and operate our own studios.”
The company’s move into virtual production was led, in part, by sustainability, he notes, and how they could do things more efficiently by reducing the need to fly large crews around the world for shoots.
“We started…doing some internal tests. And then when the pandemic happened, we found ourselves in the position of being able to do it at scale in the UK. We’ve been in continuous production for the last couple of years on virtual production jobs.”
While many of their current projects are still in production or for clients whose programming hasn’t aired yet, the studio has several examples of its capabilities on its website (www.dimensionstudio.co). In fact, Dimension partnered with Sky, Dneg and director Paul Franklin to showcase what’s possible using these latest techniques via a series of proof-of-concept videos. The examples call attention to a number of benefits of using virtual production, one being ‘instant locations,’ which play into the initial idea of being more efficient with travel, etc.
‘Durability of the day’ is another concept illustrated on the Dimension site. Rather than having to schedule shoots during the golden hour, which is limited by its timing, its duration and the unpredictability of weather, an LED volume can give productions an unlimited window in which to shoot in an optimal state. In addition, the acting talent has a more interactive experience, which tends to improve performances.
“Every shoot I’ve ever done, the actors are such fans of doing it this way, because they can see what they’re doing,” he explains.
“Our model is one of an end-to-end engagement in making a film or TV show, or an hour-long drama,” Jelley continues. “We will get involved right at the conceptual stage, when people will be thinking about, ‘What does this look like?’”
This includes meeting with the production designer, director and visual effects supervisor.
“We’ll be concepting what things might look like, whilst at the same time…going through a previs process,” he explains. “We will also run a techvis process, which is really just about: What does it look like in real life? How big are the LED screens? What kind of lenses am I going to use? Can I actually do a wide shot using a 35mm lens?”
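Techvis questions like those come down to basic lens geometry. As a rough sketch of the arithmetic involved (the focal length, sensor width and camera-to-wall distance below are hypothetical examples, not Dimension’s figures):

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view for a given lens and sensor pairing."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def wall_width_needed(fov_deg: float, camera_to_wall_m: float) -> float:
    """Minimum LED wall width the camera's frustum covers at a given distance."""
    return 2 * camera_to_wall_m * math.tan(math.radians(fov_deg / 2))

# Hypothetical setup: a 35mm lens on a Super 35 sensor (~24.9mm wide),
# with the camera six meters from the wall.
fov = horizontal_fov_deg(35, 24.9)
print(f"Horizontal FOV: {fov:.1f} degrees")                       # ~39.2
print(f"Wall width needed: {wall_width_needed(fov, 6.0):.1f} m")  # ~4.3
```

Answering the wide-shot question then becomes a matter of checking whether the stage’s wall is wider than that figure at the planned camera position.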
The ultimate goal, says Jelley, is to capture as much of the image in-camera as possible.
“If you’re shooting final pixel — you’re photographing it — that’s your image. You might do some things, like wire removal or something…but you’re not relying on VFX putting the majority of the image into the frame. So that is the goal: in-camera visual effects. But virtual production is wider than that.”
In his time working with LED volumes, Jelley says he has come to a number of conclusions: “They are better at some things than others. Environments, vehicles, those kinds of things [are phenomenal]. But water? Not so much.”
Dimension has a small stage of their own and will help configure larger stages when needed, based on the production. In fact, most of their work is produced on major soundstages throughout the world.
“You tend to need large stages,” he notes. “Once you put the range of lenses you want on it, you end up needing quite a big (LED) wall, so we tend to follow production decisions. We will build a stage to suit each production generally — build out the soundstage wherever they shoot — and we do that globally. We’ve done productions in the UK, North America and Eastern Europe.”
Partnerships, notes Jelley, are important in furthering the use of virtual production processes, and they played a large role in the demonstration videos featured on the studio’s website. Dimension has been working with Microsoft on many levels, particularly volumetric capture. They’ve also had a long-standing relationship with Dneg as a visual effects partner. Sky is a long-standing client and is looking to explore virtual production as an option for its content.
“I think there is a myth that only the very large scale Hollywood productions can use this,” says Jelley of the technology. “That’s one of the reasons we did that shoot, because we wanted to prove that that wasn’t the case.”
Right now, Jelley feels virtual production exists in its 1.0 iteration, and that both growth and opportunity are on the horizon.
“The technology works. It’s complicated, but it definitely works. Innovation? That’s going to continue in this space…Where the real innovation is going to come is when people start writing for this technology, when they start writing scripts for it. So if I’m sitting down, writing my series, and I wanted to set it on a space station, but I never thought I could, now I can! I can do it all in an LED volume. Just as people wrote great westerns for the western sets they had in the ‘40s, you’ll find people writing scripts for LED stages, writing around them, and writing content that’s actually (for) what this technology can do best. That’s going to be the biggest change. And that’s already happening. I can see that in scripts that I’m reading now.”
A career in virtual production
Carlos Fueyo’s path into the world of virtual production was somewhat indirect. He got his start in animation for entertainment back in the early 2010s, after transitioning from an architectural background. He worked on director Scott Stewart’s Priest and, later, director Roland Emmerich’s Independence Day: Resurgence. Happy Mushroom then contacted him to work on a Marvel movie, and that’s where his virtual production/studio career officially began.
He and his wife Tanya also founded Playard Studios in Miami (www.playardstudios.com), where they created Replica, a sci-fi animated short that was an Epic MegaGrant recipient. Replica, says Fueyo, will, in time, span multiple episodes, showing off the capabilities of Unreal Engine, as well as the tools that helped empower its creation, including releases from Lenovo and Nvidia.
In the meantime, Fueyo is now serving as VAD art director/virtual production supervisor with Eyeline Studio — a part of Scanline VFX. Eyeline, says Fueyo, recognizes the potential that virtual production techniques offer and will be making a big announcement in the months to come.
“Our job is to create these sets for the virtual art department,” he says of his latest role. “Eyeline is really set to redefine the way in which productions are made. They’re really set up to reinvent or change the future of filmmaking. That’s very clear. It’s a very, very passionate group of people.”
Virtual production, he notes, can have a number of different meanings, whether it's the use of an LED wall, or being able to work with filmmakers in modifying or creating sets so that when the time comes to shoot, potential hurdles have been eliminated.
“For me, ‘virtual production’…it’s more of this very broad statement as to anything that can be done in the filmmaking process,” he explains. “We can do it in the ‘virtual production process,’ meaning we can do it digitally, right? We can work with digital sets, but those digital sets are not only live at the beginning, where they’re kind of rough, and allowing the creatives to change and do things quickly. They can go all the way to the end to VFX. So to me, virtual production is something that can start at the inception of the project — you’re talking to the director and the production designer and the DP, and figuring out everything — and then go all the way to VFX, where you take that set, that environment, all the way to final pixel. That’s a huge part — the final pixel!”
Fueyo believes that in the very near future, a percentage of shots that used to go to post for visual effects work will be finished in what he calls ‘pre post.’
“Let’s do the post while we’re doing the production,” he suggests. “Let’s have the director be involved in the creation of the shot — and the DP — look at the actors within an environment. Being able to go into the environment in realtime and iterate in realtime, change the lens, the depth of field, the light on the background, do live compositing, do everything that you will otherwise stretch over six months and takes a lot more people. I think, in a way, the vision of the creator gets diluted. Take all of that and just give it back to the creator so that they can do it on the spot.”
Fueyo believes part of the challenge of advancing the concept of virtual production is finding talent that is able to approach it differently from traditional production and post workflows.
“It’s still something that is related to VFX, right? We’re bringing VFX into production, but then we’re using a game engine. So then you start thinking, ‘Can [it] be a VFX artist, because the VFX artist only knows things in post terms? [Can it] be a game artist, because a game artist typically doesn’t have the eye to take things to a final pixel, or to something that looks hyper realistic? They always work within the confines of the game. So you have to always look in the middle…The biggest hurdle that I find on virtual production is being able to make people understand that this is unlike anything else.”
Fueyo’s role as a Knight Innovator in Residence at Florida International University CARTA’s iStar, an immersive studio for altered reality, is meant to help change that. It was at FIU that he initially studied architecture.
“My role is to figure out a way [to] apply all these lessons from virtual production, and bring it into the educational world,” he explains. “The one thing that I’ve been noticing for the past few years is that, a lot of the talent — the lack of talent or finding talent — it has to do with the fact that we might not be looking in the right places? We might be only looking into the VFX world and the production world, when there’s architecture or engineering, communication students, people who, until this point, would have never [been] thought of. They could have a career in film, like I did, for example, [going] from architecture to film…You can start tapping into talent that has a different core of knowledge, and you can start bringing them in. So I think the educational part is a huge part of virtual production.”
ILM makes virtual production a reality
Ian Milham is a virtual production supervisor at ILM in San Francisco. The company has three of its StageCraft virtual production stages in Los Angeles, as well as one in London, and another that’s soon to come online in Vancouver. ILM is also capable of setting up temporary stages for productions on a case-by-case basis.
Milham recently spoke with Post about his role, and how virtual production is changing the way content is being produced.
What is the role of a virtual production supervisor?
“My job as the virtual production supervisor is to be accountable, and to be the key, main point of contact and direction for the whole thing. A VFX supervisor is responsible for the whole show and whatever methodology they use to achieve their results. They’re in charge of the whole show. And then, if we’re using StageCraft, that becomes my subdomain of their domain.
“My job is really concentrating on successful shooting on the day. There’s the virtual art department, where they are developing content and figuring out what we are going to shoot traditionally: what we are going to shoot on the stage, what’s going to really be there, what’s not going to be there, etc. I’ll participate in that to give counsel about how it will work and what we can do. I will be totally involved and there on the day for shooting. And then it’ll get handed off to the traditional post pipeline.”
ILM has several permanent StageCraft stages in place?
“Vancouver is coming online. We’ve got [three] in Los Angeles and one in London. Those are permanent, big volumes. And then we regularly do pop-ups, as a production needs.”
Did you have one in Sydney?
“We did. That was a pop-up. And we very well could put one back up. A lot of it is driven by client need. And in that case, we had a big client there for a bit. And then there was this sort of big bump that everybody hit in terms of COVID. If we have another client, we will put it in there again.”
Can you talk about the enormous screens that are used on the StageCraft stages, such as the one in the production of The Mandalorian? Are those LEDs?
“Yeah, it’s LED, thousands of them. We can make them any shape. For something like a season of The Mandalorian, we might do 60 locations virtually, as a portion of the whole show. So for that, you want something that’s very flexible in its setup. We’ve found that sort of ‘horseshoe’ shape, with a giant ceiling, and then we have got two big fly-out doors that can encompass it and make it a full 360. People tend to think of it as an LED wall. And that’s great, but we found that the ceiling is just as much — if not more — of a contributor to the results.”
Is the screen hard or flexible?
“It’s a hard screen, though that’s not the only way we’ve done this before. Obviously, we’ve developed this kind of technology (for) this way of working. For in-camera visual effects, yes, we would do something that was more of a rear-projection solution. We were doing that as recently as a few years ago. And sometimes, that might still be the way to go. But for these LED volumes, it’s a hard-screen product, not that different than the kind of LEDs or giant screens you might see at a concert or a sporting venue, just much, much more pixel dense.”
Does ILM build their own walls, or are you buying solutions from a manufacturer?
“There are various hardware manufacturers that are making the LEDs themselves, and depending on the situation, we might work with any of them. There (are) all kinds of partners. It’s pretty complicated to pull this off. There are LED manufacturers, who manufacture the individual LED panels, but those are about 18 inches square. And then you can deploy them into a volume, whatever shape you need. Sometimes we do them custom, to complement a film set. Sometimes it’s one of these, where it’s more agnostic. It might be a manufacturing company that does that, or it might be a company, or us, working with a company that is fluent in the sort of engineering necessary to make one of those stand up and keep it safe. You need to be able to do all the tracking to know where all the production cameras (are) and what they’re doing. So we’ll then work with and develop that as well, to make this simultaneously an LED screen volume (and) also a motion-capture volume at the same time.”
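For a sense of scale, here is a back-of-envelope sketch: the panel size matches the roughly 18-inch tiles Milham describes, while the wall dimensions and the 2.8mm pixel pitch are illustrative assumptions, not ILM’s specifications.

```python
import math

PANEL_M = 0.457    # ~18-inch square LED panels, per Milham
PITCH_MM = 2.8     # spacing between pixels (illustrative assumption)

def panel_count(width_m: float, height_m: float) -> int:
    """Number of panels needed to tile a wall of the given size."""
    return math.ceil(width_m / PANEL_M) * math.ceil(height_m / PANEL_M)

def wall_resolution(width_m: float, height_m: float) -> tuple[int, int]:
    """Physical pixel dimensions of the tiled wall."""
    return int(width_m * 1000 / PITCH_MM), int(height_m * 1000 / PITCH_MM)

# A hypothetical 55m-long, 6m-tall curved wall, unrolled flat:
print(panel_count(55, 6))       # 1694 panels
print(wall_resolution(55, 6))   # (19642, 2142) -- roughly '20K' of pixels across
```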
Talk about the connectivity between the screen and the camera. It’s not just a static backdrop. There is motion, based on the camera movement?
“A key thing for this is, production is typically going to be doing a lot of different stuff (in) making their production. StageCraft will be one part of it — sometimes it’s a gigantic part of it, as it might be for one of our Star Wars shows. Other times, it’s to solve a specific problem that they have. Our integration needs to be as light as possible, and as unobtrusive as possible, because [it’s] their camera operators, their DPs and everything. They have a whole lens kit…and they don’t want to change it just for our system. So we have a very light touch when it comes to that integration.
“Actually, all we really work with on the camera side is a small, little optical attachment. It’s a little crown that is mounted on the camera. We can mount it a bunch of different places to be able to use our mocap system to track where the camera is looking. Then we do some very complicated wizardry behind the scenes to make the volume complement the camera, not the other way around. We do a lot of color processing, a lot of perspective work, a lot of things to make our technology complement what they bring, without having to make them change anything.”
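Conceptually, that tracked pose drives a perspective projection from the camera onto the wall. The toy sketch below shows only the geometric core of the idea, finding where, and how wide, the camera’s view lands on a flat wall so that region can be re-rendered from the camera’s exact perspective; the numbers are mocked, and none of this reflects StageCraft’s internals.

```python
import numpy as np

def magic_window(cam_pos, cam_forward, fov_deg, wall_z):
    """Intersect the camera's view cone with a flat wall at z = wall_z,
    returning the window's (x, y) center on the wall and its width."""
    cam_forward = cam_forward / np.linalg.norm(cam_forward)
    dist = (wall_z - cam_pos[2]) / cam_forward[2]   # distance to wall along view ray
    center = cam_pos + cam_forward * dist
    width = 2 * dist * np.tan(np.radians(fov_deg) / 2)
    return center[:2], width

# Mocked tracker output: camera 4m in front of the wall at 1.5m height,
# looking straight at it, with a ~39-degree horizontal field of view.
center, width = magic_window(np.array([0.0, 1.5, -4.0]),
                             np.array([0.0, 0.0, 1.0]),
                             39.0, wall_z=0.0)
print(center, round(width, 2))   # [0.  1.5] 2.83
```

In production, a projection like this is recomputed every frame from the live tracking data, which is what lets the volume complement the camera rather than the other way around.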
When talking about volumes and screen sizes, what kind of resolutions are you working with?
“There’s two separate ideas there: there is the hardware capability, which is enormous, but it’s depending on the volume. Because we build the volumes differently, the answer might be different. In terms of what we could natively display on there? 24K, something like that. We never really do, because there’s not much point in that. So the main deal is we want the wall resolution to exceed the camera resolution. And we dynamically change that all the time. What we do is, we fill the content…with all the stuff that’s designed to provide lighting and reflections. And then, where the camera looks, we do an incredibly high quality sort of magic window. And we could do multiples of those. So we typically run those 8K. We change it as needed for what the camera window is.”
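That rule, that the rendered window should always out-resolve the camera, can be stated as a tiny heuristic. The headroom factor and camera resolution here are hypothetical, purely for illustration:

```python
CAMERA_H_RES = 4096   # hypothetical 4K-wide production camera

def window_render_width(camera_h_res: int, headroom: float = 1.5) -> int:
    """Render the in-camera 'magic window' wider than the camera resolves,
    so the wall is never the limiting factor in the photographed image."""
    return int(camera_h_res * headroom)

print(window_render_width(CAMERA_H_RES))        # 6144
print(window_render_width(CAMERA_H_RES, 2.0))   # 8192, near the 8K Milham cites
```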
What are the advantages of using this type of setup versus a typical greenscreen shoot and traditional post workflow?
“I would say there’s two sides to this. I think everybody can imagine the optical benefits of contributing realistic light and a background at the same time, and having all that right there in the moment, as opposed to something later, or adding it later via greenscreen composite. I think everybody can imagine that pretty easily. But we found that maybe even more impactful from having it there (is) the way that it creates collaboration. The problem with a greenscreen type of workflow is, the DP can’t see anything. It’s a big green sheet back there. So they light it however they’re going to light it. They’re going to do their best, but they’re going to light what’s in front of them. The director knows that we’re going to add stuff back there. We have cool paintings and artwork of what it’s going to be, but they might not see it for weeks later. And it may or may not be exactly what they were thinking of.
“The actors are looking at a tennis ball. They don’t know what’s necessarily going on. That night, in editorial, when they’re (putting) together the dailies, does the scene work? I think so, but it’s kind of hard to tell.
“StageCraft is a big step forward in all of those problems. The lighting is really there. The director can frame up and know what’s really going on. The actors can see the entire environment around them. (For) editorial, who may have never even been on-set themselves, or is looking at the footage, it’s all really there already. So (it’s) the way that helps people work together, as opposed to a more serial process, where the DP then goes on to another movie, and you guys finish it…And then the VFX person, who wasn’t there on the day, wishes they shot it a different way. This really helps people collaborate at every step of the process and even make some of the steps co-mingle a lot more in a way that leads to a really high-quality result.”
For shows like The Mandalorian or The Book of Boba Fett, I’ve been told that virtual production represents as much as 60 percent of the process? You’re still using compositing techniques when necessary?
“Yes, we continue to believe (in doing) what is best for the result. This is not a silver bullet. I wouldn’t advocate doing an entire production this way necessarily. There are aspects where, if it’s easier to just go outside, and you get cool results that way, just go outside. Do it that way. I would say it’s averaged out, over a couple of seasons of Mandalorian, that half to two thirds, let’s call it 60 percent of the show, is shot this way. But, if you just have a quick scene in a small, intimate set that you can build out of wood and paint and everything, just do that, or go outside, just do that. That’s fine. So we use the best solution for each thing.”
Is there a scale where a project is too small to justify building a stage or volume?
“As a small, technical note, the ideas of LED volumes and virtual production are not synonymous. They overlap a lot. We do a lot of stuff, either with simulcam or previs, where we are tracking cameras and shooting virtually. Virtual production just means incorporating things that aren’t there in your production. Sometimes that can be very light, and doesn’t necessarily need a whole LED solution, all the way up to something as ambitious as Mandalorian.
“I would say, in general, for something like Mandalorian, where we built this giant volume, if you’ve just got an intimate space, there’s no need to have this giant deployment for that. So something that’s physically smaller, we’ll do it that way…What’s cool about a permanent volume is that it exists already. So if a production only has one aspect of it that might really benefit from this, or they just need it for a day or two, where a pop up wouldn’t make economic sense for them…they can just roll up for a day.”
Knowing the technology that’s available now, how do you see it getting better in the year or years ahead? Where can the improvements be made?
“Well, I would say, of course, optically, it’s only going to get better looking. I think the biggest deal would be: it’s going to get cheaper, which is great. More people can use it. Right now, the content — the ‘world’ — is very bespoke. I think that’s going to get more developable, where there’s going to be more of a digital backlot, and where all the things that we are displaying on the screens are going to get better. And the tools are going to get more democratized, easier to use and more elegant. We’re still at the stage where this is a really difficult magic trick, and our batting average is pretty great, but it [has] been hard earned. And there’s been a lot of experience to it.
“The tools, of course, are going to get a lot better. But I think what you’re really going to see is the experience of people. People are going to get so many more at bats. Right now, [there are] very few people with the technical experience and the sort of pedigree and batting average to really pull this off. You’re going to see that grow over time. Just as much as the technology advances, you’re also going to see the people advance.”