While the broadcast industry is still debating how to bring the stereo 3D experience to home viewers — although ESPN and n3D both offer 3D channels — the film studios, and even the videogame industry, appear to have a jump on things. Just about every week, a new 3D film seems to be opening at the box office, so those looking for an immersive experience need look only as far as their local cinema.
Rio and Hoodwinked Too were just two films playing in stereo 3D at press time. And scheduled for release this month — in stereo — is the fourth installment in the Pirates of the Caribbean series, On Stranger Tides. Nintendo’s 3DS handheld gaming system also provides a stereo 3D experience, albeit on a much smaller scale than cinemas.
Post spoke with several pros this month to get their take on the S3D business. The folks at Legend3D see 2D-to-3D conversion as a way to cost-effectively create a high-quality 3D experience for new material or catalog assets. Cinesite is an accomplished VFX facility that just entered the S3D business by working on and completing its first 3D feature, the above-mentioned Pirates. And Assimilate is a technology developer that is helping to streamline S3D post with tools that understand stereo on file, image and management levels.
LEGEND3D
Industry veteran Rob Hummel joined San Diego’s Legend3D (www.legend3d.com) six months ago with the goal of increasing conversion business from outside studios, as well as helping to articulate the company’s vision for stereo 3D and stereo 3D conversion.
Based in Los Angeles, Hummel brought with him a deep knowledge of stereo 3D dating back to its early days. He worked on the Disney theme park attractions Captain EO, featuring Michael Jackson, and Muppet-Vision 3D, and has also worked in visual effects and on high-profile restoration projects. Prior to joining Legend3D as president, Hummel was with global post company Prime Focus, which also offers stereo conversion services.
His high standards match those of Legend3D’s principals, and together they hope to move the business forward on a qualitative level without compromise.
“They really believe that if we don’t do it right, we end up damaging the overall 3D business in general,” says Hummel of the company’s vision. “We actually turn down work when people offer us a lot of money but there isn’t enough time to do a quality piece of work. We regretfully say, ‘No.’”
Hummel describes stereo 3D as an illusion. “It’s because of that illusion that we can do this conversion that looks just like things were shot with the camera. But, if you compromise that process, suddenly everyone thinks that conversion looks like crap.”
Legend3D has provided stereo 3D conversion for a number of features, including Alice in Wonderland and The Green Hornet. The studio also worked on the opening credits and interstitials for the Super Bowl episode of Chuck, one of the first TV shows to air in stereo 3D.
At press time, Legend3D was about to open a new Burbank facility that would bring the studio’s services closer to the Hollywood community. The location will serve as home to as many as 50 artists, who will handle final comp work. The site will have a review theater and be connected to the larger San Diego facility via a high-bandwidth pipe.
Hummel says he spends much of his time meeting with studios to discuss strategies for new and catalog content. While technology for shooting productions in 3D is becoming more available, Hummel believes the cost-effective and high-quality results that can be achieved through a conversion process may lead more productions to shoot their 3D projects in 2D.
“Shooting in 3D has its problems. If somebody really likes the look of 35mm anamorphic film, you can shoot that way and it will look like there is no compromise whatsoever,” he says, referring to the stereo conversion process. “It can be a choice of the filmmaker. It’s a real choice.”
Legend3D’s expertise in conversion comes in part from the proprietary colorization software developed by company founder and COO/CTO Barry Sandrew, Ph.D. The same edge-detection technology used for colorizing is now used for rotoscoping, speeding up the process considerably. A scene that may take other facilities two weeks to convert can be realized in just two days using the proprietary technology, notes Hummel.
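Legend3D’s software is proprietary, but the general idea of edge-assisted rotoscoping can be sketched with off-the-shelf tools. The hypothetical Python snippet below (using OpenCV, and in no way Legend3D’s actual method) detects strong edges and promotes closed regions to candidate mattes, so an artist starts from a machine-generated mask instead of drawing one from scratch.

```python
# Hypothetical sketch of edge-assisted matte generation (OpenCV 4).
import cv2
import numpy as np

def candidate_mattes(frame_bgr, low=50, high=150, min_area=500):
    """Return filled contour masks derived from Canny edges."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)
    # Close small gaps so edge fragments form closed regions.
    kernel = np.ones((5, 5), np.uint8)
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    masks = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:  # ignore tiny regions
            m = np.zeros(gray.shape, np.uint8)
            cv2.drawContours(m, [c], -1, 255, thickness=cv2.FILLED)
            masks.append(m)
    return masks
```

In a real conversion pipeline, masks like these would only be a first pass; artists would still refine edges and assign depth to each object.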
CINESITE
London’s Cinesite (www.cinesite.com) recently wrapped up work on Disney’s new Pirates of the Caribbean film, On Stranger Tides. The film marks the first stereo 3D feature Cinesite has worked on, and according to head of VFX technology Michele Sciolette, getting a workflow in place was key to making things run smoothly.
Cinesite handled VFX for more than 300 shots in the film, most of which involved set extensions, though several featured a CG frog the studio created. The film was shot using dual-camera rigs outfitted with Red One, and later Red Epic, cameras. “It has really refined our workflow,” says Sciolette of the film, “and it has been very different than anything we have done before in that sense.”
On Stranger Tides tasked Cinesite with developing proprietary tools to address color differences between the two cameras. “In particular, we’ve been working on a set of tools to do automatic color correction between the two cameras,” Sciolette explains. “The beamsplitter on the camera rig will always give you a pair of images that [don’t match] 100 percent in terms of colors. We’ve developed our own solution that we’ve found extremely effective to deal with that particular problem.”
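Cinesite’s solution is its own, but the first-order idea of matching one eye’s color to the other can be sketched simply. The snippet below is a minimal, hypothetical illustration (plain NumPy, not Cinesite’s tool): it scales and offsets each channel of the right-eye plate so its statistics match the left eye, a common starting point for neutralizing a beamsplitter’s color shift.

```python
# Minimal sketch: first-order color matching between stereo plates.
import numpy as np

def match_plate_colors(left, right, eps=1e-6):
    """Match each channel of `right` to the mean/std of `left`.
    Both are float arrays of shape (H, W, 3)."""
    out = np.empty_like(right)
    for c in range(3):
        l_mu, l_sd = left[..., c].mean(), left[..., c].std()
        r_mu, r_sd = right[..., c].mean(), right[..., c].std()
        # Normalize the right channel, then re-express it in the
        # left channel's statistics.
        out[..., c] = (right[..., c] - r_mu) * (l_sd / (r_sd + eps)) + l_mu
    return out
```

A production tool would go further, working regionally rather than globally, but the goal is the same: make the two plates read as a single grade.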
Cinesite also used The Foundry’s Ocula, a set of Nuke plug-ins designed to solve common problems that occur when shooting stereoscopic imagery.
“We have used Ocula for a few shots to deal with other problems, like alignment corrections of the plates,” Sciolette explains. “We have not been using Ocula for any color correction. Our own solution… is fully automated. Even though Ocula is somewhat automated, it still requires some manual input and manual work to set it up, while the one we have developed here is entirely automated and has been extremely successful on that show.”
The studio’s tools are software-based and also run within the Nuke application. “Initially, we were thinking about using it on a few shots, but we ended up using it on all our shots in the end,” he recalls. “It was [used] for color matching the two plates.”
Cinesite did not work with the RAW R3D files from the Red cameras. Instead, they were provided with files that had been converted to EXR and DPX file formats. “We were provided both formats,” Sciolette notes. “I think it was the company that was providing the camera rig that was doing the conversion to DPX. It was already a converted frame sequence. It was no different than any other film production. We didn’t have to deal with any RAW file conversion. On any film project, we get our shots as sequences of DPX frames, occasionally EXR, and it was the same here.”
Because Pirates was a stereoscopic production, the studio was given left and right views for each shot. Since the film represented the studio’s first S3D job, a pipeline had to be put in place for viewing and review. “There was a significant amount of work dedicated to hardware — getting set up,” he recalls. “The first issue [was] to find a way to review our shots in stereo.”
Cinesite already had a screening theater, with a Barco projector, but it was not set up for stereo viewing. The solution was to install a Dolby system like those found in commercial theaters.
Smaller-scale viewing also presented a challenge. “We set up different types of reviewing stations,” says Sciolette. “We have a polarized system that we use in our daily review stations. And then we have a few 46-inch LCD televisions, with interlaced polarized glasses, that we use on the artists’ floor.”
Many of the artists also had 24-inch interlaced monitors and passive glasses for more informal viewing. “We wanted something that we would use for a few features, and I have to say, in general, it has been extremely successful. There are a few areas that we might want to improve on, in particular in the area of correcting for alignment between the cameras,” he says.
“For future features we will probably introduce that part of the process at an earlier stage than we did in this case. But that is kind of a minor detail. Everything has been working extremely well considering all the unknowns that we had in the beginning.”
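The interlaced monitors and TVs Cinesite describes work by devoting alternating scanlines to each eye, with the panel polarizing odd and even rows oppositely so passive glasses can separate the views. As a rough sketch of how such a display is fed (simple NumPy, an illustration rather than Cinesite’s pipeline):

```python
# Sketch: build a row-interleaved frame for a passive 3D display.
import numpy as np

def row_interleave(left, right):
    """Combine two (H, W, 3) views into one frame: even rows carry
    the left eye, odd rows the right eye."""
    assert left.shape == right.shape
    frame = left.copy()
    frame[1::2] = right[1::2]  # odd scanlines come from the right eye
    return frame
```

The trade-off is halved vertical resolution per eye, which is why such displays suit artist desks and informal review rather than final sign-off in the theater.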
Sciolette says having the team of supervisors learn about stereoscopy ahead of time also helped. “On top of all the infrastructure, there definitely was a component [of] making sure the knowledge was here. And there’s no better way to get good at it than actually doing shots.”
They’ll get their chance on the next job, the stereoscopic feature film John Carter of Mars.
A DIFFERENT PERSPECTIVE
Jeff Edson is the CEO of Assimilate (www.assimilateinc.com), a company that makes dailies, versioning, conform, color and finishing tools — specifically Scratch and the new Scratch Lab on-set dailies workflow tool. Assimilate announced support for Red’s Epic 5K camera at NAB, and the company sees the Red cameras, as well as releases from Arri and Canon, among others, as instrumental in the future production of stereo 3D films. As mentioned in our Cinesite section, Epic was used on the latest Pirates movie.
“[Pirates] was the first major motion picture shot in Red 3D,” notes Edson. “All of the first passes were done with Red One MXs, and they did all the pick-up shots with Red Epic stereo cameras.”
Company 3 used Assimilate’s Scratch to process 5K S3D footage shot with the Epic cameras. “3D is a core part of the product,” says Edson.
Based on what Edson has heard from pros working in stereo, he sees two ways of thinking when it comes to this particular type of production. “One is, if you can shoot 3D at the same price and same schedule as shooting 2D, then everybody will do 3D,” he notes. “For people like 3ality, that’s their holy grail mission. They have proven that.”
Edson is referring to the use of 3ality Digital’s (www.3alitydigital.com) technology to produce shows like NBC’s Chuck, which was shot in S3D and posted in the same timeframe as a typical 2D episode for the series.
The other line of thinking, Edson says, looks at 2D-to-3D conversion as a solution for creating S3D content. “I don’t know if there is a right or wrong position in today’s world,” says Edson of the two ways of thinking, “as long as the end product is a good product.”
Last year, NAB had a very heavy S3D theme, with many manufacturers showing their support for the format with new technology. Edson says this year’s show had a different feel. “If there was one, I think it was mobile,” he says of the trend he spotted at NAB 2011. “Everyone already has 3D tools for doing what they do. To me, it’s sort of like 3D is a given.”
But that mobile trend can influence the stereo 3D market. Edson points to Nintendo’s 3DS, a handheld gaming device that offers a stereo 3D experience without the use of glasses. For around $250, consumers can get their feet wet with a 3D device that even allows them to adjust the level of the S3D effect. The younger generation (Edson points to his own pre-teen kids) likes the portability of such devices and is content watching videos on a small screen.
And while the kids may not be averse to wearing 3D glasses, solutions like the 3DS’s lenticular filter, which creates a 3D experience without glasses, may push manufacturers to bring glasses-free viewing to larger screens, where it is still cost prohibitive today.
Edson also feels that the market for S3D tools that “fix it in post” will decrease as pros become more experienced in shooting it. “If you are shooting digitally today and get corrupted frames, nobody is going to sit there and say, ‘We’ll just fix it in post.’ Corrupted frames are corrupted frames. It’s a bad shot. You are either going to re-shoot it or work around it. You are not going to fix those frames in post. The same thing is fundamentally true in 3D. If you get bad 3D shots or don’t get the focal point that you want, you are not going to go in and do a lot of weird reconfiguring to make it work.”
He continues, “Our take on it is that more and more is going to be done at the acquisition stage. If I have a dual-camera rig, I will have a stereo processor fixing everything as it comes out the back end of the shot. If it’s not ‘fixable,’ they will get that instant feedback right there and fix the shot. So, the amount of fixing bad stereo coming down the pike in post? My take on it [is that] six to nine months from now, it’s going to go away.”
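The instant feedback Edson envisions can be approximated with a simple rig-alignment check: a properly aligned stereo pair should show almost no vertical disparity between the eyes. The sketch below is a hypothetical illustration using OpenCV feature matching, not any shipping stereo processor; it reports the median vertical offset so a take can be flagged on set.

```python
# Hypothetical on-set stereo QC: measure vertical disparity.
import cv2
import numpy as np

def vertical_disparity_px(left_gray, right_gray, max_matches=200):
    """Median vertical offset (pixels) between matched features,
    or None if no matches are found."""
    orb = cv2.ORB_create()
    kl, dl = orb.detectAndCompute(left_gray, None)
    kr, dr = orb.detectAndCompute(right_gray, None)
    if dl is None or dr is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(dl, dr),
                     key=lambda m: m.distance)[:max_matches]
    dy = [abs(kl[m.queryIdx].pt[1] - kr[m.trainIdx].pt[1])
          for m in matches]
    return float(np.median(dy)) if dy else None

# A crew might flag any take whose median offset exceeds a couple
# of pixels and adjust the rig before moving on.
```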