TECHNOLOGY: MOTION PIXELS
HOLLYWOOD — The way live images are captured today, things go digital just about as soon as light hits the lens. Post proved in a
comprehensive survey years ago — before Red, before HD palmcorders —
that everybody loves a good camera. Including the everybodies who work
primarily in post.
My first question when I saw my first 4K camera was, “What are you
going to do with all those pixels?” I recently asked David Stump, ASC,
a question along those lines. Stump is a veteran feature film
cinematographer and a VFX supervisor — he’s also a director of visual
effects photography. Within the ASC, Stump co-chairs the metadata sub-committee and chairs the camera sub-committee, both of which sit under the ASC’s technology committee.
There’s a “huge distinction between 4K files and 4K pictures that no one has made effectively anywhere yet,” says Stump. By definition, a 4K picture means that every pixel across the 4096-pixel-wide raster has been acquired with co-sited RGB information: each pixel site carries its own red, green and blue values. For instance, the
Red camera’s captured 4K files arrive at co-sited RGB info by way of a
de-Bayer algorithm, which shares color info from adjacent pixels to
create individual, co-sited RGB pixels.
“On a mono-planar Bayer-array sensor like the Red,” Stump says,
“individual pixels are single-color pixels — either red, blue or one of
two greens.” These pixels don’t pick up their RGB color values natively
at acquisition — they’re interpolated. The Bayer Pattern Sensor was
developed 30 years ago by Eastman Kodak, which licenses it. “To
de-Bayer that information from the raw Bayer files means that you have
to share information from adjacent pixels to create single RGB pixels
throughout the picture array.”
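To make that sharing concrete, here is a minimal sketch of a bilinear de-Bayer in Python. It illustrates the general technique Stump describes, not Red’s proprietary algorithm; the RGGB site layout and the simple neighbor averaging are assumptions for the example.

    import numpy as np
    from scipy.ndimage import convolve

    def debayer_bilinear(mosaic):
        """Bilinear de-Bayer of a single-plane RGGB mosaic (H x W floats).

        Each photosite holds one color sample; the missing two samples at
        every site are filled in by averaging whichever neighbors carry
        that color, yielding co-sited RGB at every pixel.
        """
        h, w = mosaic.shape
        rows, cols = np.mgrid[0:h, 0:w]
        masks = {
            "R": (rows % 2 == 0) & (cols % 2 == 0),   # red sites
            "G": (rows % 2) != (cols % 2),            # the two green sites
            "B": (rows % 2 == 1) & (cols % 2 == 1),   # blue sites
        }
        kernel = np.ones((3, 3))  # average over the 3x3 neighborhood
        planes = []
        for c in ("R", "G", "B"):
            samples = np.where(masks[c], mosaic, 0.0)
            # Normalized convolution: neighbor sum / neighbor count, so
            # every pixel gets an estimate of this channel while sites
            # that already have the color keep their original value.
            total = convolve(samples, kernel, mode="mirror")
            count = convolve(masks[c].astype(float), kernel, mode="mirror")
            planes.append(total / count)
        return np.stack(planes, axis=-1)  # H x W x 3, co-sited RGB

    # Tiny usage example on a synthetic 4x4 mosaic:
    raw = np.arange(16, dtype=float).reshape(4, 4)
    rgb = debayer_bilinear(raw)
    print(rgb.shape)  # (4, 4, 3): every site now has full RGB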
DEBATING DE-BAYERING
“There is a huge discussion about what MTF, modulation transfer
function, [Red’s] pictures achieve by virtue of sampling four pixels to
create color information for any one 4K pixel.” Stump and others want to discern how much resolution remains when color information is shared among four adjacent pixels. “Many will argue that you divide the
original resolution in half, based on combining the color information,
so what you’d end up with are 2K or 2.5K images from a 4K sensor. And
that is a hotly debated topic.”
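The arithmetic behind the halving argument is easy to state even though its conclusion is contested. A back-of-the-envelope sketch, assuming the worst case where each full-RGB output pixel effectively draws on a 2x2 cell of single-color photosites (the 4096x2304 sensor size is a nominal figure for the example):

    # Back-of-the-envelope version of the "divide it in half" argument:
    # if full RGB effectively comes from 2x2 cells of single-color
    # photosites, independent color resolution drops by 2 on each axis.
    sensor_width, sensor_height = 4096, 2304   # nominal 4K Bayer sensor
    cell = 2                                   # 2x2 RGGB sampling cell
    effective_w = sensor_width // cell
    effective_h = sensor_height // cell
    print(f"{effective_w} x {effective_h}")    # 2048 x 1152, i.e. "2K"

In practice, the two green sites per cell pull effective luminance resolution above this floor, which is part of why the topic stays so hotly debated.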
Stump adds, “I’m not advocating for one position or the other. I think
the pictures have to speak for themselves. As yet, no one has done any
real, relevant testing of the Red system and the community is begging
for that.” It looks like Dave Stump, given his stature in the
filmmaking community and his position in the ASC, will be among those
impartial experts who get under the hood of the Red camera for tests.
Stump stresses that none of the talk about Bayer Patterns and pixels,
regarding the Red camera or other new models, necessarily reflects his
own opinions.
The industry is justifiably excited about the promise of the Red
camera, but there are more 4K-and-higher cameras on the horizon — some
employing a form of the Bayer Pattern Sensor — that Stump is very
interested in. He mentions Dalsa’s Origin and admires its 16-bit
uncompressed output. The 4K, 2/3-inch, four-CCD Olympus Octavision,
mainly shown in Japan, uses a modified Bayer Pattern. JVC is also
testing its own 4K entry — which may link up well with the 4K JVC
projector. Then there’s NHK’s 8K camera. 8K? Is that like building a
musical instrument whose high notes only dogs can hear? The more you
can sample digitally, Stump says, the more “you bring back the beauty,
the warmth and the accuracy [of the] analog world.”
THERE WILL BE TESTS
There are no formally announced plans for an all-out impartial camera shoot-out, but one is coming; it’s more a question of when than if. Stump
believes the testing process would need to be highly rigorous, highly
accurate, above reproach and without prejudice. “If someone tests
cameras and comes up with the wrong result, then the veracity and
integrity of the tester is the first thing that falls into
question.”
Curtis Clark, ASC, is chair of the ASC’s technology committee,
overseeing Dave Stump’s two related sub-committees. Clark is a veteran
cinematographer with experience shooting feature films, television,
commercials and documentaries both here and in the UK since the early
’70s, so he’s seen the whole digital changeover take place.
Clark is working to produce the formal, impartial testing and evaluation series for high-end digital cameras that Stump is looking forward to. When the time comes, the plans involve representatives from both the Producers Guild and the Art Directors Guild. Both guilds have
committees formed to look at new workflow and technology issues. “It’s
a significant coalition,” says Clark. “We are in the early stages of
laying out the procedures, the selection of the cameras, the post
workflow implementation.”
But this camera-testing series will not be an abstract research
project. It’s about production-ready, film-emulating digital cameras
and their implementation in the real world of film production workflow.
The question will be, “Are these cameras ready to serve the creative
and budgetary requirements of feature film, television or commercial
production?”
The ASC testing body will use 1920x1080 as a baseline and examine
cameras that go way beyond that. Who might bring what camera to such a
demo/evaluation was still being worked out at press time.
Clark, like Stump, has some cameras he finds of interest: Panavision’s Genesis, Arri’s D20, Thomson/Grass Valley’s Viper and Sony’s F23 are on his short list. Sony’s F23 (1920x1080 4:4:4 RGB recording) docks to the
SRW1 and can work in extended dynamic range mode — that is, beyond the
limitations of Rec.709 (the ITU-R standard for HD broadcast).
Clark says the F23 is a good example of a camera that’s gone beyond its
initial purpose as a camera for HD video or television broadcast — it’s
now being used for digital motion picture shooting. While the camera is still 1920x1080, “they have moved beyond the limitations of HD and you are
able to take advantage of tonal characteristics and reproduction that
more closely emulate what one expects from film.” Sony’s F23 can also
record output to solid state memory or disk drive. Previously, “it was
not even thinkable to have that kind of flexibility.”
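For context on what moving beyond Rec.709’s limitations buys, the standard’s transfer function only defines code values for scene light between 0 and 1, so anything brighter than reference white must be clipped. A minimal sketch of the Rec.709 OETF makes the ceiling visible:

    def rec709_oetf(L: float) -> float:
        """ITU-R BT.709 opto-electronic transfer function.

        L is scene-linear light, defined on [0, 1]; values above 1
        (specular highlights, bright skies) must be clipped first,
        which is exactly the headroom problem extended-range modes
        and log/film-style encodings are designed to avoid.
        """
        L = min(max(L, 0.0), 1.0)          # no highlight headroom
        if L < 0.018:
            return 4.5 * L                 # linear toe for near-black
        return 1.099 * L ** 0.45 - 0.099   # power-law segment

    for stop in (0.18, 0.5, 1.0, 2.0, 4.0):
        print(f"scene light {stop:4.2f} -> code {rec709_oetf(stop):.3f}")
    # 2.0 and 4.0 map to the same code value as 1.0: detail above
    # reference white is simply gone.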
Clark also salutes the Thomson Viper for pioneering dual modes:
traditional HD as well as FilmStream, which emulates a Cineon file such
as one you’d create by scanning film.
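What “emulates a Cineon file” means in practice: Cineon is a 10-bit log format in which each code value steps printing density by a fixed amount, leaving room above reference white. A sketch using the conventional constants (685 white point, 95 black point, 0.002 density per code value, 0.6 negative gamma; individual pipelines may differ):

    # Conventional Cineon 10-bit log constants (pipelines vary).
    WHITE_CV = 685        # code value for reference white
    BLACK_CV = 95         # code value for film-base black
    DENSITY_PER_CV = 0.002
    NEG_GAMMA = 0.6

    def cineon_to_linear(cv: int) -> float:
        """Map a 10-bit Cineon code value to scene-linear light,
        normalized so WHITE_CV -> 1.0 and BLACK_CV -> 0.0."""
        def gain(code):
            return 10 ** ((code - WHITE_CV) * DENSITY_PER_CV / NEG_GAMMA)
        black = gain(BLACK_CV)
        return (gain(cv) - black) / (1.0 - black)

    for cv in (95, 445, 685, 1023):
        print(f"code {cv:4d} -> linear {cineon_to_linear(cv):7.3f}")
    # Code values above 685 decode to linear values above 1.0: unlike
    # Rec.709 video, the log file keeps its highlight detail.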
When you add in the three-chip vs. single-chip issue, spatial
resolution, pixel count, CCD or CMOS, the variables in the high-end
digital camera world seem innumerable. But it looks like we have some
very smart guys from the world of cinematography ready to start to sort
it out.
METADATA FROM THE LENS
Cooke Optics’ /i dataLink mounts on cameras and records focus, zoom and
iris settings, allowing the information to be passed along to post
production on an SD card as metadata. “It is a way to record the data
as you’re recording the images,” Stump says. The whole notion of
metadata is to attach the data about the images to the images
themselves, and that’s where digital advances now begin — right at the
glass.
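It is easy to picture the payload. Here is a hypothetical sketch of a per-frame lens metadata record of the sort /i passes along to post; the field names and JSON structure are illustrative assumptions, not Cooke’s actual format.

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class LensFrameRecord:
        """Hypothetical per-frame lens metadata record, illustrating
        the kind of data the /i system reports (not Cooke's format)."""
        frame: int
        lens_serial: str        # identifies which physical lens is on
        focal_length_mm: float
        focus_distance_m: float
        t_stop: float

    # One record per frame, written alongside the images:
    records = [
        LensFrameRecord(frame=1, lens_serial="CK-20-0042",
                        focal_length_mm=20.0, focus_distance_m=2.4,
                        t_stop=2.8),
        LensFrameRecord(frame=2, lens_serial="CK-20-0042",
                        focal_length_mm=20.0, focus_distance_m=2.6,
                        t_stop=2.8),
    ]
    print(json.dumps([asdict(r) for r in records], indent=2))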
This is very useful for integrating VFX into live action. “There are a
myriad of data that go into those images,” says Stump, who was on-set
VFX supervisor on Stuart Little. “The biggest part of the exercise in
that movie was to believably create an animated mouse that could
inhabit the real, physical world with the rest of the actors.” Stump’s
team mapped the distortion characteristics and depth-of-field
characteristics of every lens they were going to use. “In order to
place a CG object like Stuart Little into the field of view captured by
a [wide-angle] lens,” Stump says, “you have to know what the cuts in
focal length are. Because, as Stuart Little walks across the frame, if
you don’t obey the physics of that lens, he’s going to appear to float.
Knowing something about those lenses was critical to the success of
that motion picture.”
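A concrete way to see why an unmodeled lens makes a CG character appear to float: points rendered with an ideal pinhole camera have to be bent through the same distortion the real glass applied to the plate. Here is a minimal sketch using a one-term radial (Brown-Conrady k1) model; the lens maps Stump’s team charted would be far richer:

    def distort(x: float, y: float, k1: float) -> tuple[float, float]:
        """Apply one-term radial distortion to a normalized image point.

        (x, y) are normalized coordinates with (0, 0) at the optical
        center; k1 < 0 bows straight lines outward (barrel distortion),
        as a wide-angle lens does to the live-action plate.
        """
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2
        return x * scale, y * scale

    # A CG point rendered with an ideal pinhole camera must be bent to
    # match the plate; otherwise it drifts against the background as it
    # moves toward the distorted edge of frame.
    k1 = -0.12  # illustrative value; real lenses are measured per setup
    for x in (0.0, 0.3, 0.6, 0.9):
        dx, dy = distort(x, 0.0, k1)
        print(f"ideal x={x:.2f} -> plate x={dx:.3f} (shift {dx - x:+.3f})")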
On Stuart Little, five crew members had to record by hand what lens
each shot was done on and where the focus lay. They had charts from
each lens in use and could go back and interpret the distortion for
every lens in the film. That’s where Cooke’s /i technology — which Stump says creates “automated metadata” — comes in handy today. “When the 20mm lens goes onto the front of the camera, the Cooke /i system reads the lens and reports to the camera ‘here’s what lens I have and here’s where it’s currently focused’ for every frame of the shot.”
Cooke’s /i “Intelligent” technology is a “brave first step” toward
eliminating “the enormous possibility for error” and making five or
more people more efficient and creative. “What Les [Zellan of Cooke
Optics] has done in designing and building the /i lenses is push us
immensely closer to the tipping point in the realm of metadata. And
he’s done it in the spirit of open source!”
Stump recently shot with Viper on Killer Pad, a feature directed by
Robert (Freddy Krueger) Englund. “I attempted to create a metadata
stream of color look-management on-set that would apply to post
production and color correction of the finished movie.” But he cautions
early adopters: “Before you abandon your old tool box, remember to fill
the new one with all the same tools. If we’re going to reinvent the
future, it should be at least as good as the present — if not better.”
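For a sense of what such a look-management stream can carry, on-set color decisions today commonly travel as an ASC CDL, the interchange format developed under the ASC technology committee: a slope, offset and power per channel, plus saturation. A minimal sketch of the per-channel transfer follows; the article does not specify the exact pipeline Stump built on Killer Pad, so the numbers here are purely illustrative.

    def asc_cdl_channel(x: float, slope: float, offset: float,
                        power: float) -> float:
        """ASC CDL per-channel transfer: out = (in * slope + offset) ** power.
        The intermediate is clamped at zero so the power term stays real."""
        v = max(x * slope + offset, 0.0)
        return v ** power

    # Illustrative grade (made-up numbers): warm the image slightly by
    # raising red slope and gently adjusting gamma on blue.
    slope = (1.10, 1.00, 0.95)
    offset = (0.02, 0.00, -0.01)
    power = (1.00, 1.00, 1.05)
    pixel = (0.40, 0.42, 0.45)
    graded = tuple(
        asc_cdl_channel(c, s, o, p)
        for c, s, o, p in zip(pixel, slope, offset, power)
    )
    print(graded)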