The Hobbit Editor Jabez Olssen had the unenviable task of sorting through the
50,000 clips that were shot for the new trilogy. Add to that the demand for
48fps stereo and you’ve got an unexpected journey all of your own
It was promoted as the biggest test for digital cinematography
since George Lucas vowed never to use film again. The Hobbit:
An Unexpected Journey was not only one of the first films to use RED
Digital Cinema's EPIC cameras in 3D but also to shoot at double speed,
48fps. But how practical was that for the crew, and how did all the data get
organised into something that was editable for so many different releases?
Editor Jabez Olssen is New Zealand born and one of the crew from
The Lord Of The Rings trilogy now working on this new prequel trilogy.
He worked as an assistant editor on The Fellowship Of The Ring and The Two
Towers, somehow missing out on The Return Of The King. But his career is
marked by other Peter Jackson films too, including editor of The Lovely Bones
and additional editor on King Kong. He is also editor of the next two Hobbit films.
So Jabez is used to the way Jackson works, but was he ready for this job,
with its 266 shooting days for the main unit, 195 days
for a second unit and 11 days of pick-ups?
Bilbo Baggins checks the edit list for that day's work
Jabez explains the basics of the shooting: “48fps
was shot on set for each eye, as it was a 3D shoot.
So we were effectively generating 96fps of data.
After all the shooting days we had over 50,000 clips,
25,000 per eye (but each clip might contain several
‘normal’ takes, as often the cameras were not cut
between takes). 2,200 hours of footage were shot, 1,100
hours x 2 eyes (the equivalent of 24 million feet of film)! The largest
shoot day was 11 hours of footage per eye. These
figures are for all three movies combined.”
The offline edit was all done on Avid Media
Composers. By the end of post on the first movie
they had 13 systems, 12 PCs and one Mac. Nine of
these had Avid Nitris DX hardware inside. But how
was the workflow organised for such a large amount
of data?
“For the offline edit we worked in 2D at 24fps with
Avid DNxHD36 media. The Avid media is stored on
an Avid ISIS 7000 with 116TB of mirrored storage
giving 58TB of active space; this contains all the
footage so far captured for the three movies. All clips
were logged, with notes and metadata added
by assistant editors, and filed within the Avid Projects
structure, once ‘By Day’ and once ‘By Scene’.
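As a rough sanity check on those numbers, assuming DNxHD36's nominal rate of roughly 36 Mbit/s for 1080p24 (an assumption about the codec, not a figure from the production), the 2,200 hours quoted earlier fits comfortably inside that 58TB of active space:

```python
# Back-of-envelope check (assumption: DNxHD36 runs at roughly 36 Mbit/s;
# the 2,200 hours and 58TB figures come from the article).
MBIT_PER_SEC = 36
HOURS = 2200

terabytes = HOURS * 3600 * MBIT_PER_SEC / 8 / 1e6  # Mbit -> MB -> TB
print(round(terabytes, 1))  # about 35.6 TB, well inside 58TB of active space
```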
“Scene Rolls (sequences per scene, with footage
as it was shot) were delivered to the cutting room for
guidance on splitting and selecting, then round-tripped
through assistant editors until everything was organised
as required. All clips are also logged with all their
metadata inside a Codebook database for any non-Avid
metadata requirements.
“An off-site live media backup was kept up to date
on an Apace vStor system that could be switched
over to immediately in the case of failure. Project
backups were done daily with internally created
backup software.
“In terms of the ‘camera neg’, the 5K Red ‘r3d’ files
came from set. After being checked by a team working on
set, they were duplicated and sent over fibre to Park
Road Post, who backed it all up to LTO tapes. They
then did a ‘one light’ telecine-type grade pass on
their SGO Mistika systems, screened rushes daily
and converted all the 48fps 3D material into the
24fps offline Avid media (DNxHD36) for us in the
Picture Editorial Dept. Converting 48fps to 24fps
was done simply by discarding every other frame,
a process called ‘disentangling’ (which is also what
it is called when you extract a 2D video file from a
3D one where the image for both eyes is recorded
one after the other in the same stream, because it is
effectively the same thing).”
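That "discard every other frame" step is simple enough to sketch in a few lines (a minimal illustration only; the actual conversion was done on Park Road Post's Mistika systems):

```python
# Minimal sketch of the 48fps-to-24fps 'disentangling' described above:
# keep every other frame, halving the frame count while keeping the duration.
def to_24fps(frames_48):
    """Drop every other frame of a 48fps sequence, yielding 24fps."""
    return frames_48[::2]

clip = list(range(8))    # stand-in for eight frames of 48fps footage
print(to_24fps(clip))    # [0, 2, 4, 6]
```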
Life Of Pi - 3D Shoot And Edit
Capturing The Golden Light - Claudio Miranda
The production shot with six Alexa cameras
paired on three Cameron Pace Fusion rigs with
ARRI / Zeiss Master Primes. The uncompressed
HD data was recorded to Codex recorders. The
result is a remarkable cinematic accomplishment that enhances the viewing experience and
immerses audiences in a sea of breathtaking
imagery. Here DP Claudio Miranda talks about
materialising images and emotions on a grand
scale for Life Of Pi.
“Life Of Pi is naturalistic and appropriate for
the time that we were trying to shoot. The look
has a kind of a golden hour, magical feel, which
reflects on the story itself. There’s a great, soft
feeling to it. It wants to draw you in. At times
there are more realistic environments. You feel
like you are taking this journey with the main
character, Pi. You feel like you are with a boy
and a tiger.
“I did some early tests with other cameras.
We needed strong, controlled highlights.
Normally, sunlight reflecting on water is
a pretty big digital issue. We shot off the
Venice Beach pier with the camera very low
to the water. The Alexa was the only camera
that didn’t feel electronic in the highlights.
That’s pretty critical to the story, with all of
the highlights going out of control in the
reflections and with characters really close to
the water. This was really important to get a
handle on. Choosing the Alexa was a landslide
decision. It was obvious very early on that it was
our camera.
“I’m a really big fan of low light and being
able to get as much light out of practicals as
possible. The low light sensitivity of these
cameras is pretty amazing. We wanted to
light a whole pool and the art department
brought in about 120,000 candles for this
night scene. It was shot mostly with available
lighting - not completely, but it could
have been. We had a fraction of lighting in the
background, in the trees to give a little depth.
It was low light and these cameras are so sensitive nowadays, you can capture this kind of
scene with just candles. We shot everything at
800 ASA. The candlelit scene is probably my
favourite in the whole film. To be able to retain
those highlights and light people's faces, that's
a pretty impressive camera, to be able to control
that. It was a really fantastic scene to have
everyone pitch in and create this magical event
in India. It was stunning.”
Building The 3D Edit - Editor Tim Squyres
“We shot in Taiwan and in India and we had
a digital lab that went with production. We
had the lab set up in Taiwan at an abandoned
airport where we were shooting, mostly in the
terminal building. Also that’s where we built
our wave tank and we were 50 steps away from
it. I travelled back and forth, part of my time in
Taiwan and part in New York, probably a little
more in New York.
“The footage would go to the digital lab, go
through the stereo correction and then come to
us as side-by-side media; we also had each eye
discrete. In the Avid we stored the images as
DNxHD 115, which is a pretty high resolution,
low compression format. We did that because
we wanted to minimise compression artifacts. In
3D, if the compression artifacts are slightly different
in the two eyes, that gets really irritating
to look at, so we dealt with that by minimising them.
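Side-by-side media packs both eyes into the left and right halves of a single frame, so recovering the discrete eyes is just an array slice. A minimal sketch with illustrative dimensions, not the production pipeline:

```python
import numpy as np

# What "side-by-side" 3D media means: both eyes squeezed into one frame,
# left eye in the left half, right eye in the right half. Splitting them
# back out is a simple slice.
def split_side_by_side(frame):
    """Return the (left_eye, right_eye) halves of a side-by-side frame."""
    h, w = frame.shape[:2]
    return frame[:, : w // 2], frame[:, w // 2 :]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # one HD side-by-side frame
left, right = split_side_by_side(frame)
print(left.shape, right.shape)   # (1080, 960, 3) (1080, 960, 3)
```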
“So for every frame that we shot we got three
frames (left eye, right eye and side by side),
which took up a lot of storage. We cut in Avid
Media Composer 5, even though MC 6 has fantastic
3D support; we were beta testing 6 while
we were prepping and through most of the shoot, but it
didn't come out early enough for us to use.
“We worked entirely in 3D. Neither Ang nor
I had ever worked with 3D before. We didn’t
want to be cutting in 2D and imagining what
it looked like in 3D. Right from the first day of
dailies I worked in 3D and didn’t actually see
the movie in 2D until about two months ago.
“MC 6 has a different way of setting up the
clips, so once you start in 5 you can't really
move to 6, as you would have to rebuild everything.
In order to re-converge shots we would
have to duplicate them, crop out each eye
separately and then re-position one of the eyes. It
was a cumbersome way to do it.
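The re-convergence Squyres describes boils down to shifting one eye horizontally so objects sit at a different apparent depth. A toy sketch, with np.roll standing in for the crop-and-reposition done inside the Avid:

```python
import numpy as np

# Toy model of re-converging a stereo shot: horizontally offsetting one
# eye changes where objects converge (their apparent screen depth).
# np.roll wraps pixels around; a real conform would crop instead.
def reconverge(eye, shift_px):
    """Shift one eye's image shift_px pixels to the right."""
    return np.roll(eye, shift_px, axis=1)

eye = np.arange(12).reshape(3, 4)   # stand-in for one eye's frame
print(reconverge(eye, 1)[0])        # first row becomes [3 0 1 2]
```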
“It’s interesting though: when you get into
3D you discover all kinds of things,
especially in dissolves, for example.
You may have a dissolve that looks
perfectly fine in 2D, but in 3D you realise that
it doesn't work.
this doesn’t work. Sometimes the out going continues
page fifty
Does Peter Jackson work very hands on, or are you left for long periods to cut sequences, with tweaks and final approval from Jackson only occurring later?
“Peter is very hands on with the editing process.
During the shoot, as I was on set, most of the work
that Peter would do with me was reviewing footage
and selecting takes. I would then do an edit of the
scene on my own whilst he was directing the shoot.
Later during post (or during a Saturday cutting
session during the shooting period) we would edit
the scene together from scratch, and at the end have
another look at the original assembly I had done.
Occasionally we might blend them in some way.”
Tell us about the process of working with so much
CGI and motion capture material.
“We had a great previz team who would generate
shots for us when we were missing something for the
cut that would eventually be CGI.
“Editing a live action film that has a lot of CGI can
be a very multi-layered process. You are not only
editing horizontally between shots, but vertically
within the one shot, stacking layers of background
plates, performance-captured elements and
various other things, trying to make each shot work as a
cohesive whole.
“The performance captured material would be
roughly rendered for us and we would treat it like
live action footage, selecting the best performances
and cutting them in, although we did have the
advantage that if we needed a new angle or camera
on it we could just order it up.
“It was a long job and a big job, which can be both
a great thing and a hard thing all at once. The sheer
size of the production (three films, all the CG material,
the large cast and the large amount of footage)
adds to the challenges, both technically and creatively.
Meeting the challenges is always a highpoint.
“As always the people are one of the great things
about this job. Peter, the cast and crew and my team
back at Editorial, all make it interesting and doable.
“Personally I enjoyed being on set and seeing the
movie shot, which is quite a rare opportunity for an
editor; we are often locked away in a dark room.
But actually being there and seeing how Peter was
directing a scene was enormously useful to me when
it came time to do my initial edit. Also being able to
go on location and see some of the most beautiful
parts of this country was a great thrill.”
Jabez Olssen is currently in post production for The
Hobbit: The Desolation Of Smaug, which is due out
later this year.
Peter Jackson and Cate Blanchett plus a couple of RED Epics
How did they use Media Composer on location?
“The shoot itself was over a year long, and for the
majority of that time we were filming at Stone Street
Studios in Wellington. But for a three month block
we were travelling around remote locations in New Zealand.
“The Editing Rooms are based adjacent to Stone
Street Studios, and whilst we were shooting there
we had a fibre network connecting our Unity ISIS
storage system to the stages. As Editor I was based
on set for the entire shoot. I had a portable Avid
system (based on a HP workstation mounted into
a roadcase on wheels) that would be setup beside
Peter’s directors chair each day. By plugging in
the cable I would have full instant access to all our
media, just as if I was sitting in the main cutting
room back in the offices. This way Peter and I could
work together when he had down time between set-ups.
“If we needed a more secluded environment to
work, we also had a mobile editing room set up in
the back of a large camper van type vehicle that was
driven close to whichever stage we were shooting
in. Inside it was a full Avid system, a large plasma
monitor and a couch for Peter. We called it the ‘EMC’
for Editorial Mobile Command.
“Also if we were not shooting during the weekends,
we would edit on a Saturday, and in this situation we
would go back to the editing department offices and
cut in the main Cutting Room.
“When we left the Sound Stages and went off on
location we took a Mac laptop based Avid system
with us, with an array of hard drives that held a
selection of footage on them. This was a much
smaller, more nimble setup that meant I could set up
on a small table beside where Peter was directing,
even if he was halfway up a mountain or something.
Hard drives with each day's rushes were sent back and
forth from Wellington each day from wherever we
were. Also the EMC mobile cutting room was driven
around the country on location too, and parked at
the Production base camp for each location. Peter
and I could use this if we got time (or were rained out).
Otherwise the Assistant Editor who came on location
could use it for preparing footage.”
By our reckoning there were six versions of the
movie: 48fps stereo, 24fps stereo and 24fps 2D,
each at both 2K and 4K. How did they conform the
various versions for the DI?
“We did not do the DI at 4K at all. This was
primarily due to the cost and time for the VFX shots if
they had been done at 4K (it is four times the data
of 2K).
“The DI was conformed at Park Road Post using an
array of SGO Mistikas. It was all 2K and was primarily
done at 48fps 3D, with the other versions, 24fps 3D
and 24fps 2D, derived from that.
“Beyond the three main deliverable formats we
also delivered the 3D versions at different brightness
levels.”
How was the 24fps version prepared from the 48fps master?
“Many frame blending tests were done with
different systems, but we tended to like the results
we got from simply discarding every other frame.”