At first glance, it doesn’t seem particularly ambitious. Take some wildlife documentary footage, cut it together, and turn it into a film. That was the idea behind “Earth”, the BBC’s film version of their highly acclaimed “Planet Earth” TV series. But look at the details, and things become much more complex. The series has 4,000 days’ worth of footage to draw upon, across every format in use today, from DVCAM through to Super-35mm film. Much of it was processed under less than ideal conditions, and as such is missing vital metadata such as timecode, sync points and so on. Pretty much all of the audio had to be created from scratch.
My involvement was limited to the picture side of things, so I’ll be focussing on that for this article. More than anything else, what was unique about this project was the production’s unwavering commitment to quality. By the time we were done, every pixel of every second of the film had been scrutinized. It’s probably the first thing I’ve worked on since Band of Brothers that completely exploited the possibilities of the digital intermediate process. It mixed different kinds of media, and each shot had multiple versions right up until the end. But I’m getting ahead of myself slightly. In the beginning there was just a QuickTime offline reference, some EDLs, a stack of HDCAM tapes and around 4TB of storage (we had a lot of technical problems with our conforming/grading system, so I’ll spare the parent company their blushes and not reveal which system we used). The first thing to be done was to capture what we could from the tapes, ensuring that all the video headroom was captured as well. That was a fairly painless process, as the same tapes had been used to build the offline edit. There were a couple of sticking points, oddities such as duplicate tapes with different timecodes, but nothing particularly out of the ordinary. We also had to massage the data a little at this stage to get it to adhere to a project/source/reel/frame.dpx structure (and in retrospect, we were very glad that we did).
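To give an idea of what that structure looks like in practice, here’s a minimal sketch of a helper that builds such a path; the root location, zero-padding and example names are illustrative assumptions rather than our actual tooling:

```python
# A minimal sketch of the project/source/reel/frame.dpx layout described above.
# The root location and zero-padding width are illustrative assumptions.
import os

def dpx_path(project, source, reel, frame_number, root="/mnt/di_storage"):
    """Build a destination path like <root>/<project>/<source>/<reel>/<frame>.dpx."""
    frame_name = f"{frame_number:07d}.dpx"   # e.g. 0001000.dpx
    return os.path.join(root, project, source, reel, frame_name)

# Example: frame 1000 of reel 248, captured from an HDCAM tape
print(dpx_path("earth", "hdcam", "248", 1000))
# -> /mnt/di_storage/earth/hdcam/248/0001000.dpx
```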
Then things started to get a little tougher. Much of the film material had no correlation to its offline equivalent, so the scanning submissions were put together by eyeballing shots on the rushes tapes. Then, when the scanned data was supplied, it had to be eye-matched into the timeline against the offline. There were also things such as DVCPro varispeed footage, which had to undergo an elaborate process devised by HD Consultant Jonathan Smiles to preserve the integrity of each frame and get it to look right, ultimately resulting in a set of frames that also had to be matched by eye. Everything was ingested in its native format and then processed uncompressed. And, as is the norm in the film world, shots arrived on a very irregular basis. As many of these shots did not have specific reel numbers, I assigned them unique ones; this would also ensure that the arbitrary timecodes they now had became less significant. The best way to do that was to use the event numbers from the EDL. In fact, at this stage we had a sort of master spreadsheet for the production (this was just prior to the advent of things such as Google Docs & Spreadsheets, unfortunately, so it had to be shared by memory stick rather than synchronised online), so this method made it easy to cross-reference the shots with all of their metadata.
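To illustrate the idea (this isn’t our actual tooling, and it assumes a simple CMX3600-style EDL), pulling event numbers out of an EDL to use as unique reel identifiers might look something like this:

```python
# Rough illustration of using EDL event numbers as unique reel identifiers.
# Assumes a CMX3600-style EDL and only looks at video cut events; real EDLs
# need more careful parsing than this.
import re

EVENT_LINE = re.compile(r"^(\d{3,6})\s+(\S+)\s+V\s+C\s+")  # event no., source reel, video cut

def reels_from_edl(edl_text):
    """Map each EDL event number to a new unique reel name."""
    reels = {}
    for line in edl_text.splitlines():
        match = EVENT_LINE.match(line)
        if match:
            event, source = match.groups()
            reels[event] = {"new_reel": event, "original_source": source}
    return reels
```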
At this point the conformed timeline was almost complete (around two weeks into the digital intermediate process). Then everything stepped up a gear, and we left the typical DI workflow behind. For example, our film output resolution was fixed at 2048×1242, but we now had everything from PAL SD resolution through to 4K film scans. Normally the grading system handles all of this for you, but as the production team was well aware, different scaling algorithms suit different shots to different degrees. So a decision was made to scale everything prior to grading it. We used a variety of scaling processes depending on the content (some of the techniques we used will be covered in future articles), generating a new set of data each time. We needed this process to be non-destructive; that is, we had to be sure we could go back to the original shot if we discovered any artefacts later on. So I made codes for each of the resize methods and prefixed these to the original reel numbers to generate the new reel: S248, for example, would be reel (shot) 248 scaled using a Shake method. This new reel was then loaded into the conform timeline on a new track above the original. This naming convention proved to be very robust, even right at the end when everything was very complicated, and even at this stage there were some shots with two or three different versions kept in the conform for comparison purposes. Colourist Luke Rainey was able to switch between the different versions of a shot very quickly and approve (or not) the scaling.
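As a rough sketch of that naming convention (only the “S” for Shake code is mentioned above; the other method codes are hypothetical):

```python
# Hedged sketch of the resize-method naming convention described above,
# e.g. "S248" for reel 248 rescaled in Shake. Codes other than "S" are
# illustrative guesses rather than the production's actual codes.
RESIZE_CODES = {
    "shake": "S",       # named in the article
    "hardware": "H",    # hypothetical
    "avs": "A",         # hypothetical
}

def versioned_reel(original_reel, method):
    """Prefix the original reel number with a code for the resize method used."""
    return f"{RESIZE_CODES[method]}{original_reel}"

assert versioned_reel("248", "shake") == "S248"
```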
The same tack was taken with things like de-interlacing (we used Nattress plug-ins in conjunction with Final Cut for the bulk of these), as well as time-lapse sequences (many of which were actually done by director Mark Linfield on a Mac in the back room). Soon we had an average of two or three versions of each shot in the timeline, and I was starting to realise we’d seriously underestimated the amount of disk space we needed for the project. Without any sort of SAN storage, what we’d done was to set up two independent systems, both running the grading software and both with identical copies of the source data. We’d been synchronising the timelines between the two using EDLs, but at this stage that became far too complicated. We now had five separate projects (one for each of the output reels; this was done for performance reasons), each with 4-8 timeline tracks. Since Luke would only be working on one reel at a time, we decided a far better option would be to move the actual project files back and forth instead, and this process actually worked out rather well.
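Going back to the disk-space problem for a moment, a back-of-envelope calculation shows how quickly it disappears (the 10-bit DPX packing, frame rate and running time here are my own assumptions, not production figures):

```python
# Back-of-envelope storage estimate; the DPX packing, frame rate and running
# time are assumptions for illustration only.
WIDTH, HEIGHT = 2048, 1242
BYTES_PER_PIXEL = 4            # 10-bit RGB packed into 32 bits
FRAME_BYTES = WIDTH * HEIGHT * BYTES_PER_PIXEL   # ~10.2 MB per DPX frame
RUNTIME_FRAMES = 90 * 60 * 24  # a 90-minute film at 24 fps
VERSIONS_PER_SHOT = 3          # the average at this point in the project

total_tb = FRAME_BYTES * RUNTIME_FRAMES * VERSIONS_PER_SHOT / 1e12
print(f"{total_tb:.1f} TB")    # ~4 TB: roughly the entire original allocation
```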
The grading process was done using a Barco projector. What was interesting about it was that the primary grade was actually done in 709 (HD) colour space, rather than P3 (film), even though the primary output would be film. The grade would be done as if it were an HD project, and then the result would go through a 709-to-P3 transform and be tweaked using a LUT. This decision was made by the production after weeks of testing, and it produced very accurate results. There was some sacrifice of dynamic range, but on the other hand, the film version truly looks identical to the HD version (which itself looks stunning). It also meant that the conversion to digital cinema (one of our primary output formats) would be very accurate. The entire colour calibration process was overseen by Post-Production Producer Jon Thompson.
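For anyone curious about the gamut side of that transform, here’s a minimal sketch using the open-source colour-science Python package. It is purely illustrative, it isn’t the transform or LUT the production used, and it assumes display-linear input:

```python
# A rough sketch of a 709 -> DCI-P3 gamut conversion, using the open-source
# "colour-science" package. Illustrative only; not the production's pipeline.
import numpy as np
import colour

def rec709_to_dci_p3(rgb_709_linear):
    """Convert display-linear Rec.709 RGB to gamma 2.6 encoded DCI-P3 RGB."""
    # Gamut conversion with chromatic adaptation between the two white points.
    rgb_p3_linear = colour.RGB_to_RGB(
        rgb_709_linear,
        colour.RGB_COLOURSPACES['ITU-R BT.709'],
        colour.RGB_COLOURSPACES['DCI-P3'],
        chromatic_adaptation_transform='CAT02',
    )
    # DCI projection uses a pure 2.6 gamma encoding.
    return np.clip(rgb_p3_linear, 0.0, 1.0) ** (1.0 / 2.6)
```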
One of the unusual things about this film was that it had no visual effects, and yet there were several effects companies involved with whom we were bouncing shots backwards and forwards. Most of what they were doing was things like reducing noise or stabilizing shots, but even here we found it was necessary to hold on to many of the versions. So once we were nearing the end of the grade (and had added a lot more disk space), we had an average of eight versions of every shot: that’s eight frames of source material for every frame of output (and that’s not including handle frames). There were now more source reels than there were cuts in the film, so it was just as well that the naming conventions we established early on were still holding up. With the grade pretty much in the bag, Luke had turned his attention to adding grain to the video material, to increase the continuity between video- and film-sourced shots. Much of this was now tracked in Surreal Road’s proprietary database (more on that soon, I promise), as the master spreadsheet had reached mind-boggling complexity at this point. The database also allowed us to track the whereabouts of physical assets, which was useful, because missing shots and last-minute recuts meant we had a big stack of the BBC’s tapes to look after.
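On the grain point, purely as an illustration of the principle (and not the method actually used on the film), adding synthetic grain to a video-sourced frame might look something like this:

```python
# Illustration only: add a small amount of monochrome noise to video-sourced
# frames so they sit better next to grainier film scans.
import numpy as np

def add_grain(frame, strength=0.02, seed=None):
    """frame: float RGB image in [0, 1]. Returns a lightly 'grained' copy."""
    rng = np.random.default_rng(seed)
    grain = rng.normal(0.0, strength, size=frame.shape[:2])   # monochrome grain
    grained = frame + grain[..., None] * (0.25 + frame)       # grain scales with brightness
    return np.clip(grained, 0.0, 1.0)
```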
Output was fairly straightforward; we had some bizarre render errors and caching issues that I won’t bore you with here, but nothing really severe. We output to one 2TB removable drive for each of the film, HD and digital cinema versions, and then, to be really safe, copied all of that back onto the grading system to QC it before giving the OK for it to be printed to film. And for a nice change, we sent everything off to be filmed out in one go, rather than drip-feeding it in reels. This was May 2007.
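Incidentally, the QC copy-back is the kind of step that lends itself to a simple automated check. Something along these lines (illustrative only, with hypothetical paths and a fast but non-cryptographic MD5 checksum) would flag missing or mismatched frames between the delivery drive and the copy:

```python
# Sketch of a sanity check after copying an output drive back for QC:
# compare checksums of every DPX frame between the delivery and the copy.
import hashlib
from pathlib import Path

def checksum(path, chunk=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_copy(delivery_dir, qc_dir):
    """Yield any frames that are missing or differ between the two trees."""
    for src in sorted(Path(delivery_dir).rglob("*.dpx")):
        dst = Path(qc_dir) / src.relative_to(delivery_dir)
        if not dst.exists() or checksum(src) != checksum(dst):
            yield src
```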
Since then, there have been more changes and recuts made. At the time of writing, there are no fewer than three distinct cuts of the film (not including regional differences), each of which exists in three formats (digital cinema, film, and HD video). In fact, the version that hits the UK cinemas today is one I’ve not actually seen. The recuts mostly involve rearrangement of the existing material, and so were reconformed directly from the output data, rather than from the source data, which has made the process significantly less complicated.
UPDATE: Jon Thompson provides more information about the digital cinema mastering process:
“The D-cinema version was made by Martin Greenwood who wrote a whole new set of algorithms (which now form part of the Yo-Yo system from Pandora). The D-cinema version is 1998 by 1080 pixels, which gives an exact 1.85 ratio.”
And more on the film recording process:
“Our output to film used Cinesite’s “Super-2K” method, so everything was done at 2048 by 1242 pixels using a 1.66 ratio, giving us some safety room for 1.85 projection. The reason for this is that every theatre seems to have an aperture plate that says 1.85, but never seems to match a 1.85 test chart.”
And on the colour space used:
“We graded in P3 colour space as this was the route I was used to working with. Jim Whittlesea and Howard Lukk in the U.S. had defined and proved it worked when working on the StEM tests for DCI (in 2004). P3 was a fairly close match to the colour space of film and meant that we also had a DCI P3 version for the DCDM without needing to re-grade. The route we eventually took was to grade in 709 [providing the HD and DC masters] and then do a pass in P3 space at gamma 2.6, then finally convert into log space and tweak to make the film output version.”
Posted: November 16th, 2007
Categories: Articles
Tags: BBC, conform, data management, digital cinema, digital master, documentary, DVCAM, DVCPro, earth, film, grade, HD, HDCAM, movie, output, project, QuickTime, storage, varispeed