Posts Tagged ‘data management’

DITs: We Can Do Better…

Having worked with DITs (the horrible acronym for Digital Imaging Technician, the person responsible for the delivery of digital media from the set to post) over many years, and more recently, having done the job of a DIT myself on a couple of productions, I've come to one very simple conclusion: the process could be better.

That isn’t to say that most DITs do a terrible job; on the contrary, many are extremely passionate and conscientious. Rather it’s that they approach the role from a cinematography standpoint, and almost never from an IT background. Historically, most DITs were barely-trained people provided by the rental company, who knew how to operate a particular digital camera, and how to copy files on a Mac.


Problem 1: data transfer

GUI-based copying (whether on Windows or Mac) has several flaws:

  • It batches copy operations together into one lump; if a single file fails, you have to restart the whole batch
  • It doesn't verify the copied files
  • It's slow
  • It's prone to human error
  • It handles errors poorly, and a single error will stop the entire process
  • Running multiple copy operations at once tends to cause problems

and on and on.


Solution 1: rsync

Rsync is a universal file-copying application that runs from the command line; it's even built into OS X. It's extremely robust, recommended in many situations, and startlingly easy to use, and there are even GUIs available for it.

Open Terminal and type:

rsync -avhP /Volumes/A001_C001/ /Volumes/RushesStorage/Day01/

And watch it go. You can cancel with control-C and re-run the same command to resume where it left off. I defy you to find anything that will copy files faster. The only thing to pay attention to is the trailing slash at the end of each path: a trailing slash on the source tells rsync to copy the contents of the directory rather than the directory itself. For more information, see man rsync.
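
By default rsync decides what to copy based on file size and modification time, so as an extra safety net you can run a second pass with the -c (checksum) flag. This is only a verification sketch, reusing the paths above: it re-reads every file on both sides, so it's slow, but anything it re-transfers is a file that didn't copy correctly the first time.

rsync -avhc /Volumes/A001_C001/ /Volumes/RushesStorage/Day01/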


Problem 2: rented storage media

Every production I've worked on has rented storage media (SD cards, CompactFlash cards, LaCie drives and so on), and needless to say, every one of them has had at least one take that had to be aborted due to an error with said media. Disk drives are sensitive things at the best of times, and this is one of the reasons used disk drives lose value.

Flash storage, which is most popular with digital cameras for its high performance, is especially problematic. The problem is that it can also be prohibitively expensive.


Solution 2: buy storage media when possible

The real solution here is proper planning. There is rarely a good argument for renting storage media. Cost is certainly a factor, but consider the cost of having to reshoot due to a drive failure. Also consider selling the media after the shoot, recouping much of the cost (and probably putting the outlay on a par with renting the media in the first place).


Problem 3: daily storage system

Low- to mid-budget productions all use external (USB or FireWire) disk drives for the immediate transfer of rushes by the DIT. On a given day, the camera(s) will produce anywhere from a few GB to a TB (or more) of data, which gets dumped onto these drives, where it sits for an undetermined amount of time.

There’s nothing inherently wrong with this, but we can do better.


Solution 3: network attached storage

Rather than using a stack of G-RAIDs, consider one or more network-attached storage (NAS) systems. There are so many benefits to this that it's incredible everyone isn't doing it:

  • Data is better protected, instantly
  • All the data is available without having to swap through drives
  • Data is accessible to multiple people, simultaneously
  • You can run other services (like a media server)
  • Data can be encrypted easily
  • It’s lighter
  • It’s almost the same cost*

I’ve used the following kit on the last two productions, giving me all the benefits above:

This will give you 9 TB of RAID 6 (meaning you can lose 2 disks). Why sacrifice 6 TB of storage space? From Wikipedia:

RAID 6 provides protection against data loss during an array rebuild, when a second drive is lost, a bad block read is encountered, or when a human operator accidentally removes and replaces the wrong disk drive when attempting to replace a failed drive

By NAS standards the QNAP is considered reasonably high-end (it even has HDMI out, imagine that!), so if price is a concern there are many cheaper options available. I personally recommend the QNAP because it has decent Mac support and is very user-friendly.

* Compare this to the equivalent in G-Technology disks: 5 x 2 TB G-RAIDs, ~$1450. If you opt not to use RAID on the NAS, you of course get more storage for your money, and the difference in price becomes negligible.

And if you want to be really paranoid about backing up that data (as well you should be), you can just get an HDD dock and some extra internal hard drives, and copy data onto those as well.
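
If the dock is attached to the NAS itself, that backup can be the same rsync trick again. A minimal sketch, assuming hypothetical share names (/share/Rushes for the array, /share/external/Backup for the docked drive):

rsync -avh /share/Rushes/ /share/external/Backup/Rushes/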


Problem 4: inefficient workflow

Every DIT I’ve ever worked with does things the same way:

  1. Receive camera media
  2. Plug into laptop
  3. Copy to external media
  4. Return camera media
  5. Make second copy to other disk drive
  6. Wait for copying to complete
  7. Watch media directly from external drive
  8. Give one of the disk drives to director for rushes

Solution 4: more efficient workflow

With a NAS setup, here’s a better approach:

  1. Receive camera media
  2. Plug into NAS
  3. Tell the NAS to start copying to itself and/or other connected drives (yes, you can still use rsync in most cases; see the sketch after this list)
  4. Return camera media
  5. Anyone who wants to watch need only connect via ethernet
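
A minimal sketch of step 3, assuming the NAS allows SSH logins and that the card reader and the array mount at these hypothetical paths:

ssh admin@nas.local "rsync -avhP /share/external/A001_C001/ /share/Rushes/Day01/"

The copy then runs entirely on the NAS, so it doesn't tie up your laptop while it works.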

This now gives rise to a new problem:

New problem: DIT has too little to do

Solution: Do something useful and beneficial, like logging all that data as it’s being shot.

Posted: November 4th, 2012
Categories: News

Standard data transmission speeds…

The following table shows the data transmission speeds of common interfaces:

Interface               Theoretical maximum
USB 2.0                 480 Mbit/s
FireWire 400            400 Mbit/s
FireWire 800            800 Mbit/s
Gigabit ethernet        1 Gbit/s
eSATA (SATA II)         3 Gbit/s
USB 3.0                 5 Gbit/s
Thunderbolt             10 Gbit/s
802.11a / 802.11g       54 Mbit/s
802.11n                 up to 600 Mbit/s

Note that you should always use the slowest interface in the chain for reference. So, for example, if you copy a file across gigabit ethernet to a drive connected via USB 2, you should refer to the USB 2 speed only.

Also bear in mind these are theoretical maximums, and real-world speeds will vary greatly. For example, although 802.11a has the same theoretical speed as 802.11g, the latter is far more stable in the real world.
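
To turn a speed into a rough transfer time, here's a quick back-of-the-envelope calculation you can run in Terminal (the ~30 MB/s is just an illustrative real-world USB 2 rate, not a measured figure):

echo "scale=1; 100 * 1024 / 30 / 60" | bc    # 100 GB at ~30 MB/s: about 57 minutes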

Posted: June 15th, 2012
Categories: Articles

The Many Problems with Prelude…

I was initially very excited by the promise of Adobe Prelude (which replaces Adobe OnLocation as of CS6). The idea is that it can streamline (even automate) the Digital Imaging Technician's workflow of copying and checking digitally-sourced footage on-set and making proxies, freeing up the DIT to focus on more useful tasks, such as logging information into the footage's metadata.

I spent a couple of days using it on a multi-camera shoot, but the results were disappointing. Allow me to count the ways:

  1. It hangs a lot
  2. It’s impossible to batch-edit metadata (it seems to work in principle, but ultimately it will hang)
  3. If any errors are detected during an ingest, none of the footage ingested shows in the project window
  4. You can create bins, but cannot open these bins in separate windows
  5. There’s no way to create metadata templates (so that you only see/edit the metadata tags you’re interested in)
  6. Ingested footage doesn’t appear in the project window until the entire ingest has completed
  7. It’s not possible to transcode footage from the project window
  8. It’s not possible to duplicate footage from the project window
  9. There’s no way to apply metadata during ingest
  10. Despite it being plastered all over the product page on Adobe’s website, there’s no way to transcribe the audio from within Prelude
  11. There’s no way to do anything clever with the metadata. For example, I was hoping I’d be able to produce copies of the clips but renamed to the scene and take number. No such luck.
  12. There’s no thumbnail view
  13. You can’t sort clips by any metadata field (in fact, you can’t display the metadata fields in the spreadsheet-like project view)
  14. You can’t set any event (notification, action) to trigger on completion of ingest
  15. There’s no way to filter the event list
  16. The help system redirects to the Adobe website (because when you’re on-set you always have a reliable internet connection)
  17. Worst of all, it seems to be corrupting files whilst copying (I can't prove this conclusively, but I did encounter 2 corrupt copied files where the originals were intact)

Although this isn’t a case of “Adobe dropped the ball” (it is only the first release, after all), it does seem like even basic functionality that is required by all DITs is missing. Part of the reason this is so disappointing is because they already have much of this working in Lightroom. They’ve even structured the UI with a 4-room (ingest, logging, list, rough cut) metaphor along the same lines of Lightroom, but have failed to properly utilise it.

It does seem that Adobe is using Prelude to push you into moving footage into Premiere and doing more there, but I don't really want to start moving data between applications at this stage. There's a "rough cut" feature that I didn't even use, because, well, that's what Premiere is for.

Posted: May 12th, 2012
Categories: Opinion

The Hiero we deserve, not the Hiero we need…

The Foundry’s Hiero launched last month, after a public beta period. Described as “a pipeline in a box”, perhaps the best way to think about is a bells & whistles conforming system.

Here are some of the things it can do:

  • Conform media
  • Transcode or rename media
  • Track versions (to some extent)
  • Generate NUKE scripts

It’s fully extensible through python, so in theory a lot of features can be customised to specific workflows. Quite frankly, I would have killed for this on almost every production I’ve worked on. It would have made a lot of data management chores a breeze. There are a few notably absent features, such as the lack of scene detection, and the extremely limited notation functionality, but that will happen in time no doubt.

The Foundry view Hiero as a kind of post-production hub, managing shots coming in and shots going out. A client can view the latest overall version of a show before going into a grading room. On one hand this is a necessary step: colour grading is less often about colour and more about asset management and versioning, and Hiero fulfils the crucial need for a stage that deals exclusively with editorial issues prior to grading. So with Hiero, the production team goes over to the Hiero system, reviews visual effects versions, checks editorial issues, delegates more work and so on. Unfortunately, it just doesn't work like that in the real world.

For starters, who's responsible for maintaining this hub? In general, the production team would lack the expertise required to manage the process, and in any case, from their perspective they are paying everyone else to ensure the various pieces fit together. At the launch event, there were talks by people who'd been using it at the visual effects houses The Mill and Framestore. But even these are edge cases: it would be extremely unlikely for a single facility to be responsible for the bulk of the post work on a major film. On a typical film, The Mill might be handing off a bunch of effects to a DI facility elsewhere, and not really care how they fit in with elements from other sources (let alone that the production might not want The Mill having that level of control over the film). Likewise, the DI facility will expect to just conform everything in the grading suite, as they always do. There wouldn't be much benefit to adding another link in the chain.

So it could fall to a third party who would coordinate everything, but then who is going to pay for such a service? I agree with the principle of Hiero, and I'd argue that someone should be paying for it. But if there's one thing we know about post, it's that people hate having to change their workflows.

So where does that leave us? Currently Hiero is around $5,000 for a node-locked license, which prohibits it from being considered a utility a freelancer could invest in, or one a facility would pay for "just in case". I hope that The Foundry can crack this problem, because it can arguably make post easier for all of us.

The Foundry offer a 15-day trial of Hiero, as with all their products.

A problem of numbers…

Anyone working with digital intermediates will undoubtedly have experienced this situation.

You start off with a frame sequence, let's say 090000.dpx through 180000.dpx. However, there are gaps in the sequence (maybe because the frames were captured using an EDL, or maybe they've been selectively exported from a more complete timeline). You run them through some processing application, and now you have something more like 0000.dpx through 7864.dpx.

Often it doesn’t matter how the modified files are named, such as if you are going to edit them into a timeline by eye, but sometimes you just really need the names to match and so you have to waste lots of time massaging all the filenames until they are just right.

I found myself in just that situation recently. We'd exported a bunch of frames from a timeline that needed some last-minute dust-busting. The quickest, most available option was to run through them all in After Effects. Great, but then the problem was getting them back in. I imported the renders as a single new reel, and then proceeded to cut and splice them back into the timeline, shot by shot. That took around 2 hours. But we had time.

The next time we were in the same situation, I decided I would make life easy for myself. I essentially had a list of filenames I needed to use (from the original exported folder), so surely there had to be an easy way to automate renaming the new files to match. Well, there wasn't, so I made one.

One of the things I've come to love about working with OS X is AppleScript. Writing, testing and running some AppleScript can (as in this case) be much quicker than just doing everything manually. Granted, there's a learning curve, but the other good thing is that even if you can't program AppleScript yourself, you can benefit from someone else's.

With that in mind, I’ve released the AppleScript I made on Google Code. If you find it useful, let me know.
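
The released script is AppleScript, but the core idea is simple enough to sketch in shell. This is only an illustration (the paths are hypothetical), and it assumes both folders hold the same number of .dpx files and that sorting puts matching frames in the same order:

#!/bin/bash
# Pair each render with the original filename it should take.
originals=(/path/to/exported/*.dpx)   # e.g. 090000.dpx ... 180000.dpx, with gaps
renders=(/path/to/renders/*.dpx)      # e.g. 0000.dpx ... 7864.dpx

# Refuse to guess if the frame counts don't match.
if [ "${#originals[@]}" -ne "${#renders[@]}" ]; then
    echo "Frame counts differ; aborting." >&2
    exit 1
fi

for i in "${!renders[@]}"; do
    mv -n "${renders[$i]}" "/path/to/renders/$(basename "${originals[$i]}")"
done

After it runs, the renders carry the original frame numbers, ready to go back into the timeline.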


Posted: November 15th, 2010
Categories: Tools