Appendix 1
Basic Strategies

Working with digital media can be very daunting. Every time I start a new project, I pretty much have to rethink my approach, because there will have been so many technological advances and changes since the last one. Disk space, for example, will always be cheaper, faster, and more portable than it was the last time. Almost overnight, a workflow based upon digital tape storage can become redundant, with external disk drives suddenly offering unique advantages over digital tapes. However, certain practices always remain useful, and regardless of the details, the underlying principles are what enable you to make informed decisions about the best course of action.

Digital Asset Management

During digital post-production, you'll generate a lot of data. Even if you only work with audio, you'll turn over many gigabytes of sound files and related data over the course of working on a single project. As well as being good at what you do in terms of finding and fixing faults and making improvements, you also need to be good at how you do it. If you are well organized, you'll spend less time back-tracking and searching for things, and more time doing the fun stuff.

The first stage of being organized in the digital realm is having a structured approach to managing your digital assets (in other words, the media, metadata, reference material, and so on, related to the production). You should be comfortable working with the file management tools provided by your operating system (Explorer on Windows, Finder on the Mac), and employ an indexing system of files and folders that is easy not only for you to follow, but for others as well. (I normally write a short cheat sheet at the start of a project listing how and where to find different elements on disk, primarily so that other people on the team can find their way around, but it becomes increasingly useful to me as well as the complexity of the project grows.)

Naming Conventions

Establishing naming conventions is probably the simplest and most logical method of organizing data. The idea is that you define a template structure of files and folders, and then apply that structure to all the data.

For example, a popular method is to use:

/scans/reel#/resolution/frame#.ext

So that frame 1200 from reel 40, scanned at a resolution of 2048 by 1556 pixels becomes:

/scans/040/2048x1556/01200.dpx

Instead you could use a variation of this system, perhaps organizing by shot number:

/shot#/tk#/scan/frame#.ext

The important thing is to decide on a convention early on, and be sure to stick to it. The absolute worst thing you can do is come up with a system for filing shots and then decide halfway through that it's inferior to another method which you then adopt from that point on. Suddenly you'll find yourself in a less organized state than you would have been without any system in place.
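To make a convention like this self-enforcing, it can help to generate paths with a small script rather than typing them by hand. The following is a minimal sketch in Python, assuming the /scans/reel#/resolution/frame#.ext layout above; the function name and padding widths are my own illustrative choices, not a standard.

```python
# Sketch: building file paths from the /scans/reel#/resolution/frame#.ext
# convention described above. Padding widths are illustrative assumptions.

def scan_path(reel, frame, width=2048, height=1556, ext="dpx"):
    """Return a path like /scans/040/2048x1556/01200.dpx."""
    return "/scans/{:03d}/{}x{}/{:05d}.{}".format(reel, width, height, frame, ext)

print(scan_path(40, 1200))  # → /scans/040/2048x1556/01200.dpx
```

Generating every path through one function means a change of convention only ever has to happen in one place.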

Backups

If you haven't been in a situation where you've regretted not making backups (or not making them often enough) you're either doing it already or long overdue for something to go wrong. Seriously, no matter how well-prepared you think you are for problems, something can always happen to test your preparation.

In some cases it's not practical to back up everything, due in part to the sheer volume of data, as well as the corresponding time required to manage the backups. On some productions, such as video-based ones, the original source material (i.e., the video tapes) can act as a backup of sorts, because if anything goes wrong with the digitized data, it can always be redigitized. However, this itself can be an easy trap to fall into, as redigitizing footage can take significantly longer than merely copying data, and the digitization process is by no means completely automated (especially in the case of scanning film). In addition, there is a strong argument for only ever digitizing from source once, as each pass can potentially degrade the media.

Ultimately, the most important (and most often overlooked) data to back up can be the project files and metadata used at various stages of the process (such as the editing system projects, compositing scripts, and so on). Although technically these can be recreated, doing so can require every bit as much effort as shooting the scenes in the first place, and given that these files often comprise the smallest volume of data in a project, it seems negligent not to back them up as regularly as possible. As a final point on the subject of backups, you should also consider making off-site backups when possible. These ensure that should anything go wrong at the post-production location, you will still have a viable backup. These days, one of the most cost-effective methods for making automated, fast off-site backups is an Internet-based service, especially for smaller files and metadata.
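One practical habit that supports any backup policy is verifying that copies actually match their originals. A minimal sketch, assuming SHA-256 checksums (the hashing scheme is my choice, not a prescribed one):

```python
# Sketch: verifying that a backup is a byte-for-byte copy of the original
# by comparing checksums. SHA-256 is an illustrative choice; any strong
# hash works. Files are read in chunks so large media files fit in memory.

import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_is_intact(original, backup):
    """True if the backup matches the original exactly."""
    return file_digest(original) == file_digest(backup)
```

Storing the digests alongside the backup also lets you detect silent corruption later, long after the copy was made.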

Version Control

After scheduling regular backups, the most important thing you can do is set up some sort of version control system. This is a system whereby you must check out files to make changes to them, and then check in the changed files. This fulfills several important tasks. First, it allows you to roll back to previous versions of anything you've changed (thus fulfilling the role of a rudimentary backup system as well). Second, it prevents two people from independently changing the same footage at the same time, as you cannot check out data that has been checked out by someone else. Third, it provides an audit trail of every change ever made, as everything can be annotated; at a glance, you could look at a single frame and see its entire history, from acquisition to final output. And finally, it provides authority on what's considered the final version: at any point, you can log into the version control system and find the latest version.
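The check-out/check-in discipline described above can be sketched in a few lines. This is a toy illustration that only tracks locks and annotated history; a real system (such as Alienbrain, or a general-purpose tool) also stores the file contents and versions themselves.

```python
# Toy sketch of check-out/check-in version control. Class and method
# names are invented for illustration.

class VersionControl:
    def __init__(self):
        self.locks = {}    # asset name -> user who has it checked out
        self.history = {}  # asset name -> list of (user, note) check-ins

    def check_out(self, asset, user):
        """Lock an asset for one user; refuse if someone else holds it."""
        if asset in self.locks:
            raise RuntimeError(f"{asset} already checked out by {self.locks[asset]}")
        self.locks[asset] = user

    def check_in(self, asset, user, note):
        """Release the lock and record an annotated history entry."""
        if self.locks.get(asset) != user:
            raise RuntimeError(f"{user} does not have {asset} checked out")
        del self.locks[asset]
        self.history.setdefault(asset, []).append((user, note))
```

The lock is what prevents two people changing the same footage at once, and the annotated history is what provides the audit trail.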

The concept of version control applies to all digital assets, both in terms of the footage itself, and also project documents, EDLs, and so on. Historically, the disk space requirements of digital productions made version control a luxury no one could afford. Now however, with the cost of disk space per terabyte so low, it's a worthwhile investment that will hopefully become adopted on all manner of productions.

There are many different methods for version control. Many free systems are available, and there are also commercial products that cater specifically for media files, such as Avid's Alienbrain (www.alienbrain.com).

Asset Libraries

One of the really good things about analogue systems is that you can easily get a visual reference of what's on them. For example, you can place a strip of film on a lightbox, or put a video tape in a player. In the digital domain, this is not quite so straightforward.

First of all, digital media is not always kept online. The amount of data and the non-linear nature of post-production means that it's often necessary to cycle data on and off of different systems. In addition, work may be undertaken at multiple locations, in which case it's rare that one location will have an up-to-date copy of all the data.

Second, there may be issues encountered due to the formats used by digital media. Film scans, for example, tend to be stored as high-resolution image sequences, which are not appropriate for, say, playing back on a laptop.

For all these reasons, it can be worth constructing a virtual asset library that provides a convenient, visual representation of all the media used in the production. Doing so usually involves making low-resolution, compressed copies of the footage and audio (usually to a common format). It need not be sophisticated; in fact, I routinely use the simplest freely available software I can find for this purpose, as it increases the likelihood that it will work on whatever system I need to use it on (you should also aim to use cross-platform software if you're working in a cross-platform environment).

The main aim when building a reference asset library is ensuring that it provides an accurate representation of all the data in the production. This means that any new versions that are created should be added to your library. It may also be worth burning-in pertinent data (such as timecodes) to the images to make the reference even more useful.

TIP

Regardless of whether you choose to use an asset library or not, you will often need to create proxies for your media. Proxies are copies of the original media encoded at a lower quality, to allow playback on less powerful systems. For instance, it's common to create proxies of film scans for offline editing.
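Proxy creation is often batched through a command-line tool such as ffmpeg. As a hedged sketch, the following builds (but does not run) a plausible ffmpeg command line; the codec and quality settings are assumptions to be adapted to whatever your systems play back comfortably.

```python
# Sketch: assembling an ffmpeg command line for a low-resolution proxy.
# The H.264/CRF settings and 960-pixel width are illustrative choices.

def proxy_command(source, proxy, width=960):
    args = [
        "ffmpeg",
        "-i", source,                     # original full-quality media
        "-vf", f"scale={width}:-2",       # scale down, preserve aspect ratio
        "-c:v", "libx264", "-crf", "23",  # compressed, widely playable
        proxy,
    ]
    return " ".join(args)

print(proxy_command("046A_tk7.mov", "046A_tk7_proxy.mp4"))
```

Wrapping the command in a function makes it trivial to loop over an entire directory of source clips and generate a proxy for each.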

Conforming

A big part of data management in post-production involves conforming it all successfully. Conforming is the process of assembling the various elements together to build the master version which is then used to produce all the different distribution masters (for instance, the DVD and digital cinema masters).

In theory, the conforming process happens once (typically after picture editing has been completed but before color grading begins), and happens automatically. In practice though, the reality of various revisions to the final edit, as well as time constraints, will mean that conforming must be done at several key points. And although much of the process can be automated, supervising the conforming of all the data normally ends up being a full-time job.

Conforming digital video is normally accomplished by matching reel numbers and timecodes (or shot numbers and takes) to an EDL produced from the final (and revised final) offline edit. The key thing to bear in mind when conforming data is that the EDL will rarely have any direct reference to the actual data to be used. For example, the EDL might point you to use 046A take 7, but this will almost certainly refer to what was shot, rather than the latest digital version, which will likely (particularly if you've been using the techniques in this book) have become something else (for example, 046A_tk7_repaired_graded_cropped) and therefore will not conform automatically. The solution will depend upon the specifics of your conforming system, but you will probably either have to substitute the source material with the fixed version, or else modify the conform EDL and other parameters to point to the new data instead. Either way requires a great deal of careful data management to ensure nothing goes awry in the process.
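The substitution problem described above can sometimes be eased with a small script that maps EDL source names to the latest versioned files on disk. This sketch assumes the suffix-accumulating naming style from the example (046A_tk7_repaired_graded_cropped), where the longest matching name is the most recent; that heuristic is an assumption for illustration, not a rule of any conforming system.

```python
# Sketch: resolving an EDL source name to the most recent working
# version on disk, assuming fix history accumulates as name suffixes.

def resolve_source(edl_name, available_files):
    """Return the file whose name starts with the EDL source name,
    preferring the longest (assumed most recent) suffix."""
    matches = [f for f in available_files if f.startswith(edl_name)]
    if not matches:
        raise LookupError(f"no media found for EDL source {edl_name}")
    return max(matches, key=len)

files = ["046A_tk7", "046A_tk7_repaired", "046A_tk7_repaired_graded_cropped"]
print(resolve_source("046A_tk7", files))  # → 046A_tk7_repaired_graded_cropped
```

Even a crude mapping like this is useful as a pre-conform audit: any EDL source that raises an error is media you need to track down before the conform begins.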

Conforming digital audio is potentially even more problematic, because there may not even be a timecode reference in some cases. However, most audio post-production is done on a single system, and so it may not even be necessary to conform it at all. In addition, a lot of the time the full-quality audio source is used directly (even at the editing stage) and so it may even be possible to output the full-quality, cut-together audio directly from the offline editing system (although this should be considered a worst-case scenario).

Collaboration

The bigger the production, the greater the need to collaborate with others. Without a doubt, the most effective tool for collaboration is the Internet, whether used as a communication method or even for scheduling and project management. There are a great number of applications that use the Internet for things like synchronization, and these can be incredibly helpful in getting organized.

Even if you're without Internet access, communication is the key to success. If people are working on data in remote locations, there should be a single reference point which is considered the master source, or else you run the risk of having a degree of uncertainty over which set of data is correct. Again, this should be combined with a good version control system for the best collaborative experience possible. In fact, one of the best methods for combining version control, off-site backup and still allowing for collaboration is to use a system such as Amazon's S3 storage service (aws.amazon.com) in conjunction with a version control system.

TIP

For the digital intermediate for Earth, we made use of the free Google Docs to log information related to the final cut and related footage in an annotated spreadsheet that was constantly synchronized between everyone using it. There was no question of whether people had access to the most up-to-date information, which simplified things a great deal. Similarly, we used Google Calendar for scheduling events, with similar results.

Non-destructive Workflows

Non-destructive, or parametric, workflows allow changes to be made without committing them (or baking them in) until the end. The basic idea is that all changes, such as color correction and scaling, are stored as metadata rather than as actual modifications to the source files. The benefit is that anything can be adjusted at any time; the trade-off is that it requires a lot of computational power to apply the changes fast enough for accurate, real-time playback.

In reality, parametric workflows are possible in small pieces. For example, within the context of a paint program, it may be possible to avoid committing any changes until the point where you render the new footage out. However, for the editing system to be able to show those changes, the sequence will need to be rendered and reloaded. In general, every time you need to use a different system, you'll probably have to render what you've done so far, at which point it gets baked in. You should therefore strive for a non-destructive approach where possible, and treat every render as a sort of milestone (or checkpoint) in the workflow (in other words, a point at which to schedule a backup).
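The store-changes-as-metadata idea can be illustrated with a toy example. Here a list of numbers stands in for pixel values; the class and method names are invented for illustration, not taken from any real system.

```python
# Toy sketch of a parametric workflow: operations are recorded as
# metadata and only applied ("baked in") when render() is called.
# The source data is never modified.

class ParametricClip:
    def __init__(self, source):
        self.source = source  # untouched original data
        self.ops = []         # pending operations (the metadata)

    def add_op(self, func, note):
        self.ops.append((func, note))  # record the change, don't apply it

    def render(self):
        """Apply every pending operation to a copy of the source."""
        data = list(self.source)
        for func, _ in self.ops:
            data = [func(v) for v in data]
        return data

clip = ParametricClip([10, 20, 30])
clip.add_op(lambda v: v * 2, "gain up")
clip.add_op(lambda v: v + 1, "lift")
print(clip.render())  # → [21, 41, 61]
print(clip.source)    # → [10, 20, 30] (original untouched)
```

Because each operation is just an entry in a list, any adjustment can be re-ordered, tweaked, or removed right up until render time, which is the whole appeal of the parametric approach.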

TIP

In some situations, it may be possible to switch applications without the need to render. For example, the Adobe suite of applications (the same goes for most of the Autodesk and Apple systems as well) allows a great deal of interchange; for instance, you can load a Photoshop document (PSD file) into After Effects, complete with layers and other properties.

QuickTime Reference Videos

One of the most useful file formats is Apple's QuickTime. Not only does it provide cross-platform support, it also caters for a wide variety of different codecs (and thus, different situations). One of the most useful (and perhaps underused) features of QuickTime is the ability to create reference movies. These are QuickTime movies that point to (reference) other QuickTime movies based upon certain criteria, such as connection speed, CPU power, and so on. In practice, the most useful application of this is creating versions for playback on different systems. For example, you could create full-resolution copies of the source footage as single frames, a smaller version for playback on a laptop, and even a version suitable for iPhones, and then a single reference movie to point to each. Depending upon what you're doing, you'll automatically reference the correct version.

TIP

A free utility for creating QuickTime reference movies is available from developer.apple.com/quicktime/quicktimeintro/tools

Image Sequence Workflows

A very useful workflow to adopt is to use image sequences. The basic idea is that rather than storing a file per clip of footage (as is the accepted method for video editing), a file per frame of footage is stored instead. This method has its roots in film production, as film scans are often stored using this system, but it can be applied to any kind of production. To use it, you simply export each frame of footage as a single image. Pay careful attention to how you name the frames. If the original source has a timecode associated with it, it's good practice to name each frame with a number derived from its timecode, so that, for example, a frame with a timecode of 01:00:00:01 (at 24 fps) becomes frame number 86401 (remember to pad the numbers with leading zeros as well, making the filename more like 0086401.tif in this case).
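The timecode-to-frame-number conversion above is easy to script. A minimal sketch, assuming non-drop-frame timecode and a constant frame rate; the seven-digit padding matches the 0086401.tif example:

```python
# Sketch: deriving padded frame filenames from a source timecode.
# Assumes non-drop-frame 'HH:MM:SS:FF' timecode at a constant rate.

def timecode_to_frame(tc, fps=24):
    """Convert 'HH:MM:SS:FF' to an absolute frame number."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

def frame_filename(tc, fps=24, ext="tif", pad=7):
    return f"{timecode_to_frame(tc, fps):0{pad}d}.{ext}"

print(frame_filename("01:00:00:01"))  # → 0086401.tif
```

Note that NTSC-style drop-frame timecode would need extra handling; the arithmetic above is only valid for non-drop-frame rates.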

There are several benefits to using a frame-based approach:

Image editing systems (such as Photoshop) can be used to make quick fixes to frames (rather than needing a dedicated video editing system).

Footage is better protected from digital corruption (damage will typically only affect a single frame, whereas corruption on a clip file can potentially destroy the entire clip).

Basic editing can be performed simply by manipulating the files, rather than having to use an editing system (for example, you can simply delete frames to shorten a sequence).

You can render individual frames in order to fix problems across a frame range, rather than having to render out an entire clip each time.

There are some drawbacks to this approach though:

The overall volume of data increases dramatically.

Some systems cannot handle playback of image sequences properly, and even those that can handle playback may suffer some sort of performance hit (you can get around these limitations by using proxies or QuickTime reference movies to some extent though).

Managing the data becomes much more complicated.
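To put the data-volume drawback in perspective, here is some rough arithmetic. It assumes 2K film scans stored as 10-bit DPX, where three 10-bit channels are packed into a 32-bit word (4 bytes per pixel), and it ignores header overhead; the 90-minute running time is an illustrative assumption.

```python
# Rough sketch of why image sequences multiply data volume.
# Assumes 2K 10-bit DPX: 3 x 10-bit channels packed into 32 bits/pixel.

WIDTH, HEIGHT = 2048, 1556
BYTES_PER_PIXEL = 4
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL

fps, minutes = 24, 90                 # a feature-length production
total_frames = fps * 60 * minutes
total_tb = frame_bytes * total_frames / 1e12

print(f"{frame_bytes / 1e6:.1f} MB per frame")
print(f"{total_frames} frames, ~{total_tb:.1f} TB uncompressed")
```

At roughly 12.7 MB per frame and over a hundred thousand frames, even a single pass of the footage runs well into terabytes, before counting intermediate versions and backups.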

Security

The greatest strength of digital media can also be its greatest weakness. Working digitally means you can get a perfect copy every time with ease. With video and audio tape, each subsequent copy would suffer degradation, as the signal would suffer from generation loss, with the signal-to-noise ratio decreasing (in other words, more noise being introduced) on each copy. With film, an equivalent effect applies to each print run of a reel of film, with the added complication that film requires highly specialized equipment and expertise in order to make a copy.

What this means is that it used to be really hard to make a good copy, which, in post-production, is a bad thing. But in terms of preventing a production from being counterfeited, it was actually a good thing. Anyone trying to duplicate a movie and pass it off as the real thing was at a severe disadvantage, as the copies they would produce would almost always be completely inferior to the original, and therefore easily recognized as a fake.

Now however, the situation is markedly different. It's so easy to make a full-quality duplicate of an entire production (in some cases, literally as easy as clicking a button), that protecting the integrity (and investment) of digital assets is more important than ever.

What to Protect

The preventative measures you need to employ (and the level of paranoia you will need to adopt) will depend upon the type of production you're working on. A corporate video will have very different security requirements than an unreleased Hollywood soon-to-be-blockbuster.

The first aspect to consider is the type of elements you need to protect. Consider each of the following: source footage; project files; intermediate files; proxies; audio elements; still images; preproduction material (digital versions of scripts, briefs, storyboards, etc.); and final masters.

What you'll probably find is that certain things, such as EDLs, will never need to be protected from intruders, because without the context of the footage to go with them, they are largely meaningless. However, some EDLs carry information related to specific shots in their comments sections, and so a prying intruder (with a lot of time on their hands) could piece together the sequence of events of a story being kept under wraps; so if you're being really paranoid, it pays to consider every possibility.

The next aspect to consider is how an unauthorized person may gain access to any of these elements: Interception of physical media (for example, getting hold of a DVD sent in the mail); interception of electronic transmission (for example, intercepting emails or satellite feeds); remote access to computer systems (for example, hacking into a computer terminal via the web); and local access to computer systems (for example, having access to a workstation holding digital media).

TIP

There is another aspect of security to consider, which is that intruders may try to cause destruction rather than make copies. It's not unusual for a hacker to break into a system purely to spread a virus, or in some way corrupt or ruin data. Although this can be resolved through a thorough backup policy, the ideal case is to prevent them from doing it in the first place.

How to Protect It

The concept of needing to protect digital assets is by no means unique to the media industry. It's a challenge that is faced by a large number of industries, many of which have a much more critical interest in guaranteeing security (financial and medical institutions spring to mind). The result of this is that there are several different methodologies that can be adopted for use.

Password-Protection

Certain file formats may include a password-protection feature that can prevent access to the contents of the file without supplying the correct password to “unlock” them first. For example, the PDF format inherently supports this type of approach, and so it can be worth routinely password-protecting any files that need to be sent out where there is a danger they could fall into the wrong hands. The benefit of this method is that it can be applied fairly easily, without much technical expertise. However, it may prove unworkable (you wouldn't want to have to supply a password every time you try to access a certain file if you're going to be using it many times in a day). The other problem is that this type of security is often not particularly secure, and is therefore more of a deterrent to the average computer user than to someone who actively wants to find out what's inside a locked file.

Encryption

A much more sophisticated approach is to use a form of encryption. Encryption can work in a number of ways, but essentially the data to be protected is scrambled in some way so that it becomes meaningless until it is decrypted. Typically, encryption and decryption require the use of a digital key of some sort, which is usually in the form of a password (potentially a very long, unmemorable one), but it could also be in the form of a piece of hardware. In very sophisticated cases, the key will be rolling, changing over a period of time. Some financial institutions use this method, providing you with a fob which displays the correct key at the press of a button.

Encryption can apply to stored data or transmitted data. For example, you can encrypt an entire disk drive, part of a disk, or even a single file. There are various free utilities for creating encrypted virtual disks (such as the cross-platform TrueCrypt, www.truecrypt.org), which allow you to encrypt any number of files that all become unlocked at once. Encrypting transmitted data is a little more complex, but in many cases, you may already be using encrypted transmission without realizing it. For instance, certain website transactions (typically those at addresses that begin with https) are usually encrypted automatically, protecting them from prying eyes. There are other technologies, such as virtual private networking (VPN) and secure sockets layer (SSL), built into many applications, that provide a form of encrypted transmission as part of what they do.
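As a small illustration of the digital-key idea, the sketch below derives a fixed-length key from a password using PBKDF2, which is available in Python's standard hashlib module. A real workflow would hand this key to an actual cipher (via a dedicated encryption library); the salt handling and iteration count here are illustrative assumptions.

```python
# Sketch: stretching a password into a fixed-length encryption key with
# PBKDF2-HMAC-SHA256 (stdlib). The iteration count is an illustrative
# assumption; real deployments tune it to current hardware.

import hashlib
import os

def derive_key(password, salt, iterations=200_000, length=32):
    """Derive a `length`-byte key from a password and salt."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                               iterations, dklen=length)

salt = os.urandom(16)  # random; stored alongside the encrypted data
key = derive_key("correct horse battery staple", salt)
print(len(key))  # → 32
```

The salt ensures that two users with the same password end up with different keys, and the high iteration count is what makes brute-force guessing expensive.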

TIP

As a final note, it's worth pointing out that there is no substitute for good, structured security policies. If you manage a large facility with frequent visitors, it should go without saying that you set up a system whereby no one can gain access to anything they are not authorized to use.
