CHAPTER 9

The Final Product: Postproduction

Everything you can imagine is real.

Pablo Picasso

THIS CHAPTER'S TALKING POINTS

I. The Producer's Role
II. The Editor's Role
III. The Sound Designer's Role
IV. Delivering the Final Product

First film, then video, and now, new media—each has a language with its own vocabulary, tools, and sets of rules. Your vocabulary is the picture, and your sentence structure is the edit. You're telling your story with images and the juxtaposition of those images, fleshed out with underlying aural impressions and the nuance of graphic design. It's in postproduction that the final story comes together.

I. THE PRODUCER'S ROLE

Postproduction can be the least understood aspect of the producer's domain. Before starting postproduction, you want to study any shortcuts that can make the postproduction process more creative and efficient. The producer's job is to know as much as possible about everyone else's job—what they do, the tools of their trade, their rates, the facilities in which they work, and the subtleties of their art. You want to find the most talented and highly qualified people who can work within the limits of your budget. The more prepared you are when you begin postproduction, the faster the process will go.

The Postproduction Supervisor

In producing for TV and new media, the producer usually supervises the project from beginning to end, including the entire postproduction process. A more complex project might require a postproduction supervisor who acts as the producer of postproduction. She works closely with the producer to maintain the vision of the project, and supervises all phases of postproduction including editing, mixing, graphics design, final composite, and delivery of the final master to the end user, client, or broadcaster.

The postproduction supervisor (or the producer, or assigned PA) keeps track of:

•  All the footage that has been shot, as well as all the numbered and organized tape reels or storage devices, screening logs, dubs, and other log sheets

•  Other visual images, such as stock footage, archival footage, animation, graphics, artwork, and copies of any related legal release forms

•  All audio elements, such as dialogue, background audio, special effects, original and/or stock music, and cue sheets

So whether you are hiring a postproduction supervisor or you're the one in charge, the following guidelines can help you navigate the postproduction process.

Postproduction Guidelines

1. Triple-Check Your Legal Documents

The need to legally protect your project never stops. It continues way past postproduction, so as you put together all the elements you'll need for your edit and mix, be sure that all your legal ducks are in a row. Do you have clearances for all the music you'll use in the mix? Signed releases for stock footage?

When I'm drafting a contract, after I'm finished, I will always go back over it, not just for each word, but I have a checklist, too: did I do this and did I do this? I make sure that I haven't left anything out.

J. Stephen Sheppard, excerpt from interview in Chapter 11

2. Spend Money to Save Money

You want an editor who has a keen sense of storytelling. You want him to be familiar with the best editing system for your project, someone who has kept up with its technical compatibilities and system nuances. He must be able to deliver a color-corrected, audio-balanced final product that's technically up to specification. You want him, ideally, to have experience in cutting a show that's similar to yours, and you want him to be the kind of guy with whom you can spend long hours or days at a time.

Most editors and editing facilities have demo reels or web sites showing their work. If you like what you see, try to meet with the editor before you start shooting. Take a tour of her editing facility. Ask her what you can do to make her job easier. Compare notes on your shooting formats and their compatibility with her editing equipment.

You can save money and time in postproduction when you:

•  Organize your tapes or storage devices and location logs

•  Screen and log your footage

•  Organize editing elements, including footage, audio, and graphics

•  Write a paper cut for the edit session

Let's say… that your best friend has an excellent computer and the latest upgrade of a popular editing software program. He's offered to edit your project—for free. This sounds good—great, in fact—since your budget is really low. Yet, you're torn. You know this project could be good enough to be your calling card, but you're just not sure if your friend shares your vision, or has the right skills to showcase that vision. Sure, the editing is free, but if it's not cut right the first time, you'll have to edit it again, wasting time and money. In the long run, you can save the project by editing with a seasoned editor on a professional-quality system. And because editing software is cheap enough to buy, or at least, to rent, you (or your friend) can do a rough cut, then hand it over to a pro who can really make it come alive.

3. Organize the Components for the Edit

It isn't unusual for a producer to shoot 20 hours of footage, or 50, or 100, for just a one-hour program, especially for reality shows or documentaries. This requires an organizational system during the actual shoot that keeps track of where that footage is stored. It helps when you label each tape cassette (or disk, memory card, etc.), including:

•  The tape number (Tape 1, Tape 2, etc.)

•  The location where it was shot (Studio B, Central Park, etc.)

•  The date of the shoot

•  The audio tracks (Track 1 is the lav, Track 2 the boom, etc.)

•  The camera it was shot with (Camera 1, Camera 2, etc.) in multicamera shoots

When labeling your tapes or memory cards, design an easy system for naming each one. For instance, if you shot in Central Park and used nine tapes, you might label them CP01, CP02, and so on, to CP09. In a studio setting with several cameras, match the camera number with a tape number.

Say you're shooting The Jane Smith Show. You use two cameras and change the tapes three times. Your tapes can be labeled JS0101 (Jane Smith, Tape 1, Camera 1), JS0102 (Jane Smith, Tape 1, Camera 2), JS0201 (Jane Smith, Tape 2, Camera 1), and so on, through the subsequent tapes.
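If you're comfortable with a little scripting, a naming scheme like this can be generated automatically before the shoot. Below is a minimal Python sketch, assuming the hypothetical "JS" show prefix and the two-camera, three-tape example above; the function name and label format are illustrative, not an industry standard.

```python
# Minimal sketch: generate tape labels such as JS0101 (show "JS", Tape 1, Camera 1).
# The prefix, tape count, and camera count are assumptions taken from the example above.

def tape_labels(prefix: str, tapes: int, cameras: int) -> list[str]:
    """Return labels in the form <prefix><tape, 2 digits><camera, 2 digits>."""
    return [
        f"{prefix}{tape:02d}{camera:02d}"
        for tape in range(1, tapes + 1)
        for camera in range(1, cameras + 1)
    ]

if __name__ == "__main__":
    # Two cameras, three tape changes -> JS0101, JS0102, JS0201, JS0202, JS0301, JS0302
    print(tape_labels("JS", tapes=3, cameras=2))
```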

This may seem unnecessarily obsessive or too time-consuming. But ask any editor and she'll tell you how much she appreciates this kind of organization, and how relieved she is not to spend extra time hunting for shots. Whatever your system, organization like this will pay off a hundred-fold over your producing career.

Tape Log

The producer keeps track of the footage that's been shot in a tape log. Whether the footage has been shot on tape, on P2 cards, in 2K/4K files, or another storage format, you want to know where everything you've shot is located. During the edit and mix, for example, you may find that you're missing a cutaway shot or need to replace a shot you thought would work but didn't. The tape log provides a fast way to find your footage. You'll find an example of a tape log on this book's web site.

Film-to-Tape

Any footage shot on film (Super 8, Super 16, 16 mm, or 35 mm) must first be transferred to digital video before it can be edited in a nonlinear editing (NLE) system. Film-to-tape transfer is a complicated and costly procedure in which the film is converted to video via a telecine machine, also called a film chain, that scans each frame and converts it into a video signal.

During the film-to-tape transfer process, the image and production sound are transferred, and if needed, the film can also be color-corrected. Some producers like to color-correct their footage as they transfer it, adjusting for color and contrast; others prefer to wait until the rough cut is approved, correcting just the approved footage in the final edit. Increasingly, producers color correct their footage in the NLE room, using a program like Final Cut Pro to give the project the right look.

Sometimes, complications can arise from the difference in frame rates between film (24 fps) and video (30 fps) as well as in audio syncing. If you plan to shoot in film and transfer it to video, discuss the film-to-tape process with the editor, and research the resources available on the subject.

Tape-to-Film

Producers occasionally need a 35-mm film print of their video work. Regardless of what format it was shot on, if it's been edited and mixed in an NLE system, the master can then be transferred to a 35-mm film print. However, this process is expensive, so research it carefully. Some 4,000 cinemas across the United States now feature digital projection, and increasingly, more festivals project both film and digital video.

Alternative Sources: Stock and Archival Footage

Using high-quality stock footage not only saves you money, but also can add real nuance and production value to your vision.

Let's say… that you want an opening establishing shot of Los Angeles, a glittering nighttime L.A. skyline from an airplane. By the time you hire a helicopter and a pilot, a camera operator and equipment, arrange for shooting permits and insurance, and pray for good weather, you've spent a small fortune. Enter stock footage. Stock footage can be a godsend to a producer's budget.

For considerably less money, you can buy the exact shot you want. Stock footage facilities license a wide range of footage that's been shot all over the globe by professionals who sell the clearance rights to producers. This footage is high quality: often shot on 35-mm film and transferred to video, shot in high definition, or captured as 2K or 4K digital files.

The choice of shots is limitless, covering almost any scene description, ranging from field workers in Vietnam to time-lapse footage of Tokyo at night. You can get exotic flamingoes in flight, migrating Monarch butterflies, vintage cars, sleeping babies, couples in love, beaches at sundown. Stock footage can save you considerable time and money, as well as lend real quality to your final product.

Stock Footage Search

Finding stock footage is relatively simple. Go online, and search “stock footage” facilities. There are dozens, offering a range of footage genres. In most cases, you can see all their footage online—it'll be watermarked in some way so it can't be stolen. After you've made your choices, and picked the footage you want, you'll then negotiate a fee for the rights to use it in your project.

Stock Footage Fees

The fee for using stock footage depends on several factors. If, for example, you wanted to open your show with a shot of the British Museum, the fee for buying that shot depends on how you plan to use it. If your show is airing on a major network or cable channel, the fee is much higher than if the museum shot is in an educational piece with limited distribution. If it is part of a one-time, nonbroadcast project like a training film for an organization, or being used in an industrial film, the fee is more negotiable and usually lower. If you are buying the exclusive rights to use the shot, the cost is even higher.

Other factors that influence the license fee are:

•  The amount of time for which you want the rights (usually from two years to perpetuity)

•  The territories (just your country, a few other countries, or worldwide)

•  Any special advertising or promotional uses

•  The total number of runs (broadcasts)

•  Use in new media formats (webcasts, mobisodes, etc.)

You also want to double-check the clearances on any copyrights and trademarks. In some cases, stock footage may also require you to obtain releases from any talent or people on screen. Music or narration that is mixed into footage also needs to be cleared. You'll find more information on legal clearances in Chapter 5.

Archival Footage

There are hundreds of sources for footage that has an historical context. The archivists will research, gather, and/or clear the rights for historical footage. They may also transfer it from film to video in whatever format you'll need. As with stock footage, fees vary and depend on the use. Often, the rights to the footage may be negotiable, but the underlying music isn't cleared. You want clearances for both the images and the music.

Let's say… that you're making a documentary on the history of the Manhattan skyline. You want to contrast the current skyline—you've already got the perfect shot—with the skyline of the city as it looked a hundred years ago. You've found an archival researcher who shows you old photographs, etchings, and a range of skyline images shot in 35- or 16-mm film. She's also found still photographs, magazine covers, old newsreels, and a range of in-depth material that brings an extra texture to your project. She's an expert in finding and clearing the rights, and looks for any good material that's in the public domain and free. Although her fees aren't cheap, she's brought a dimension and credibility to your project that's priceless.

Public Domain Footage

When the copyright has elapsed on footage (or a book or picture or painting or music), it is no longer owned by anyone and its rights are in the public domain (PD). You can use it freely, without paying for clearances or royalty fees. Many private companies find and resell PD footage, and will charge you fees for their research and duplication.

In the United States, you can access any footage that has been shot by almost any agency of the U.S. government using taxpayers’ money, with no clearances or royalty payments. The National Archives, the Library of Congress, the Smithsonian, and NASA, for example, all offer vast footage collections that are available to the public; the only costs to you are for duplication. You can also consult with an independent archival footage expert who can help you locate options from within these vast collections. Other countries have similar sources for footage that is either inexpensive to clear, or it's free.

4. Screen and Log Your Footage

The use of nonlinear editing systems has revolutionized the postproduction process. This is good news for producers who want to be creative while staying within their budget. Because of the ease of NLE, it can be tempting for producers to bring all their tapes into the editing room and screen them as they download, rather than screening them prior to the edit session. This turns the creative process into an expensive administrative swamp that can be time-consuming as well as frustrating for the editor.

The cost of storage drives is going down, and the storage drives are getting bigger, holding more data. But it's still a waste of time to load all your footage when you know you're using only a small portion for your edit. You may have shot 30 hours of footage for a one-hour piece, and your selections, or selects, may boil down to less than three hours.

A cost-effective method of preparing for editing is to screen and log your footage before the edit session. From these log notes, you can construct a paper cut, or an editing storyboard. It's like a shooting script for your editor and the sound designer; it gives them a clear outline of what scenes appear in what order, and where each shot can be found. The paper cut lists time code (TC) locations and descriptions of selected edits, as well as notes about graphics and audio, and the order in which footage appears in the script. You'll find a storyboard template on this book's web site.

Ideally, you want to transfer your footage for screening to DVD (or an FTP site) with matching time code. This means that the TC on your original footage is exactly the same as on your screening copy. It's called visible time code, also vizcode or VTC, and it is displayed in a small box at the bottom or top of the screen.

IN THE TRENCHES…

It's so much better—for me—to screen and log my footage ahead of the actual edit session. I want to look at it objectively and play around with it in my head, then log it and eventually create a script from all the many shots coming together in my head. To make this script work for the editor, I need to know that my screening dubs have exactly the same time code as the masters from which they were copied—otherwise, my notes won't match. Making sure I've got something as basic as matching time code saves time, a lot of money, and my editor's blood pressure.


Screening Log

As you saw in Chapter 8, TC has eight digits. For example, 01:03:16:22 is the same as one hour, three minutes, 16 seconds, and 22 frames (the exact address of that frame).

As you screen your footage, log it by taking notes of each pertinent shot and its TC number. Let's say that you're screening Tape 1. You like a specific shot where the actor picks up a cup of coffee, sees a letter on the bed, reads it, then angrily throws his cup against a wall. The action starts at 01:03:16:00 (one hour, three minutes, 16 seconds) and ends 10 seconds later (at 01:03:26:00). Your log notes on this scene might look something like this:

Tape #   TC in         Scene description                                 TC out
1        01:03:16:00   (MS) Tom picks up cup, reads letter, throws cup   01:03:26:00

Your screening log details the tape number, the TC numbers for the in-point and the out-point of the scene, the shot's angle (MS, etc.), and a brief description of the scene. If you're logging dialogue, either scripted or unscripted, you might type each word verbatim for an exact transcription. Or, type just the key words and mark irrelevant sections with an ellipsis (…). Often, the tapes are transcribed by a professional transcriber who makes a note of the TC at regular intervals, usually every 30 to 60 seconds.
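Because TC is just hours, minutes, seconds, and frames, shot durations are easy to compute from your log. Here is a minimal Python sketch, assuming a 30 fps nondrop-frame rate for simplicity (see the Drop Frame versus Nondrop Frame discussion later in this chapter); the function names are illustrative.

```python
# Minimal sketch: convert "HH:MM:SS:FF" time code to a frame count and back,
# assuming a 30 fps nondrop-frame rate for simplicity.

FPS = 30

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

def frames_to_tc(total: int, fps: int = FPS) -> str:
    frames = total % fps
    seconds = (total // fps) % 60
    minutes = (total // (fps * 60)) % 60
    hours = total // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

if __name__ == "__main__":
    # Tom's shot from the screening log above: 01:03:16:00 in, 01:03:26:00 out -> 10 seconds
    duration = tc_to_frames("01:03:26:00") - tc_to_frames("01:03:16:00")
    print(duration, "frames =", frames_to_tc(duration))  # 300 frames = 00:00:10:00
```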

A number of logging software programs are available, such as The Executive Producer and MediaFiler, along with free logging tools built into Final Cut Pro and Avid. These programs can cut down your screening time and provide notes for the editor during the digitizing process. Because downloading must be done in real time, transferring your logging notes to the NLE along with the footage saves time. Another choice is to screen and log by hand, using the simple forms found on this book's web site, such as the screening log or the storyboard.

The Scope of the Log

Footage is just one of several elements in your project. Other elements include the audio, animation, music, and graphics. As you did with your footage, keep a log for every element, and distribute copies to anyone involved in postproduction, such as the postproduction supervisor, production assistants, the editor, graphics designer, and/or the sound designer.

Your log sheet might include any of these elements:

•  Studio or location footage (tape or drive numbers, dates, locations, etc.)

•  Stock footage (footage that's been professionally shot for resale, like helicopter shots or time-lapse photography)

•  Archival footage (historical footage or photographs)

•  Graphics (opening titles, closing credits, lower thirds)

•  Animation (animated insert segments)

•  Audio tracks (anything recorded on the footage audio tracks)

•  Additional audio components (music, stings, needle drops, special effects, ADR)

5. Write a Paper Cut

Not every producer has the ability to “visualize” what shots cut well with other shots. But you know what the primary scenes are, and their sequence in the script. Because you've most likely shot your footage out of order from how it appears in the script, you'll include all the reel numbers and TCs on your paper cut, in the order in which they'll appear in the final edited product.

IN THE TRENCHES…

In my experience of working with editors, the ideal editing paper cut could arguably be called an 80/20 paper cut. This means that I've screened the footage and come into the edit room with a paper cut that provides details for roughly 80 percent of my key scenes, cutaways, and the sequence in which they'll appear in the final edited product. The paper cut also includes tape numbers, TC, and scene descriptions. The extra 20 percent? That represents extra leeway that the editor can take, making creative and technical choices beyond the written paper cut that enhance the piece and give the editor a chance to add his or her unique signature.


After you've transferred your footage for screening, then screened and logged it, and written a paper cut, you're ready to edit and mix your project.

Remember the earlier scene in which Tom picks up his coffee cup, sees the letter, reads it, and angrily hurls the cup against a wall? Although the shots were taken at different times, on different tapes, and maybe at different locations, they all cut together as one smooth sequence. The finished paper cut might look like this:

Tape #   TC in      Scene description                       TC out
Tape 1   01:03:16   WS >> MS Tom picks up cup, reads note   01:03:20
Tape 3   03:10:04   CU the note                             03:10:08
Tape 2   02:20:25   CU Tom's reaction to note               02:20:30
Tape 1   01:03:21   MS Tom throws coffee cup                01:03:26
In writing your paper cut, you'll find these terms helpful:

Shot. A single uninterrupted videotaped segment that is the primary element of a scene.

Scene. A dramatic or comedic piece consisting of one or more shots. Generally, a scene takes place in one time period, involves the same characters, and is in the same setting.

Sequence. A progression of one or more scenes connected by a shared narrative momentum and emotional energy.

II. THE EDITOR'S ROLE

When you are editing, the final master is Aristotle and his Poetics. You might have a terrific episode, but if people are falling out because there are just too many elements in it, you have to begin to get rid of things.

Ken Burns

An editor can be a creative magician, a technical consultant, and an effective arbiter of what works and what doesn't. Each editor has her own strengths and styles of cutting. One editor has an ideal style for MTV, and another editor knowledgeably cuts documentaries for the BBC. An editor might specialize in sports or news, sitcoms, movies-of-the-week, online content, commercials, music videos. And then there are those few editors who can cut almost anything.

An experienced editor can take disparate shots and elements and weave them together, creating a seamless flow. As a creative artist, he can “paint” a mood with pacing, place a perspective on the action, and signal conflict or comedy. A technically adept editor can design special effects or transitions between scenes, color-correct the footage, and make sure your project conforms to broadcast standards. Often, he can “fix it in post,” covering up mistakes or finding solutions to seemingly impossible problems that inevitably pop up in everyone's project.

Working with an Editor

I first ask producers for notes and scripts. If I'm lucky, they'll have those, but more and more producers seem to think that editors wave a magic wand over hours worth of footage, and only the good stuff comes up. I remind them that if we first need to screen, log, and digest the material, and then make an insightful and coherent movie, it's going to take time. For every one hour of footage, it takes at least two or three hours to view, log, and highlight it. You then need to knock this down into a script with some kind of theme, and only then can you start to edit.

Jeffrey McLaughlin, excerpt from interview in Chapter 11

Producers come into an editing room with varying levels of experience in postproduction. One producer may have spent hundreds of hours editing and mixing; another has only limited exposure or expertise. Some producers don't have the luxury of extra time, or the foresight to screen their footage, before the edit session. They'll hand over hours of their unscreened footage to the editor, and expect her to work miracles without any script or direction.

The producer's role with the editor is highly collaborative. You want to give the editor specific targets for the project, and you also want to create an environment in which the work can get done. When you're in the edit room, the editor needs to concentrate, so keep phone calls and distracting conversations to a minimum. Discourage people from crowding into his space. When possible, encourage creative leeway with different shots or new ideas. Make sure he gets a genuine “thank you” along with plenty of food, water, and coffee during the edit sessions. The editor is one of your most valuable team members.

With the user-friendly, inexpensive, creative, and evolving NLE systems, like Final Cut Pro, Avid, or iMovie, an entire project can be edited on a laptop. You can now build a functional edit room in your bedroom, or do a rough cut on an airplane. Many producers do their own rough cut first, working out some of the more obvious problems, then bring that rough cut for the editor to fine tune and take to the next level.

But not every producer has the technical savvy or creative eye to be a good editor. You want your project to reflect your vision, to adhere to all broadcast standards so it can be aired or adapted to other platforms, and to have the technical capacity to be dubbed with no loss of generation, or quality. So, how do you find an editor who can satisfy these objectives?

There are dozens of web sites, phone listings, and television industry directories that list professional editors and editing facilities in various countries, regions, and cities. Visit their web sites and when possible, check out their facility. Meet them, screen the editor's reels, and discuss what you need for editing your project. You can also:

•  Talk to other producers, directors, and writers about editors they've worked with.

•  Call regional or local television stations who may “hire out” their editors and facilities for outside work. If not, ask if they can recommend local freelance editors and/or facilities.

•  Check with local high schools and colleges that have editing equipment for their students. Often, their student editors can be hired for low-budget projects, or can work for academic credit.

Working with Editing Technology

One of an editor's golden assets in an edit room is that he was never part of the production process. He wasn't shooting the film for four weeks and feeling the “magic” of that process. The war stories the crew told about shooting in the midst of a hurricane, or when half the staff came down with food poisoning—they mean nothing to the editor. He only sees the dailies, and the only “magic” he feels is what comes out of those dailies. If a shot works, he will use it, but if it doesn't, he can easily let it go. The pain, the love, or the cost of any one element means nothing if it doesn't work in the edit. No one shot is more important than the whole of the film.

Jeffrey McLaughlin, excerpt from interview in Chapter 11

The rapid evolution of postproduction technology has brought editing, sound mixing, and graphics into the digital domain. These advances have expanded the producer's horizons. From prime time broadcast to art gallery installations, from educational teaching tools to high-end commercials, from user-generated content to slick online sites, the creative possibilities of today's digital tools seem limitless.

Yet the learning curve can be steep. The choices seem endless. The terminology is confusing, and everyone has an opinion. Every six months, new equipment and software floods the marketplace; a system that is state-of-the-art this year is either upgraded or replaced next year.

Nonlinear Editing

Fortunately, there are consistencies among these systems. As you research the right editor for your project, also look at the range of digital nonlinear editing (NLE) systems that conform to professional, broadcast-quality standards. These systems work on the same basic principle as editing on film: with an NLE system, pieces of footage can be digitally “spliced” together out of order, just as in film editing.

Film editing has always been nonlinear, done with tape and scissors, its pieces cut and taped together by hand. Before nonlinear editing, video editing was linear—electronically edited in an “always moving forward” direction. An editor could start only at the beginning and work toward the end because of the nature of electronic recording. The traditional way of editing video was to edit in the chronological, or linear, order in which shots appeared in the piece.

Now, editing with digital equipment is done in a cut-and-paste mode, just as with film, except it's edited electronically rather than manually. The popular NLE systems—Final Cut Pro, Avid Xpress and Media Composer, Adobe Premiere Pro, and iMovie—all work on similar principles. Once you learn one system, it's only a matter of nuance to find the right buttons in the right place on another system. Final Cut Pro and Avid are the systems currently used by most professionals. They offer high-quality options for finishing, are updated consistently, and support more plug-ins. Because these systems are now the pervasive editing modes, we'll be concentrating only on this method of editing and its technology.

If there is one downside to editing on NLE systems, it is the tendency to shoot more footage than is really needed, and to make decisions about your footage in the edit room. This one factor can result in spending valuable time deciding between Take 3 and Take 14 in the editing room, rather than prescreening it. This often translates to spending more money than you budgeted.

Let's say… you shot your one-hour talk show with six cameras in a multicamera setup. The show ended up being just 23 seconds too long. But you don't want to download and digitize six hours of footage plus the line cut that was live-switched at the time. That would mean downloading—in real time—seven hours of footage, just to cut out 23 seconds. In a linear room, often called an online room, you're working directly with the tapes themselves, and the job will take about an hour. There aren't as many linear rooms as there used to be, but they will always have their functions as an alternative to NLE. Nonlinear editing isn't always the best editing choice.

Digital versus Analog

Once a subject for lively debate, the topic of digital vs. analog is now essentially a non-issue. Outside of BetaCam SP and Hi8, all professional-quality cameras now shoot a digital signal. Some producers in news or unscripted programming still shoot in Beta—it's a tried-and-true standard. It can be downloaded into an NLE via a component signal, or through a Digibeta deck with an analog board that can process the analog signal to a component digital path.

On the other end of the technological spectrum are newer cameras like the Red One. It doesn't use tape or a disk, but records image and audio onto digital files—in this case, files up to 4K resolution. Talk with your DP and editor about your shooting options and how they translate into the editing process. And always back up your digital files. Always.

Compression

Compression relates to digital video, and simply means that the video signal is compressed to reduce the need for extra storage, as well as transmission space and costs. Compression techniques involve removing redundant data, or data that is less critical to the viewer's eye. The more the digital signal is compressed, the more distorted the image's details. You can see this effect in pirated copies of DVDs when the picture dissolves or fades to black—the sharpness of the image disintegrates and the pixels become larger. You can see this same effect on your NLE at a low resolution, also called low rez.

At what compression rate should you load your video into the NLE? The decision depends on the number of hours you have shot and the disk space you have available. You can load in at the lowest rate of 1:1 (uncompressed), which gives the highest resolution (best-quality picture), or at rates as high as 40:1 (the poorest-quality picture). Downloading at 40:1 lets you store 40 times more dailies and footage, but the quality suffers. If you download at 1:1 or 2:1, your footage is high resolution (high rez) and doesn't need to be conformed later.
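To get a feel for the trade-off, it helps to rough out the storage math. The sketch below is a minimal Python example; the 20 megabytes-per-second figure for uncompressed standard definition video is an assumption used only for illustration, so substitute the actual data rate of your format.

```python
# Minimal sketch: rough storage estimate for digitized footage at a given compression ratio.
# The ~20 MB/s figure for uncompressed SD video is an assumption for illustration only.

UNCOMPRESSED_MB_PER_SEC = 20  # assumed data rate; check your format's real rate

def storage_gb(hours_of_footage: float, compression_ratio: float) -> float:
    """Approximate disk space in gigabytes for a given ratio (1 = uncompressed, 40 = 40:1)."""
    seconds = hours_of_footage * 3600
    megabytes = seconds * UNCOMPRESSED_MB_PER_SEC / compression_ratio
    return megabytes / 1024

if __name__ == "__main__":
    for ratio in (1, 2, 10, 40):
        print(f"{ratio}:1 -> {storage_gb(30, ratio):,.0f} GB for 30 hours of footage")
```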

Drop Frame versus Nondrop Frame

During the shoot, the DP or camera operator might ask if the footage needs to be shot with a TC setting that's either drop frame (DF) or nondrop frame (NDF). Because video runs at 29.97 fps and not 30 fps, nondrop frame TC drifts from clock time by 0.03 frame every second. By the end of a one-hour show, there are 3.6 extra seconds to account for. Broadcasters demand an exact program length, so a 60-minute program is usually delivered in DF, because it's exactly 60 minutes long and the show's timings are in real time.

Why work in NDF? If your show doesn't have to be frame-accurate or an exact length, it's easier for the editor to work with graphics and match edits in NDF because every frame has a sequential number. Some edit systems encounter problems in dealing with both DF and NDF simultaneously, though with the advent of HD and its different frame rates, most NLE systems can now easily make the necessary adjustments.
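The arithmetic behind that 3.6-second figure is simple enough to check yourself. Here is a minimal Python sketch of the calculation; it only restates the numbers above.

```python
# Minimal sketch: why nondrop-frame TC drifts by about 3.6 seconds over an hour.
# NDF time code labels 30 frames per second, but NTSC video actually plays 29.97 frames per second.

labeled_fps = 30.0   # frames counted per second of nondrop-frame time code
actual_fps = 29.97   # frames actually played per second of real time

frames_in_one_tc_hour = labeled_fps * 3600            # 108,000 frames labeled as "one hour"
real_seconds_to_play_them = frames_in_one_tc_hour / actual_fps

drift = real_seconds_to_play_them - 3600
print(f"A one-hour NDF program runs about {drift:.1f} seconds long")  # ~3.6 seconds
```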

The Steps in Editing

There are several steps you'll follow if you're editing with a nonlinear system, as described next.

1. Download and Store Footage

Before you begin editing, your footage must first be transferred, or downloaded, into the NLE. There it is digitized. The downloaded tapes are digitized in real time—it takes eight hours to digitize eight hours of footage, so build digitizing time and costs into your budget. When it's downloaded, it's converted into a digital file that can be read by editing software, such as Final Cut Pro or Avid.

More on Downloading and Digitizing

The editor of your project is not always the person who does the digitizing; often, it's done overnight at a lower rate by a dubber on the night staff. As it is being digitized, perhaps you and/or the editor can categorize the footage with recognizable information like tape numbers, time codes, and scene descriptions, and store everything in computer folders or bins. Producers often designate only certain segments or portions of tapes to digitize, called selects, so they don't take up storage space for footage they won't use. This is an area in which a good logging program is an invaluable tool.

The amount of storage available to the NLE is a real consideration if you've got excessive footage, complicated audio components, animation, or graphics. But for most projects, storage isn't a problem with the drives currently available. Only a few years ago, the biggest drives available were one-gigabyte drives that sold for $10,000. Today, a 500-gig portable hard drive costs a tiny fraction of that, and major manufacturers have promised 4-terabyte drives at prices that fit most production budgets.

This luxury of digital storage no longer limits the producer to loading footage and editing in low resolution. You can now cut in high rez, which looks much better than low rez, get client comments, and do any revisions in high rez as well.

FireWire

Initiated by Apple Computer, FireWire is also known as IEEE-1394 and is a standard communications protocol for high-speed, short-distance data transfer. Sony's version is iLink, Avid uses this protocol in its Adrenaline series, and Panasonic calls it DV. Think of it as a transfer pipe that receives and stores data in its native compression/decompression scheme, or codec.

In theory, FireWire is a “lossless” way to digitize footage directly into an NLE, and it's currently considered the most efficient way to load editing components. Though FireWire initially worked only with DV, it is now capable of working with uncompressed standard definition video, with data transfer as high as 65 mps. FireWire allows you to transfer video to and from your hard drive without paying the higher costs of JPEG compression, or buying NLE software or banks of RAID-striped hard drives. It also deals well with artifacts.

2. Make the First Rough Cut

After all the footage, audio, and graphic elements have been loaded into the NLE, the editor cuts together her first rough cut—a basic edit. It forms the core of your finished piece, and reflects all the basic editing decisions. Over time, and as part of the creative editing process, this rough cut changes and evolves, but it's this first cut that shapes the project.

Some editors refer to the rough cut as a radio edit or an A-roll edit. This describes the process of first laying down all the sound bites, with video, and listening to it as much as watching it. This helps make sense of the project's narrative viewpoint and its pace.

The next step is to make it visually interesting by editing in all the video footage. But each project is unique, and it dictates its own approach to the rough cut. In a music video, the editor first lays the music down and then cuts the footage to synchronize with the musical beats.

In some programs, the narration is laid down first. Then, the footage is edited to fit the narration. If the narration, or voice-over, hasn't been finalized, you can record a scratch track of the script to use as a cue. This preliminary scratch track of narration, read by you or someone else, helps set the timings and beats for your rough cut. It is replaced later by a professional narrator. Regardless of what your particular project calls for, your rough cut clearly shows what works and what doesn't, what shots cut well with other shots, and the total running time (TRT) of this first pass.

3. Mix the Rough Audio

Throughout the editing process, the editor works closely with the audio tracks: she's separating them, balancing out levels, and keeping track of where everything is in the computer. She may do all the rough audio mixes as well as the final mix. Or, she'll do just a rough mix and then give all the tracks to an audio mixer or sound designer in an audio facility who'll do the final audio mix.

Most editors lay out their audio tracks like this:

Track 1: Narration
Track 2: Sound on tape
Tracks 3 & 4: Stereo music
Track 5: Sound effects
Tracks 6, etc.: Overlapping audio, music, or dialogue

4. Agree on the Final Cut

Most projects take time to edit. The editing process usually goes through several rough versions before there's a final product that makes everyone happy. Then, the editor makes a frame-accurate edit decision list (EDL) that provides exact notes of all the reel numbers, time codes, cuts, and transitions in the rough cut. Finally, the editor re-edits or conforms the rough cut by matching the original footage in high rez, using the EDL. The final cut is the result of all these decisions that come from fine-tuning, tweaking, shortening, or lengthening the piece in editing.
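EDL formats vary by editing system, but a common interchange style lists, for each event, the source reel, the track, the transition type, and the source and record in/out time codes. Below is a minimal Python sketch that parses one such event line; the sample line and field layout are a simplified illustration of a CMX3600-style entry, so check the format your editing facility actually exports.

```python
# Minimal sketch: parse one simplified CMX3600-style EDL event line.
# The sample line and field layout are illustrative; real EDLs vary by system.

sample_line = "001  TAPE1  V  C  01:03:16:00 01:03:26:00 01:00:00:00 01:00:10:00"

def parse_event(line: str) -> dict:
    event, reel, track, transition, src_in, src_out, rec_in, rec_out = line.split()
    return {
        "event": event,             # event number
        "reel": reel,               # source reel/tape name
        "track": track,             # V = video, A = audio
        "transition": transition,   # C = cut, D = dissolve, W = wipe
        "source_in": src_in, "source_out": src_out,
        "record_in": rec_in, "record_out": rec_out,
    }

print(parse_event(sample_line))
```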

This online finishing stage, also called the conform, concentrates on adding any high-end graphics, color corrections, and audio leveling that your project needs. The entire process is known as going from off-line to online, and is a system that many seasoned producers follow to save money and maintain their vision.

Editors can load footage into their NLE at 1:1 compression and output the project directly, saving hours of redigitizing and conforming; digitizing at 2:1 fits twice as much footage into the same storage. This works fine in the standard definition world, but with the advent of high definition, editors may need several terabytes (a terabyte is one trillion bytes) of storage to work at high-rez quality. So for now, the offline-to-online process remains the norm in HD editing. However, as disk space gets cheaper and computers speed up, this final conform stage may eventually be phased out.

Editing High Definition TV—In Five Steps

NLE technology is making real inroads in editing high definition footage. Although it's still going through changes, and is a kind of work-in-progress, HD is clearly the direction the market is going. So, the producer needs to plan ahead.

Step One

The first step is to know what downconversion format you want to use. This means that the HD footage is converted to an NTSC (standard definition) tape that can be downloaded into your NLE system. (For example, when your 24p project is downconverted, the video changes from 23.98 fps to 59.94, and the TC changes from 24 to 30 frames.)

Shooting 24 frames in HD can occasionally complicate the editing process. Some producers downconvert the 24 frames to 30-frame DVCam, and then rely on a conversion program to reconvert the 30 frames back to 24 frames for the conform session. Other producers stay in 24 frames, feeding the 24 frames directly into the NLE. Both systems work, yet each has its pros and cons, so talk the process over with your editor before you start the edit process.

Because a mistake can be costly down the line, professionals recommend that projects that are shot in 1080i or 24p be edited in the NTSC video format; it's easier and cheaper at the moment. The standard downconversion formats are DVCAM, DVCPRO, and Digital Betacam. They all share similar high-quality images, digital audio, and TC capabilities. Other formats like MiniDV and DV aren't recommended because of the problems with embedding TC that exactly matches those of the field tapes.
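The 24-to-30 frame relationship mentioned above comes from pulldown: every four film (or 24p) frames are spread across five video frames, which is how 24 fps material ends up at 30 fps. Here is a minimal Python sketch of that ratio; the numbers are the standard 2:3 pulldown arithmetic, simplified to whole frame rates rather than 23.98 and 29.97.

```python
# Minimal sketch: the 2:3 pulldown arithmetic behind 24 fps -> 30 fps downconversion.
# Rates are simplified to whole numbers (24 and 30) rather than 23.98 and 29.97.

film_fps = 24
fields_per_video_frame = 2

# In 2:3 pulldown, successive film frames are held for 2, 3, 2, 3... video fields.
pulldown_pattern = [2, 3, 2, 3]  # fields per film frame, repeating every 4 film frames

fields_per_group = sum(pulldown_pattern)                              # 10 fields
video_frames_per_group = fields_per_group // fields_per_video_frame   # 5 video frames

print(f"4 film frames -> {video_frames_per_group} video frames")
print(f"{film_fps} film frames/second -> {film_fps // 4 * video_frames_per_group} video frames/second")
```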

Step Two

Next, the downconverted footage is digitized into the NLE system. Before you download, clearly mark each reel with a name or number that can be easily read by the computer. Ideally, limit it to four to six characters so the computer can easily read and distinguish each name. For example, Tape 1 shot in Griffith Park could be named GP01. Also make sure that the TC from your original field tapes is downconverted properly with an exact match.

Step Three

Although it's easy to import animation, graphics, and computer generated imagery (CGI) into an NLE system, taking these elements into an online session can be tricky. You can either have them created in the final HD resolution, or you can bring them into the online session, render them out to frames, and transfer these to an HD tape. These are then downconverted and treated like all the other elements in your edit.

Step Four

After you've completed your NLE edit, the editor can export an EDL (edit decision list) with all the information needed to conform in the online session, if needed. After you've made your final cut of your project, send its EDL and the digital cut to the online editing facility in advance of your actual session. Come prepared to the online session with all your original camera reels, graphics files, CGI and effects reels, and any titling or credits information that may be added to your cut.

Step Five

The online editor then assembles the show, using the EDL information. Your presence in this phase of editing is critical—the editor isn't familiar with your project, and the EDL is only an impersonal list of numbers that may not include transitions, wipes, dissolves, and other important creative details.

You don't have to be an expert in postproduction. That's why you're working with an editor you can trust, who'll answer your questions, and let you observe. If you're unsure of how to edit your project, talk to an experienced editor who has worked on a range of projects from start to finish.

Styles of Editing

Looking at a first assembly is kind of like looking at an overgrown garden. You can't just wade in with a weed whacker; you don't yet know where the stems of the flowers are.

Walter Murch, editor

What is editing? Essentially, certain shots take on specific meanings when they are juxtaposed with other shots. This juxtaposition is editing. It can manipulate time and create drama, tension, action, and comedy. Without editing, you'd only have disconnected pieces of an idea floating in isolation, looking for a connection.

Whether it's for TV, the Internet, podcasts, mobisodes, or a 50-monitor video wall, editing in today's media world still follows classic editing guidelines. These were established by American director D. W. Griffith, and Russian directors V. I. Pudovkin and Sergei Eisenstein, early in the last century. These pioneer filmmakers realized a century ago that film possessed its own language, with rules for “speaking” that language. They set the standards for editing that are used today by virtually all editors, no matter what the format.

During the production phase, the producer and director shoot their footage with carefully chosen camera angles and movements that tell a story from a certain narrative vantage point. The editor then takes this footage and—consulting with the producer, director, and/or the postproduction supervisor—makes artistic decisions about how to cut the footage together. Some styles of editing include the following.

Parallel editing. Two separate yet related events appear to be happening at the same time, as the editor intercuts sequences in which the camera shifts back and forth between one event and another.

Let's say… that the main tension and conflict of the story focuses on a man on death row who is being taken down the prison hallway to the execution chamber. The audience knows he's innocent. At the same time, the state governor is trying frantically to maneuver through rush-hour traffic to save the condemned man; the innocent man trips on his shoelace and hits his head while the guards laugh; the governor is on his cell phone but no one will believe he's the governor and they hang up on him. The innocent man asks the guard to give his wife a note. And so on. This use of parallel editing heightens the tension between both situations.

Montage editing. Short shots or sequences are cut together to represent action, ideas, or to condense a series of events. The montage usually relies on close-ups, dissolves, frequent cuts, and even jump cuts to suggest a specific theme. For example, a single mother on the run from a hired killer moves to a small town with her child. A montage might show them happily moving into their new home, shopping for groceries, unpacking boxes, hanging clothes in the closet, and snuggling in bed on their first night together—all this in about a minute, and with underlying music usually fraught with some kind of emotional guidance. This montage effect gives the viewer a lot of information in minimal screen time.

Seamless editing. This style of editing is used in many dramatic series, some sitcoms, and in feature films. The viewer is unaware of the editing because it is unobtrusive except for special dramatic shots. It supports the narrative and doesn't distract with effects. The characters are the focus, and the cuts are motivated by the story's events. Seamless editing reinforces the realism of the story, and traditionally uses longer takes, match cuts rather than jump cuts, and selective audio that can act as a bridge between scenes.

Quick cut editing. This style of editing is highly effective in action and youth-targeted programming, originated primarily by MTV in its infancy. It's used in music videos, promos, commercials, children's TV, UGC, and in programs on fashion, lifestyle, and youth culture. It combines fast cuts, jump cuts, montages, and special graphics effects.

Techniques in Editing

An editor looks at the footage with the producer, then edits the shots together to get from one shot to the next, telling the story. These editing techniques might include:

Cut. A quick change from one shot, with one viewpoint or location, to another. It's almost always better to use a cut rather than a slower transition like a dissolve or wipe. On most TV shows there is a cut every five to nine seconds, and much faster in some shows. A cut can compress time, change the scene or point of view, or emphasize an image or an idea. Cuts are usually made on an action, like a door slamming or a slap to the face.

Match cut. A cut between two different camera angles of the same movement or action in which the change appears to be one smooth action.

Jump cut. Two similar angles of the same picture cut together, such as two close-up shots of the same actor. This style of editing can occasionally be edgy, or make a dramatic point, but it can also signal poor editing and continuity.

Cutaway. A shot that is edited to act as a bridge between two other shots of the same action. For example, an actor may look off to the distance; a cutaway shows what the actor sees. A cutaway also helps to avoid awkward jumps in time, place, or viewpoint, and can shorten the passing of time.

Reaction shot. A shot in which an actor responds to something that has just occurred.

Insert shot. A close-up shot that is edited into the larger context and provides an important detail of the scene. When an actor, for example, reads a sign on the door, a shot of the sign itself then is inserted into the edit.

Editing Pace and Rhythm

The genre of your program, and the footage you have shot, can both dictate the editing pace and rhythm. For example, an editor can start with longer cuts, then make more frequent cuts that surprise the viewer or build suspense. This rhythm can create excitement, romance, and even comedy.

Editing to Manipulate Time

Few shows on television, online, or on other platforms are viewed or seen in real time. What the viewer sees is known as screen time, a period of time in which events are happening on screen: an hour, a day, or a much longer time span. There are several devices that an editor can use to give the viewer an impression of compressed time or time that has passed or is passing.

Compressed time. The condensing of long periods of time is traditionally achieved by using long dissolves or fades, as well as cuts to close-ups, reaction shots, cutaways, montages, and parallel situations. Our experience as viewers then fills in the gaps of time.

Simultaneous time. Parallel editing, or cross-cutting, shifts the viewer's attention to two or more events that are happening at the same time. The editor can build split screens with several images on the screen at once, or can simply cut back and forth from one event to another. When the stories eventually converge, the passage of time stops. A show like 24 is an example.

Long take. This one uninterrupted shot lasts for a longer period of time than usual. There is no editing interruption, which gives the feeling of time passing more slowly.

Slow motion (slo-mo). A shot that is moving at a normal speed, and then slowed down. This can emphasize a dramatic moment, make an action easier to see at slower speed, or create an effect that is strange or eerie.

Fast motion. A shot that is taking place at a normal speed that the editor speeds up. This effect can add a layer of humor to familiar action, or can create the thrill of speed.

Reverse motion. By taking the action and running it backward, the editor creates a sense of comedy or magic. Reverse motion can also help to explain action in a scene or act as a flashback in time or action.

Instant replay. Most commonly used in sports or news, a specific play from the game or news event is repeated and replayed, usually in slo-mo.

Freeze-frame. The editor finds a specific frame from the video and holds on it or freezes it. This effect abruptly halts the action for specific narrative effects. A freeze frame can also create the look of a still photo.

Flashback. A break in the story in which the viewer is taken back in time. The flashback is usually indicated by a dissolve or when the camera intentionally loses focus.

Editing Transitions

A simple cut is a transition from one shot to the next—it's abrupt and quick. Some storylines require another kind of transition from one shot or scene to another that signals going from one idea to another, moving from one location to the next, or one action that changes to another. These transitions can be achieved in the editing by selectively using any of the following transition devices:

Dissolve. When one image begins to disappear gradually and another image appears and overlaps it. Dissolves can be quick (5 frames, or 1/6 of a second), or they can be slow and deliberate (20 to 60 frames). Both signal a change in mood or action.

Fade outs and fade ins. There are two kinds of fades. A fade out is when an image fades slowly out into a blank black frame signaling either a gradual transition or an ending. A fade in is when an image fades in from a black frame introducing a scene. A fade out or fade in can also be effective from a white blank frame rather than a black one; like a dissolve, this editing transition also works to show time passing or to create a special “look.”

Wipe. An effect in which one shot essentially “wipes off” another shot. There are dozens of wipes available in editing systems, though a professional editor uses them sparingly. Examples are page and circle wipes, sliding an image from right to left or vice versa, and breaking an image into thousands of particles. A wipe can be effective, or it can be a distraction; overuse of wipes can be the mark of an amateur.

Split screen. The screen is divided into boxes or parts. Each has its own shot and action that connect the story. The boxes might also show different angles of the same image, or can contrast one action with another. It works as a kind of montage, telling a story more quickly. The split-screen device can be done cleverly, though too many moving images can also strain the viewer's attention span.

Overlays. Two or more images superimposed over one another, creating a variety of effects that can work as a transition from one idea to the next.

Graphics, Animation, and Plug-Ins

Most programs or content include graphic elements of some kind. These graphics can be a simple show title and closing credits, or they can be complicated animation sequences and special effects within the program itself. Graphics can be generated in the edit session with programs such as After Effects and Photoshop, or might be created by artistic designers in a graphics design facility. Graphics, however, can be expensive, and usually require extra consideration in your budget. Following are a few examples of graphics you might use in your project:

Text. Almost every show has opening titles (including the name of the show) and a limited list of the top creative people (such as the producer, writer, director, actors, etc.); these are called opening credits. Titles that appear at the end of the show are called closing credits, and they list the actors’ names and roles, or positions on the production, as well as other detailed production information. Words that slide under someone on screen and spell out a name, location, or profession are called lower thirds because they're generally inserted in the lower-third portion of the screen.

The electronic text is known generically as chyron (pronounced ky-ron; originally the name of the company that for years made the only professional systems that could output high-resolution graphics). Now, chyron-style text can be generated by most editing software programs. The overall impression the text conveys is determined by its size, color, font, and general style. The text can be digitally imported onto the picture with various speeds, rhythms, and movements, and from any angle—say, from one side of the screen to another. The graphics give the viewer an impression of the tone and pace of the show, and when combined with music, text can create a unique style for your piece.

Opening and closing credits might be superimposed over a scene from the show, or on top of stills, background animation, or simple black. Some projects require subtitles for foreign languages or closed captioning for the hearing-impaired. As the producer, you're responsible for double-checking all names, spellings, and legal or contractual information for the lower thirds and final end credits.

Animation. Simple animation can be created easily and cheaply by using software like Flash and After Effects. More complex animation is created by an animation designer who uses storyboards and narration, and manages an impressive crew of people who draw, color, and edit animated sequences.

Motion control camera. Special computer-controlled cameras that shoot a variety of flat art such as old newspapers, artwork, and photos, sometimes called title cameras. They are designed to pinpoint detail and to create a sense of motion for otherwise static material with zooms, pans, and other camera moves.

Design elements. Some project genres—documentaries, news shows, commercials, educational, and corporate industrials—depend on the use of various design elements to add depth and information to the content. These elements include logos, maps, diagrams, charts, and graphs, as well as historical photographs, still shots, and illustrations.

The look of film. Falling loosely into the graphics realm, several postproduction processes give video the appearance of film by closely mirroring its color levels, contrast, saturation, and grain patterns at a fraction of the cost and time of shooting on film.

Color-correction. The process of reducing or boosting color, contrast, or brightness levels can be done by using color-correcting tools such as Flame or After Effects.

Retouching. This plug-in process offers a gamut of tricks that can enhance an image, like “erasing” a boom dangling into the shot, or a wire holding up a prop. However, the time involved can be costly.

Compositing. Two or more images are combined, layered, or superimposed in the composite plug-in process.

Rotoscoping. Frame-by-frame manipulation of an image, either adding or removing a graphic component. With this plug-in process, human action can be rotoscoped, and a blemish can be erased from a celebrity's face.

The editing process is vital to the ultimate success of your project. It is aesthetic, intuitive, and often technically challenging. Yet the visuals are only one half of the picture. The second half is the enhancement of audio with its many layers of nuance and possibilities.

But the most important thing is that you really have to have a feel for the music and the artist, and what's going on, and creatively pull all this stuff together. The creative part is really why I'm hired and what you need from the head person pulling this all together, though as a producer, you also have to be aware of all this technical stuff.

Stephen Reed, excerpt from interview in Chapter 11

III. THE SOUND DESIGNER'S ROLE

Tones sound, and roar and storm about me until I have set them down in notes.

Ludwig van Beethoven

The sound designer, like the editor, can perform small miracles by manipulating audio to create an emotional impact on the viewer. The sound designer adds another dimension to your vision by raising or lowering levels of dialogue or ambient sounds, removing distracting background hums, adding sound effects and Foley, and choosing the right musical elements. His expertise lends a higher production quality to your finished project.

In a less complex project, the video editor can mix all the audio requirements and components in the edit session. However, some projects have more complicated audio elements that require an audio facility for additional work and refining. Here, the audio mixer, sometimes called the sound editor or sound designer, takes over. An audio facility might be a simple, room-sized studio with one or two sound editors working on audio equipment and computers synchronized by time code (TC); depending on the facility, it can charge $50 to $200 an hour. It could also be an elaborate, theater-sized studio with several audio mixers and assistants, extensive equipment, and a setup that could be quite costly. Before you book time in an audio facility, discuss your project's audio needs and their possible costs.

The sound designer works with two contrasting “qualities” of sound (direct and studio), and approaches them differently, both aesthetically and technically:

images  Direct sound. Live sound. This is recorded on location, and sounds real, spontaneous, and authentic, though it may not be acoustically ideal.

images  Studio sound. Sound recorded in the studio. This method improves sound quality and eliminates unwanted background noise; the studio recording can then be mixed with live sound.

Working with the Sound Designer

There are no passengers on spaceship earth. We are all crew.

Marshall McLuhan

As the producer, you want to work closely with the sound designer: supply the necessary audio elements and logs, then discuss the final cut of your piece; offer your ideas and ask for suggestions. In the first stages of an audio mix session, you and the audio crew sit in a spotting session during which you review each area of your project that needs music and effects for dramatic or comedic tension. In this session you're listening for variations in sound levels, for hums and hisses, and anything else that wasn't caught in the rough mix. The sound designer can mix tracks, smooth out dialogue, equalize levels and intensity of sound, and add and layer other elements like music and effects that all contribute depth to the project.

The spotting process takes time. So does the mixing, or sweetening. You're paying for each minute, so based on what you decided in the spotting session, discuss with the audio facility how much time you will need to book; often, a facility is willing to negotiate a flat fee for the whole job. If you book only six hours and the actual mix runs 10, it not only costs you more money but also puts a real strain on the facility, which may have booked the studio for another job after your estimated six hours was scheduled to end.

Working with the sound editor is much more effective if you can:

1.  Be prepared. When possible, send a rough cut of the project to the sound editor before the mix session. Come to the mix with a show run-down that lists important audio-related details like transitions and music. Provide a music cue sheet that lists all the music selection titles, the composers and their performing rights society affiliation, the recording artists, the length and timing of each cue, the name and address of the copyright owner(s) for each sound recording and musical composition, and the name and address of the publisher and company controlling the recording. You'll find an example of this music cue sheet on this book's web site.

2.  Be patient. At the beginning of the mix, the sound editor needs to do several things before the actual mix can begin, including separating the audio elements, patching them into the console, adjusting the gear, and finally, carefully listening to everything. Be patient during this stage and don't put pressure on the process.

3.  Be quiet. Although you may have worked with these audio tracks for days in the editing room, it is the first time the sound editor has heard them. Keep your conversations, phone calls, and interruptions to a minimum.

4.  Be realistic. Your mix may sound excellent in the audio mixing room because the speakers are professional quality and well balanced, and the acoustics are ideal. But most TV shows and online projects are played on TV sets or computer monitors with mediocre speakers. Many of the subtler sound effects you could spend hours mixing may never be heard, so listen to the mix on small speakers that simulate the sound the end user will hear.

The Technology of Audio Mixing

The digital revolution has provided a wealth of creative and technical opportunities for the producer. Images and sound can interact in dynamic new ways that were previously difficult, if not impossible, to achieve. Digital audio offers unparalleled clarity: there is no loss of quality when it is dubbed, and because it requires less storage space than video, it doesn't need compression.

By editing sound in the NLE domain, the audio mixer can work freely with sound in the same way an editor can play with visuals: sound elements can be cut, copied, pasted, looped, or altered. Digital audio is easily labeled and stored, making it more efficient to keep audio in sync and to slide it around when needed. Most sound tracks are now prepared on a multitrack digital storage system. The popular professional options include:

images  DAW (digital audio workstation): Programs such as Pro Tools

images  Digital multitracks: Recorders such as the DASH 3324 or 3348

images  Analog multitracks: 24-track recorders with Dolby SR or Dolby A noise reduction

Often the editor can handle the entire audio mix in the NLE system. In other situations where the mix is more complex, the picture is first locked, or finalized, and then the audio tracks are exported, usually to a DAW. The tracks are either married to the video or are separate. In the DAW, for example, the sound designer uses software such as Pro Tools and digital storage techniques to focus on specific tracks; clean up audio problems; and record and add narration, music and special effects (M&E), and dialogue.

This process gives the sound editor an impressive range of options: moving the tracks forward and backward, looping music, and extending dialogue and effects. Finally, when everyone's satisfied with the audio, the track is ready for a lay-back where the final audio mix track is married to the picture using TC. The piece can be delivered in stereo, mono, 5.1, or in all versions.

In high-definition or DVD projects, the audio can be mixed for a 5.1 surround system. This gives you five full channels (left, center, right, right rear, and left rear) plus one low-frequency effects channel. The result is an impressive clarity and fullness of sound in the final product.

The Creative Components in Sound Design

Just as visual components are edited together, so audio elements are mixed together to create new layers of sound. In larger, more complicated audio mixes, each of the following components might be supervised by an expert who specializes in that specific area. The sound designer works with any or all of these components:

images  Dialogue

images  Sound effects (SFX)

images  Automatic dialogue replacement (ADR)

images  Voice-over (VO) or narration

images  Foley

images  Music

Dialogue

Dialogue is the primary audio element. Words spoken between two (or more) actors or people onscreen are called dialogue. Sometimes dialogue is recorded with background ambient sound, although usually it is recorded in isolation from other audio.

Sound Effects

On a set or on location, any background sounds that surround the dialogue are ideally recorded separately. These sounds include blowing wind, singing birds, insects or tree frogs, water lapping on shore, traffic, children playing, glasses clinking, and so on. These existing effects are known as wild sounds, or recorded sounds that will later be synchronized to the footage. If the sounds don't exist on that location, the sound editor can search through prerecorded sound effects available from a sound effects library. These options can range from a door slam to the howl of a monkey. Producers often buy libraries of sound effects and stock music that offer thousands of audio options, with the royalty fees covered in the initial cost.

Automatic Dialogue Replacement (ADR) or Dubbed Dialogue

After all their scenes have been shot, actors may need to rerecord lines of dialogue, or add a line written after the shoot was over. In the recording studio, actors read their lines, keeping them in sync with their on-screen lip movements. Another option is to record new lines that will be mixed into the program later, either over a cutaway or in a long shot if their lips don't match the new lines. Actors might also read a script in a different language that is later dubbed over the original track. Often, a loop group of people is brought into an ADR session to create crowd sounds like background conversations, laughter, mumblings, or yelling that will be mixed into the dialogue. This area of ADR is called wallah, which is intentionally unintelligible so audible words won't intrude on the dialogue.

Voice-Over (Narration)

The narrator who reads a script or commentary adds another layer to the audio. Narration can introduce a theme or link elements of a story together. It adds extra information with an air of authority, and helps interpret ideas or images for the viewer. Often, an on-camera character speaks over the picture in the first person as though she is directly speaking to the viewer. A minor character can tell the story in the third person, or an unidentified narrator who is not on camera can distance the viewer from the image by adding an objective voice to the story. Narration is generally recorded in a separate audio session and mixed in later over the picture. Voice-over can be dialogue that is shot originally on-camera and later played over another picture. For instance, we see a two-shot of a mother who is reading aloud to her child. That shot cuts to a CU of the child's face while the mother's audio continues over the picture. Her audio is the voice-over; on a script, it is written as VO.

Foley

Foley is the recreated sound of an actor's movements: hands clapping, rustling clothing, a kiss, quiet footsteps, or a fistfight. If these sounds can't be found in a sound effects library, they can be created by the Foley artist, who uses audio props, tools, hands, and feet, along with other objects and devices that create the right sound effect. They're recorded separately in an audio facility, often in sync with the action, and then mixed with other sound elements.

Music Options

images  Original. This is music that's been composed specifically for a project. It may include themes for the opening and closing, and/or for the body of the show; its emotional direction can highlight the action, characters, and their relationships. The composer is familiar with the creative and technical process, and either hires the musicians or creates the music alone or with a partner. A composer can also use the musical instrument digital interface (MIDI), a computer protocol that, driving samplers and synthesizers, can simulate a range of music from a single guitar to an entire orchestra. The final score can go straight from the computer into the mix.

images  Stock. This is music that has been specifically composed and recorded to be available for multiple uses. The composers use audio sampling and composition software and sophisticated equipment to create vast libraries of engaging and effective music that is both versatile and inexpensive. Stock music is a creative alternative used in every genre from corporate videos, documentaries, news, and commercials, to talk shows, sitcoms, online and UGC, even drama. It's less expensive than hiring a composer, and the negotiated rights can be either exclusive or shared, depending on your budget and the end use. Stock music houses can be researched and located by an online search, and most offer samplings that can be downloaded from the Internet.

images  Prerecorded. The source of this music could range from a popular song to an obscure CD, but a strong soundtrack adds an extra appeal to your project. Regardless of the source, you'll first need to clear all music rights, a time-consuming process that is reviewed in Chapter 5.

images  Music cue sheet. Regardless of where your music comes from, you'll make a music cue sheet that lists every piece of music, its source, its length, and who holds the rights. An example of a cue sheet is on this book's web site, and a simple digital sketch follows below.
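If you prefer to keep the cue sheet as a spreadsheet or a small script rather than on paper, the same fields can be recorded as structured data. The sketch below is purely illustrative: it is written in Python under the assumption that a simple CSV file is an acceptable format for your sound editor or client, and every field name and sample value is invented for the example rather than taken from the book's official template on the web site.

```python
import csv

# One hypothetical cue; the field names mirror the items a cue sheet should list.
cues = [
    {
        "cue_title": "Opening Theme",
        "composer": "A. Composer",
        "performing_rights_society": "ASCAP",
        "recording_artist": "Studio Ensemble",
        "length": "0:45",
        "timing_in_show": "00:00:05-00:00:50",
        "copyright_owner": "Example Music Publishing, 123 Any St., Anytown",
        "publisher_or_label": "Example Records",
    },
]

# Write the cues to a CSV file that can be handed off with the other deliverables.
with open("music_cue_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(cues[0].keys()))
    writer.writeheader()
    writer.writerows(cues)
```

A CSV like this opens in any spreadsheet program, so the sound editor, the client, and the rights holders can all work from the same sheet.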

Stylistic Uses of Sound

In addition to creating a clear audio track for your project, manipulation of sound can create stylistic impressions for the viewer. Here are a few classic examples.

images  Diegetic. Music that the characters in the scene hear. They're playing a guitar, or in a club with a jazz band playing behind the action.

images  Non-diegetic. Music not heard by the characters that is added later, such as a soundtrack.

images  Sound bridge. Audio elements such as dialogue, sound effects, music, and narration can act as a transition from one shot (or scene) to the next.

images  Selective sound. Lowering some sounds in a scene, and raising others, can focus the viewer on an aspect of the story, such as heavy breathing or quiet footsteps.

images  Overlapping dialogue. In natural speech patterns, people tend to speak over one another and interrupt. Yet dialogue is usually recorded on separate tracks without this overlap. The sound editor can recreate this authentic-sounding effect in the mix, and can also separate dialogue tracks that are too close together. Conversations between several people, like those in two different groups, are often recorded on separate tracks so they can be woven together in the mix for a natural sound.

The Steps in Mixing Audio

Steps in audio mixing vary from project to project. During your video edit session, the editor separates the dialogue, music, effects, and other audio elements onto various tracks or channels. Depending on the complexity of your project, the editor can mix the elements in the edit room, or will do a preliminary mix that needs to be completed in an audio facility. During the mix, all the separate audio elements are blended together into a final mix track that is then “married” to the picture and locked in.

The Final Cut and Locked-in Audio

Before the final audio mix begins, make sure that all the video and audio edits have been agreed upon by the clients and other creative team members, and won't require any further changes. Any revisions involving audio after the picture is locked can mean costly remixes.

Sound editors take varying routes in mixing, and each has a unique style of approaching the process. Depending on the complexity of the project, any or all of the following components are part of an audio mix.

images  Dialogue. All dialogue is cleaned up and extra sound effects or extraneous noise are either deleted or moved to separate effects tracks. Any ADR, narration, or voice-overs are also laid onto their own tracks.

images  Special effects. Any special effects tracks—wild sound, ambience, prerecorded effects, and Foley—are separated, cleaned up, and each put onto its own channel. Ideally, there is ample room tone from each location that can fill in any gaps in the audio.

images  Music tracks. The music is generally the last element that is mixed into the audio. All the musical tracks are separated and divided into two categories: diegetic or source music (music the characters or actors hear on screen, like a car radio) or underscore music (music that only the audience hears, such as an opening theme).

images  5.1 Audio. 5.1 refers to the positions in a five-speaker setup in which speakers are placed to the right, center, left, right rear, and left rear of the TV set; the ".1" is the low-frequency effects channel. This kind of mixing is also called AC-3 or Dolby Digital, and is prominent in DVDs, theatrically released films using SDDS and DTS systems, and in some TV broadcasts. Hearing a 5.1 mix at home requires a specially equipped television or home theater setup.

images  To achieve a full 5.1 sound mix, the audio is synchronized and laid off onto the field tapes that have been downconverted to 29.97 fps, taken into editing, and then transferred as Open Media Framework (OMF) files into the audio mix. Here, the elements are synced up, mixed, and sweetened against a downconvert of the assembled HD master, which can be used for HD distribution or converted to standard definition. However, because each of the five channels delivers sound to a specific spatial position, extra time is needed in the audio sessions to deliver a multichannel mix that works in this medium.

IV. DELIVERING THE FINAL PRODUCT

As the producer, you want to deliver a project that's the highest possible quality. It may be broadcast on a network, sold to a distributor, seen online, or used for training and education purposes. All these venues require broadcast-quality work that adheres to certain technical standards, which make it possible to dub, copy, and transfer to DVD or other formats without losing quality.

The Client Deliverables

Most clients are very specific about what they expect as a deliverable or final product. Deliverables are generally part of your overall contract with a client, so you want to find out exactly what their expectations and specifications are. Ask for these deliverables in writing so there are no mistakes. The most common requirements for deliverables include:

images  Video format. If your project is being broadcast, it is usually evaluated by a station engineer to make sure it meets broadcast standards. If it's being dubbed, the dub house has technical specifications, too. You may be asked to provide a clean copy of the show that has no text superimposed on it.

images  Audio format. This might include separate mono mixes and stereo mixes, or a 5.1 mix, an M&E mix, special tracking, levels that are constant or undipped, and often one mix in English and another in a different language.

images  Length. The required program length can be quite specific. For example, some American public television stations set a standard half-hour program at 26:46 and a one-hour show at 56:46. In most cases, PBS show lengths are 6 seconds less to accommodate a PBS logo. Commercial stations may require a half-hour show to be 22 minutes, while premium and cable channels are less demanding. Most nonbroadcast projects are more flexible.

images  Dubbing. Depending on the client's requirements, you may be responsible for making protection copies, which are exact copies of your final master. These serve as backups in case of damage or loss in shipping. You might also need to provide DVD copies of the project to the client. The number of copies and their format should be spelled out in your contract, as should any special labeling or packaging and related shipping costs.

images  Abridged versions. You may need to provide an edited version of your project in which any nudity, violence, or offensive language has been removed or “bleeped out.” This version can be required by airlines, certain broadcasters, and foreign distributors.

images  Subtitling. Written text under a picture that translates only those words being spoken on screen from one language into another; for example, the French translation of an American production. However, song lyrics or sounds are seldom subtitled.

images  Closed captioning. Also called closed captions, this method supplies visible text with a broadcast picture. By law, all American TV sets sold after 1993 must contain a special decoding chip that displays the captions, which transcribe the spoken dialogue and describe unseen sounds like a dog bark or a knock at the door. Designed especially for the hearing impaired, closed captioning is also useful in loud public places, when learning a language, and when the dialogue isn't clear. The text usually appears in white letters in a black box at the bottom or top of the screen, and is decoded in the TV set or with a special decoder box attached to the set.

All these deliverables are those most commonly required in television and new media projects. They should be clearly stated in all contract negotiations and included in your postproduction budget.

Be in control, but allow for creativity and open up your budget for that extra time. As a producer, your golden rule is to always be prepared. You are in a creative business, so sometimes even the best preparation is not enough. At that point, look to the future and learn from your mistakes.

Jeffrey McLaughlin, excerpt from interview in Chapter 11

ON A HUMAN LEVEL …

Finishing the postproduction process is a kind of triumph in its own right. It signals your project's completion, and it's the result of the teamwork and collaboration of everyone involved. Allow yourself time to celebrate that teamwork. And resign yourself to the fact that every single time you watch it, you'll see something you'd like to change, or wish you'd done differently. Congratulations—you're a producer.

SUMMARY

At this stage, you may have a tangible product you can see on the screen. You have delivered all the final dubs to the client, and said goodbye to the editor and audio mixer. But your project itself isn't really finished. As you'll see in Chapter 10, there are more details to wrap up, as well as guidelines for getting exposure for your project and for yourself.

REVIEW QUESTIONS

1.  What is the producer's role in postproduction? How is it different from that of the postproduction supervisor?

2.  Name four important legal documents that are essential to check prior to the postproduction process.

3.  Why is time code so important in the editing and mixing of a project?

4.  Describe the uses and the differences between stock footage, archival footage, and footage that is public domain.

5.  What can you do as a producer to prepare for the edit session? For the audio mix?

6.  What would you look for in hiring an editor? How could you find one in your area?

7.  Compare an NLE system with linear film editing.

8.  What audio elements are needed in mixing most projects?

9.  Briefly describe the audio mixing process.

10.  Name three deliverables that are required in most contracts.
