CHAPTER 16

Editing the Production

“The three great editing tools are Context, Contrast, and Rhythm.”

Jay Ankeney, Editor

Terms

Clip: A video segment.

Cut/take: An instantaneous switch from one shot to another.

Dissolve: An effect produced by fading out one picture while fading in another.

Fade (a type of dissolve): A gradual change between black and a video image. For example, at the end of a program there is usually a “fade to black” or, if there is a “fade-up,” it means that the director is transitioning from black to a video image. A slow fade suggests the peaceful end of action. A fast fade is rather like a “gentle cut,” used to conclude a scene.

Linear editing: The copying, or dubbing, of segments from the master tape to another tape in sequential order.

Logging: Loggers view the footage and write down the scene/take numbers, the length of each shot, time code, and descriptions of each shot.

Nonlinear editing: The process where the recorded video is digitized (copied) onto a computer. The footage can then be arranged and rearranged, special effects can be added, and the audio and graphics can be adjusted using editing software.

Postproduction: Taking the video that was previously shot and assembling a program shot by shot, generally with a computer-based editing system.

Timeline: Usually includes multiple tracks of video, audio, and graphics in a nonlinear editing system.

Wipe: A novel transition that can have many different shapes.

 

Selecting the right image and skillful editing will make a vital contribution to a production’s impact. The way the shots are interrelated will not only affect their visual flow but will also directly influence how the audience reacts to what they are seeing: their interpretation and their emotional responses. Poor editing can leave them confused. Proficient editing can create interest and tension or build up excitement that keeps the audience on the edges of their seats. At worst, poor image selection can degenerate into casual switching between shots. At best, editing is a sophisticated persuasive art form.

EDITING TECHNIQUES IN TELEVISION

There are two broad categories of editing:

• Live editing: The director, using live cameras and other video sources, “live edits” (directs) a production using a video switcher (Figure 16.1).

• Postproduction: Taking the imagery that was previously shot and assembling a program shot by shot, generally with a computer-based editing system (Figure 16.2).

One style involves a director with cameras and the other style involves a director with prerecorded programming. However you do it, it is still editing—you are deciding which shot the audience will see.

FIGURE 16.1
Live editing occurs when a director is cutting a show together from multiple cameras.

EDITING BASICS

Editing Decisions

During the editing process, a series of decisions need to be made:

Which of the available shots do you want to use? When directing (editing) a live show, choices are irrevocable. You can select only from the shots being presented at each moment by cameras, prerecorded material, graphics, and so on. When you are editing in postproduction, there is time to ponder, to select, and to reconsider.

What is the final shot sequence? The relative durations of shots will affect their visual impact.

At exactly which moment in the action do you want to change from one shot to the next?

How will each shot transition to the next shot? Transitions include cut, dissolve, wipe, fade, and others.

How fast or slow will this transition be?

Is there good continuity between the pictures and sound that are meant to show continuous action? These elements may not have been shot at the same time or place.

FIGURE 16.2
Postproduction editing

Each of these decisions involves both a mechanical operation and an artistic choice. Even the simplest treatment (a cut from one image to the next) can create a very different effect, based on the point at which you decide to edit the action. Let’s look at an example:

• You can show the entire action, from start to end:

  The intruder reaches into a pocket, pulls out a pistol, and fires it. The victim falls. (The action is obvious.)

• You can interrupt an action, so that we do not know what is going to happen:

  The hand reaches into the pocket/CUT/to the second person’s face. (What is the intruder reaching for?) Or the hand reaches into a pocket, and pulls out a pistol/CUT/to the second person’s face. (Is the intruder threatening, or actually going to fire it?)

• You can show the entire action, but hold the audience in suspense about its consequences:

  We see the pistol drawn and fired/CUT/but did the shot miss?

Editing Possibilities

As we examine editing techniques, you will see how they significantly contribute to the success of the production:

• You can join together a series of separately recorded takes or sequences to create a continuous smooth-flowing storyline—even where none originally existed.

• Through editing, you can remove action that would be irrelevant or distracting.

• You can seamlessly cut in retakes to replace unsatisfactory material—to correct or improve performance; to overcome camera, lighting, or sound problems; or to improve ineffective production treatment.

• You can increase or reduce the overall duration of the program—by adjusting the length of sequences, introducing cutaway shots, altering playing speed, or repeating strategic parts of an action sequence.

• Stock shots can be inserted and blended with the program material—to establish location, for effects, or to introduce illustrations.

• When a subject is just about to move out of shot, you can cut to a new viewpoint and show the action continuing, apparently uninterrupted. (Even where it is possible to shoot action in one continuous take, the director may want to change the camera viewpoint or interrupt the flow of the action for dramatic impact.)

• By cutting between shots recorded at different times or places, you can imply relationships that did not exist.

• Editing allows you to instantly shift the audience’s center of interest, redirecting their attention to another aspect of the subject or the scene.

• You can use editing to emphasize or to conceal information.

• You can adjust the duration of shots in a sequence to influence its overall pace.

• Editing can change the entire significance of an action in an instant—to create tension, humor, horror, and so on.

• By altering the order in which the audience sees events, you can change how they interpret and react to them.

• Audio sweetening (adding additional sounds to the recording) can be done.

• Graphics can be added.

• Special effects can be added.

THE MECHANICS OF EDITING

The actual process you use can have an important influence on the ease and accuracy with which you can edit and on the finesse that is possible. There are several systems, described in the following sections.

Editing In-Camera

It is possible to edit in-camera. To do so, most cameras require that you shoot the action in the final running order, which may not be practicable or convenient. Some cameras allow limited in-camera editing, such as trimming a scene or changing the order of clips, and some of the latest smartphones and tablets can both shoot and edit HD video. However, capabilities vary, and all of these devices are more limited than a dedicated editing system (Figure 16.3).

Production Switcher (Vision Mixer)

The live-edit method, in which a production switcher combines (cuts, dissolves, wipes, fades, etc.) video sources such as cameras, video players, and graphics, is used in many productions. During a production, conditions in most production control rooms are generally a world apart from the relative calm of an editing suite (Figure 16.4).

FIGURE 16.3
Some mobile phones have the ability to both shoot and edit video. Although there are limitations to the editing functions, it can still be a valuable resource for doing a rough cut of the video.

FIGURE 16.4
A small “laptop” video switcher can be used with four-camera productions.

The director needs to watch the monitor wall, which is made up of small monitors showing images from cameras, video feeds, video players, graphics, and any other sources. There are also two larger monitors, a “preview” monitor and a “program” or “on-air” monitor. The director uses the preview monitor to review any video source before going to it. The program monitor shows the final program output. (See Chapter 3 for more details about the control room.)

While all of this is going on, the director’s attention is divided between the current shot in the program monitor and upcoming shots—guiding the production crew by instructing, correcting, selecting, and coordinating their work. Of course, this is all done while the director is also checking the talent’s performance, keeping production on schedule, and dealing with issues as they arise. It is no surprise that under these conditions, “editing” with a production switcher can degenerate into a mechanical process (Figure 16.1).

Linear Editing

Linear editing involves “dubbing” or copying the master tape to another tape in a sequential order. This process worked well for editors until the director or client wanted significant changes to be made in the middle of a tape. With a linear tape, that usually meant that the whole project had to be entirely re-edited; this was incredibly time consuming and frustrating. Analog linear editing also did not work well if multiple generations (copies of copies) of the tape had to be made, as each generation deteriorated a little more. Linear systems are generally made up of a “player” and a “recorder” along with a control console. The original footage is placed into the player and then is edited to the recorder (Figures 16.5 and 16.6). Although some segments of the television industry are still using linear editing, the majority of programming today is edited on a nonlinear editor.

FIGURE 16.5
Linear editing, or copying the contents of one tape to another tape, one clip after another linearly, is still used on a limited basis. Although the use of linear editors has declined significantly, segments of the industry, such as news, still use them. (Photo by Jon Greenhoe)

FIGURE 16.6
Digital laptop linear systems have been popular with news and sports crews that are on the road. They also can be used as two separate tape decks when needed.

Nonlinear Editing

Today, almost all video and television programs, as well as films, are edited on a nonlinear editor. Nonlinear editing is the process in which the recorded video is stored on a computer’s hard drive. The footage can then be arranged and rearranged, special effects can be added, and the audio and graphics can be adjusted using editing software. Nonlinear editing systems make it very easy to make changes, such as moving video and audio segments around until the director or client is happy. Hard disk and memory card cameras have allowed editors to begin editing much more quickly, as they do not need to digitize all of the footage. Nonlinear systems cost a fraction of what a professional linear editing system does. Once the edited project is complete, it can be output to whatever medium is desired: tape, Internet, iPod, CD, DVD, and others.

AN OVERVIEW OF THE NONLINEAR PROCESS

Step 1: Store the footage on a hard drive that can be accessed by the editing computer (Figure 16.7).

Step 2: Each video segment or clip is then trimmed (cleaned up) by deleting unwanted video frames.

Step 3: The clips are placed into the timeline. The timeline usually includes multiple tracks of video, audio, and graphics. This timeline allows the editor to view the production and arrange the segments to fit the script (Figure 16.8).

FIGURE 16.7
Many different devices can be used to input video into a nonlinear editor.

FIGURE 16.8
Screenshot showing the composition page of a nonlinear editor. The video clip bin is where video clips that are to be used in the program are stored. The preview monitor allows the editor to preview the video clips before moving them to the audio and video timeline. The program monitor allows the editor to see the audio and video in the timeline. (Photo courtesy of Avid)

Step 4: Video special effects and transitions are added. Nonlinear edit systems allow all kinds of effects, such as ripple, slow/fast motion, color correction, and others. Transitions include dissolves, cuts, and a variety of wipes.

Step 5: Additional audio may or may not be added at this point. Audio effects may be used to “sweeten” the sound. Music and/or voiceovers may be added at different points in the project (Figures 16.9 and 16.10).

Step 6: The final program is output to the distribution medium.
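For readers who think best in code, the timeline described above can be pictured as nothing more than an ordered list of clip references, each with in and out points and a transition into it. The following is only a rough sketch of that idea, not the file format of any particular editing package; the clip names, frame numbers, and 30 fps frame rate are invented for the example.

    # A rough sketch of a nonlinear timeline as an edit decision list (EDL).
    # Clip names, in/out frames, and the 30 fps rate are invented for illustration.
    from dataclasses import dataclass

    FPS = 30  # assumed frame rate

    @dataclass
    class Event:
        clip: str                # source clip identifier
        src_in: int              # first frame used from the source clip
        src_out: int             # one past the last frame used
        transition: str = "cut"  # how this event joins the previous one
        trans_frames: int = 0    # transition length in frames

    timeline = [
        Event("scene01_take3", 120, 420),                  # opening long shot
        Event("scene01_take5", 90, 300, "dissolve", 30),   # joined by a 1-second dissolve
        Event("crowd_cutaway", 0, 150),                    # reaction cutaway
    ]

    total_frames = sum(e.src_out - e.src_in for e in timeline)
    print(f"Program length: {total_frames / FPS:.1f} seconds")  # 22.0 seconds

Rearranging the program is then just a matter of reordering, adding, or trimming entries in the list, which is essentially what the editing software does behind its timeline display.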

FIGURE 16.9
A nonlinear editing suite with an adjoining sound booth for voiceovers.

NONLINEAR EDITING EQUIPMENT

Editing equipment has drastically changed over the last decade. Where once a minimal editing system required two editing decks, two monitors, and an edit controller, today the equipment can be as simple as a camcorder and a computer with an editing software package installed.

FIGURE 16.10
The talent is doing a voiceover in an editing suite for a news story at a local news station. (Photo by Jon Greenhoe)

Higher-level edit suites may contain multiple types of input devices that use a variety of connectors to transport data at much faster speeds. They may also include multiple edit screens, speakers, an audio mixer, and other tools.

Video can be imported into the computer from a camcorder, deck, or a memory storage device. One cable can move large amounts of data, as well as control signals, between the camera and computer (Figure 16.7).

Habits of a Highly Effective Postproduction Editor

Adapted from Mark Kerin’s Six Habits of Highly Effective Editors

1. Schedule enough time to make a good edit. Quality editing takes time. Be realistic, and then pad your schedule with a little extra time. It is always better to be done a little early than late. Plus, you can always use a little more time to refine the edit.

2. Get a little distance from the project occasionally. It is easy to become emotionally involved with a specific element of a project. Take a break from it; when you come back, your perspective may have changed. Ask others for their opinion—there is a good chance that they will see things that you didn’t see.

3. List the issues before fixing them one by one. It is good to come up with an organized plan for editing the project. Although it takes time to think it through, it is worth it.

4. Know the priority of your editing elements. The most important editing elements are the emotion and story. If you lose those two elements, you lose the production.

5. Keep a copy of each edited version. Each time you make changes to the project, keep the original (or previous) version. That way you have something to go back to if you run into problems.

6. Focus on the shots that you have. By the time you sit down to edit, it is time to get the best project that you can possibly get from the footage you have recorded. You may even have to forget about the script.

POSTPRODUCTION LOGGING

An important, and often neglected, aspect of the postproduction process is logging the recorded material. Because logging can be completed before the editing session, it saves time during the actual edit, which also translates into budget savings. After logging the footage, the editor can move only the specific clips that will be used in the program instead of taking time on the editor to search through all of the clips. Digitizing specific clips instead of all of the footage also saves hard drive space. Generally, some type of log sheet is used on which notes can be written that include the time code (the address where the footage is located), scene/take numbers, and the length of each shot. The notes may also include a description of the shot and other comments like “very good,” “blurry,” and so on. Logging can be simple notes on a piece of paper or can be done with logging software. An advantage of some logging software is that it can work with the editing software, importing the edit decisions automatically into the computer (Figures 16.11 and 16.12).

FIGURE 16.11
Sample of a log sheet. (Courtesy of the Avanti Group)

Shots can be identified for the log in a number of different ways:

• Visually: “the one where he gets into the car.”

• By shooting a slate (clapboard) before each shot, which contains the shot number and details (or an inverted board, at the end of shots; Figure 16.13).

• By time code, a continuous time signal recorded throughout the tape that shows the precise moment of recording.

FIGURE 16.12
Logging can be done on paper or via software. Here a camera is connected directly into the computer to capture still frames from each clip and automatically import time code ins and outs. The screenshot shows the stored thumbnail frame, duration, and description. (Photos courtesy of Imagine Products)
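Because time code gives every frame an hours:minutes:seconds:frames address, the shot lengths on a log sheet can be calculated directly from the logged in and out points. Here is a minimal sketch, assuming 30 fps non-drop-frame time code; the sample in and out points are made up for illustration.

    # Convert HH:MM:SS:FF time code to a frame count and back (non-drop-frame).
    # The 30 fps rate and the sample in/out points are assumptions for this sketch.
    FPS = 30

    def tc_to_frames(tc: str) -> int:
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * FPS + ff

    def frames_to_tc(total: int) -> str:
        ff = total % FPS
        seconds = total // FPS
        return f"{seconds // 3600:02d}:{seconds % 3600 // 60:02d}:{seconds % 60:02d}:{ff:02d}"

    shot_in, shot_out = "01:02:10:15", "01:02:25:00"
    duration = tc_to_frames(shot_out) - tc_to_frames(shot_in)
    print(frames_to_tc(duration))   # shot length: 00:00:14:15

This is the same arithmetic that logging software performs when it imports in and out points and reports a duration for each clip.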

THE ART AND TECHNIQUES OF EDITING: MULTIPLE CAMERAS AND POSTPRODUCTION

Directors edit by:

1. Selecting the appropriate shots (camera shots or prerecorded video)

2. Deciding on the order and duration of each shot

3. Deciding on the cutting point (when one shot is to end and the next to begin)

4. Deciding on the type of transition between shots

5. Creating good continuity

Let’s look at these points in more detail.

FIGURE 16.13
Slates, or clapboards, are often used to identify each shot taken. The numbers on the slate are transferred to the logging sheet.

EDITING IN 3D

“You have to take into account that what seems slow in 2D may be just what the eye wants to see in 3D. There are times when your instinct tells you to cut faster in flat space but you have to slow it down in this new dimension. For example, in the situation where a person is sliding down a zip line toward the camera, I’d chop it into many individual shots if it was a fast music video in 2D. But, in 3D, I’d lock off the camera and let the person come up to the lens and fly out of frame.”

Shane Marr, Editor

FIGURE 16.14
Editing in 3D may reduce the number of edits in a production. (Photo courtesy of Panasonic)

Selecting the Appropriate Shots

MULTICAMERA EDITING

The director needs to review the available shots on the monitors and determine which one works best to tell the story (Figure 16.15).

FIGURE 16.15
This sitcom director, sitting out on the set, is deciding which of the cameras best communicates the story.

POSTPRODUCTION EDITING

It is a normal practice to shoot much more material than can be used on the final video. As the video can be immediately checked for quality, the director knows when the needed material has been captured. When the shooting is finally complete, it is time to review the footage. Generally, the following shots are found:

• Good shots that can easily be used

• Shots that cannot be used due to defects or errors of various types

• Repeated shots (retakes to achieve the best version)

• Redundant shots (too similar to others to use)

So the first stage of editing is to determine which of the available video should be used. Once the shots are chosen, the next step is to decide on the order in which they will be presented.

The Order of Shots

To edit successfully, the editor must imagine being in the position of the audience. He or she is seeing a succession of shots, one after another, for the first time. As each shot appears, the editor must interpret it and relate it to previous shots, progressively building up ideas about what he or she is seeing.

In most cases, the shots will be shown in chronological order. If the shots jump around in time or place, the result can be extremely confusing. (Even the familiar idea of “flashbacks” works only as long as the audience understands what is going on.)

When a series of brief shots are cut together, the fast pace of the program will be exciting, urgent, and sometimes confusing. A slow cutting rhythm using shots of longer duration is more gentle, restful, thoughtful, and/or sad.

In most circumstances, you will find that the order in which the series of shots are presented will influence your audience’s interpretation of them. Even a simple example shows the nuances that easily arise: a burning building, a violent explosion, men running toward an automobile. Altering the order of these shots can modify what seems to be happening:

• Fire—automobile—explosion: Men killed while trying to escape from fire.

• Fire—explosion—automobile: Running from fire, men escaped despite explosion.

• Automobile—explosion—fire: Running men caused explosion, burning the building.

Not only is the imagination stimulated more effectively by implication than by direct statement, but indirect techniques also overcome many practical difficulties.

Suppose you join two shots: a boy looking upward, and a tree falling toward the camera. One’s impression is that a boy is watching a tree being felled. Reverse the shots and the viewer could assume that the tree is falling toward the boy who, sensing danger, looks up. The actual images might be totally unrelated—they’re just a couple of shots from a stock library.

Where Should the Edits be Made?

The moment chosen for a cut affects the visual flow of the program.

Directors usually transition at the following points:

• At the completion of a sentence, or even a thought

• When the talent takes a breath

• Whenever a reaction or clarifying shot is needed

• About a third of the way into an action, such as standing up (this is a rule of thumb that can be broken)

If the first shot shows a man walking up to a door to open it, and the second shot is a closeup of him grasping the handle, the editor usually has to make sure that there is:

• No missing time (his arm hasn’t moved yet … but his hand is on the handle in the close-up)

• No duplicated time (his hand takes hold of the handle in the first shot, then reaches out and grasps it again in the close-up)

• No overextended time (his hand takes the handle in the first shot, and holds it … and, in the second shot, is still seen holding it waiting to turn it)

There are occasions when editors deliberately “lose time” by omitting part of the action. For instance, a woman gets out of a car, and a moment later we see her coming into a room. We have not watched her through all the irrelevant action of going into the house and climbing the stairs. This technique tightens up the pace of the production and leaves out potentially boring bits during which audience interest could wane. Provided that the audience knows what to expect, and understands what is going on, this technique is an effective way of getting on with the story without wasting time.

Similarly, it is possible to “extend time,” creating a dramatic impact. We see someone light the fuse of a stick of dynamite—cut to people in the next room—cut to the villain’s expression—cut to the street outside—cut to him/her looking around—cut to the fuse, and so on, building up tension in a much longer time than it would really have taken for the fuse to burn down and explode the dynamite.

Special Effects

Most nonlinear editing systems include a number of special effects that can be used to enhance the project. However, directors must be careful to use them appropriately. Overuse of special effects is the sign of an amateur production. Here is a brief list of typical effects:

• Freeze frame: Stopping movement in the picture, and holding a still frame.

• Strobe: Displaying the action as a series of still images flashed onto the screen at a variable rate.

• Reverse action: Running the action in reverse.

• Fast or slow motion: Running the action at a faster or slower speed than normal.

• Picture in picture: A miniature picture inserted into the main shot.

• Mosaic: The picture is reduced to a pattern of small single-colored squares of adjustable size.

• Posterizing: Reduces tonal gradation in the image.

• Mirror: Flipping the picture from left to right, or providing a symmetrical split screen.

• Time lapse: Still frames shot at regular intervals. When played back at normal speed, the effect is of greatly speeded-up motion.
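Several of the effects in this list (freeze frame, reverse action, fast or slow motion) are simply different ways of resampling the order and timing of a clip’s frames. The sketch below illustrates the idea only; the frame list is a stand-in for decoded video, and the function name and 30 fps rate are invented for the example.

    # Retiming effects as index resampling over a list of frames.
    # The frame list is a placeholder; a real clip would come from a decoder.
    frames = [f"frame_{i:04d}" for i in range(120)]   # 4 seconds at an assumed 30 fps

    def retime(clip, speed):
        """speed 2.0 = fast motion, 0.5 = slow motion (frames are repeated)."""
        length = int(len(clip) / speed)
        return [clip[min(int(i * speed), len(clip) - 1)] for i in range(length)]

    slow = retime(frames, 0.5)            # plays twice as long on screen
    fast = retime(frames, 2.0)            # plays in half the time
    reverse = frames[::-1]                # reverse action
    freeze = frames[:60] + [frames[59]] * 30 + frames[60:]   # hold one frame for a second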

What Transition Should be Used?

Transitions play a significant role in the audience’s understanding of what is going on in a scene (Figure 16.16).

FIGURE 16.16
A cut portrays something happening in real time, a dissolve and wipe imply a change of time and/or location, and a fade implies the end of a segment or show. (Photos by Josh Taber)
(A) The cut (or take): An instantaneous switch from one shot to another.
(B) The dissolve: An effect produced by fading out one picture while fading in another.
(C) Fade: A fade signifies a dissolve transition to or from black.

CUT

The cut or take is the most common, general-purpose transition. It is an instantaneous switch from one shot to another—a powerful dynamic transition that is the easiest to make.

DISSOLVE

The dissolve is an effect produced by fading out one picture while fading in another; a quiet, restful transition. A quick dissolve tends to imply that the action in the two scenes is happening at the same time. A slow dissolve suggests the passing of time or a different location. If a dissolve is stopped halfway, the result is a superimposition.

WIPE

The wipe is a novel transition that can have many different shapes. While it is occasionally effective, it can be easily overused and quickly become the sign of an amateur.

FADE

A fade is a gradual change (dissolve) between black and a video image. For example, at the end of a program there is usually a “fade to black,” or if there is a “fade-up,” it means that the director is transitioning from black to a video image. A slow fade suggests the peaceful end of action. A fast fade is rather like a “gentle cut” used to conclude a scene.

Good Continuity

Let’s say we are watching a dramatic television show. As the director switches from one camera to the next, we notice that in the close-up, the talent’s hair is askew, but in the second camera’s medium shot, the talent’s hair is perfect. Cutting between the two shots in the editing room exposes a continuity error. If we see a series of shots that are supposed to show the same action from different angles, we do not expect to see radical changes in the appearance of things in the various images. In other words, we expect continuity.

If a glass is full in one shot and empty in the next, we can accept this—if something has happened between the two shots. But if someone in a storm scene appears wet in all the long shots, but dry in the close-ups, something is wrong. If they are standing smiling, with an arm on a chair in one shot but with a hand in a pocket and unsmiling when seen from another angle in the next shot, the sudden change during the cut can be very obvious. The sun may be shining in one shot and not in the next. There may be aircraft noises in one but silence in the next. Somebody may be wearing a blue suit in one shot and a gray one in the next. These are all very obvious—but they happen. In fact, they are liable to happen whenever action that is to appear continuous in the edited program stops and restarts.

There is an opportunity for a continuity error when the crew:

• Stops shooting, moves the camera to another position, and then continues the shoot

• Repeats part of an action (a retake); it may be slightly different the second time, so you cannot edit unobtrusively with the original sequence

• Shoots action over a period of time: part of it one day, and the rest of the scene on the next day

• Alters how they shoot a scene, after part of it was already shot

The only way to achieve good continuity is to pay attention to detail. Sometimes a continuity error will be much more obvious on the screen than it was during shooting. It is easy to overlook differences when concentrating on the action and the 101 other things that arise during production. If there are any doubts, there is a lot to be said for reviewing the recording to see previous shots of the scene before continuing shooting.

How to Use Transitions

THE CUT

The cut is the simplest transition. It is dynamic, instantly associating two situations. Sudden change has a more powerful audience impact than a gradual one, and that is the strength of the cut.

Cutting, like all production treatment, should be purposeful. An unmotivated cut interrupts continuity and can create false relationships between shots. Cutting is not the same as repositioning the eyes as we glance around a scene, because we move our eyes with a full knowledge of our surroundings and always remain correctly oriented. On the screen, we know only what the camera shows us, although guesses or previous knowledge may fill out the environment in our minds.

THE FADE

Fade-in

A fade-in provides a quiet introduction to action. A slow fade-in suggests the forming of an idea. A fast fade-in has less vitality and shock value than the cut.

Fade-out

A quick fade-out has rather less finality and suspense than a cut-out. A slow fade-out is a peaceful cessation of action.

Crossfade or Fade-out/-in

Linking two sequences, the crossfade introduces a pause in the flow of action. Mood and pace vary with their relative speeds and the pause time between them. This transition can be used to connect slow-tempo sequences in which a change in time or place is involved. Between two fast-moving scenes, it may act as a momentary pause, emphasizing the activity of the second shot.

THE DISSOLVE

As mentioned earlier, a dissolve is produced by fading out one picture while fading in the next. The two images are momentarily superimposed; the first gradually disappears, being replaced by the second.

Dissolving between shots provides a smooth restful transition, with minimum interruption of the visual flow (except when a confusing intermixture is used). A quick dissolve usually implies that their action is concurrent (parallel action). A slow dissolve suggests differences in time or place.
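In digital terms, a dissolve is just a weighted blend of the outgoing and incoming frames, with the weight moving from 0 to 1 over the length of the transition; a fade is the same operation with black standing in for one of the shots. A small sketch using NumPy, with arbitrary frame sizes and a made-up 30-frame transition length:

    # Sketch of a dissolve as a per-frame weighted blend of two shots.
    # Frame size, transition length, and the synthetic frames are arbitrary examples.
    import numpy as np

    def dissolve(frame_a: np.ndarray, frame_b: np.ndarray, progress: float) -> np.ndarray:
        """progress runs 0.0 (all A) to 1.0 (all B); 0.5 is a superimposition."""
        mixed = (1.0 - progress) * frame_a.astype(float) + progress * frame_b.astype(float)
        return mixed.astype(np.uint8)

    outgoing = np.full((480, 640, 3), 200, dtype=np.uint8)   # stand-in for shot A
    incoming = np.zeros((480, 640, 3), dtype=np.uint8)       # black, so this acts as a fade-out

    transition = [dissolve(outgoing, incoming, i / 29) for i in range(30)]  # 1 second at 30 fps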

Dissolves are often comparative:

• Pointing out similarities or differences

• Comparing time (especially time passing)

• Comparing space or position (dissolving a series of shots showing progress)

• Helping to relate areas visually (when transferring attention from the whole subject to a localized part)

Dissolves often show a change of time or location and are widely used as “soft” cuts, to provide an unobtrusive transition for slow-tempo occasions in which the instant nature of a cut would be disruptive. Unfortunately, they are also used to hide an absence of motivation when changing to a new shot!

A very slow dissolve produces sustained intermingled images that can be tediously confusing or boring.

THE WIPE

The wipe is a novel visual transition that is often used to indicate a change of time or location, or simply as a decorative transition.

Although the wipe can add novelty to transitions, it can easily be overused. The audience can easily pay more attention to the wipe effect than to the storyline it is intended to move forward. The wipe can also draw attention to the flat nature of the screen, destroying the three-dimensional illusion.

Wipes have many geometric forms with a variety of applications. For example, a rectangular wipe may be used as a transition between close-up detail (entertainers) and an extreme long shot of the venue.

The Wipe’s Split Screen

If a wipe is stopped before it is complete, the screen remains divided, showing part of both shots. In this way, you can produce an inset, revealing a small part of a second shot or, where the proportions are more comparable, a split screen (Figure 16.17).

FIGURE 16.17
Wipes can be used in many different situations to enhance the viewing experience.
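One way to picture a wipe is as a hard-edged mask sweeping across the frame; halting the sweep partway is exactly what leaves a split screen. A minimal sketch of a horizontal wipe, again with NumPy and arbitrary frame contents (the function name is invented):

    # Sketch of a horizontal wipe: pixels left of the boundary come from the incoming
    # shot, pixels to the right from the outgoing shot. Freezing `progress` at 0.5
    # leaves a half-and-half split screen.
    import numpy as np

    def horizontal_wipe(outgoing: np.ndarray, incoming: np.ndarray, progress: float) -> np.ndarray:
        width = outgoing.shape[1]
        boundary = int(progress * width)          # how far the wipe has travelled
        result = outgoing.copy()
        result[:, :boundary] = incoming[:, :boundary]
        return result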

The split screen can show us things simultaneously:

• Events taking place at the same time

• The interaction of events in separate locations (such as a satellite feed)

• A comparison of appearance and/or behavior of two or more subjects

• A before-and-after comparison (developments, growth, etc.)

Cause–Effect Relationships

Sometimes images convey practically the same idea whichever way they are combined: a woman screaming and a lion leaping. But there is usually some distinction, especially where a cause–effect relationship can be inferred.

Cause–effect or effect–cause relationships are a common link between successive shots. Someone turns his or her head—the director cuts to show the reason. The viewer has become accustomed to this concept. Occasionally, you may deliberately show an unexpected outcome:

1. Two men are walking along a street.

2. Close-up shot of one of the men who is intently telling a story and eventually turns to his companion.

3. Cut to shot of his companion far behind, window-gazing.

The result here is a bit of a surprise and a little amusing. However, sometimes the viewer expects an outcome that does not develop and then feels frustrated or mystified, having jumped to the wrong conclusions:

1. Shot of a lecturer in long shot beside a large wall map.

2. Cut to a close-up of map.

3. Cut back to the lecturer who is now in an entirely different setting.

The director used the close-up of the map to relocate the speaker for the next sequence, but the viewer expected to find the lecturer beside the map, and can become disorientated.

Even more disturbing are situations where there is no visual continuity, although action has implied one:

1. Hearing a knock at the door, long shot of the girl as she turns.

2. Cut to a shot of a train speeding through the night.

The director thought that this would create tension by withholding the identity of the person who was outside the door. However, this inadvertently created a false relationship instead. Even where dialogue or action explains the second shot, this is usually an unsuitable transition. A mix or fade-out/in would have prevented the confusion.

Montage

In a montage, a series of images are presented that combine to produce an associative effect. These images can be displayed sequentially or as a multiple-image screen (Figure 16.18).

SEQUENTIAL MONTAGE

One brief shot follows another in rapid succession, usually to convey a relationship or an abstract concept.

FIGURE 16.18
Montages can be created a number of ways, including: (A) a rapid succession of related images and (B) juxtaposed images, such as this quad split. (Photos in bottom image courtesy of the U.S. Department of Defense)

MULTIPLE-IMAGE MONTAGE

Several images can be shown at the same time by dividing the screen into two or four segments. Although more segments can be used, images become so small that they can lose their impact. These images may be of the same subject, or of several different subjects. They can be stills (showing various stages as an athlete completes a pole vault) or moving pictures (showing different people talking from different locations).

Multiple-image montages can be used for many different purposes—to show steps in a process, to compare, to combine different viewpoints, to show action taking place at different places, to demonstrate different applications of a tool, to show variety, and so on.

Duration of Shots

If a shot is too brief, the viewer will have insufficient time to appreciate its intended information; if it is held too long, the viewer’s attention wanders, drifting to the sound and eventually to switching channels. The limit for most subjects is roughly 15 seconds, depending on the complexity of the shot. A static shot has a much shorter time limit!

The “correct” duration for a shot depends on its purpose. We may show a hand holding a coin for half a minute as its features are described by a lecturer, whereas in a drama, a one-second shot can tell us that the thief has successfully stolen it from the owner’s pocket.

Many factors influence how long a shot can be held:

• The amount of information you want the viewer to assimilate (general impression, minute detail)

• How obvious and easily discernable the information is

• Subject familiarity (its appearance, viewpoint, associations, etc.)

• How much action, change, or movement the shot contains

• Picture quality (detail and strong composition hold most interest)

During an exciting scene, for example, when the duration of shots is made shorter and shorter as the tension grows, the audience is conscious only of growing agitation and fast-moving action (Figure 16.19).

Audience attention is normally keyed to production pace. A short flash of information during a slow-tempo sequence may pass unnoticed, yet in a fast-moving sequence it would have been fully comprehended.

FIGURE 16.19
Tension can be increased by quicker cutting. Here an increasing cutting rate is combined with closer and closer shots. (Photos by Tyler Young)

Priority: Video or Sound?

It is worth remembering during the editing phase that either the pictures or the audio may be given priority. For example, the dialogue has priority when shooting an important speech. Although the camera should focus on the speaker, a single unchanging shot would become visually boring, even with changes in shot size. To make it more interesting, a number of “cutaway shots” are usually used of the audience, special guests, reactions, and so on. But the dialogue is continuous and unbroken—even when editing the image.

If the speech is too long, it may need to be edited in postproduction in order to hold the audience’s attention. Generally, the most important passages are then edited together. In this situation, it would be easy for the audience to see that segments had been removed; therefore, shots of the audience may need to be placed over the edits.

There are occasional scenes in which two people are supposed to be speaking to each other, although they were actually shot separately. For instance, all the shots of a boy stranded on a cliff would be taken at the same time (with dialogue). All the shots and comments of his rescuer at the top of the cliff would be shot at another time. During editing, the shots with their respective speech would be cut together to provide a continuous conversation.

So there are times when the images have priority, and the sound must be closely related to what we are seeing. Other times, the sound will be the priority, and everything has to be edited to support that sound.

Good Directing/Editing Techniques

If editing is done well, the audience does not notice it, but is absorbed in its effect. There are certain established principles in the way one edits, and although like all “rules” they may be occasionally disregarded, they have been created out of experience. Here are a few of the most common:

• Avoid cutting between shots of extremely different sizes of the same subject (close-up to long shot). It is a bit jolting for the audience.

• Do not cut between two shots of the same size (close-up to close-up) of the same subject. It produces a jump cut (Figure 16.20).

• If two subjects are going in the same direction (chasing, following), have them both going across the screen in the same direction. If their screen directions are opposite, it suggests that they are meeting or parting.

FIGURE 16.20
Matching cuts: people.
When cutting between images of people, avoid the following distracting effects:

(A) Mismatched camera angles.

(B) Changes in headroom.

(C) Jump cuts: Avoid cutting between shots that are only slightly different in size. The subject suddenly appears to jump, shrink, or grow.

(Photos by Josh Taber)

• Avoid cutting between still (static) shots and moving images (panning, tilting, zooming, etc.), except for a specific purpose.

• If you have to break the continuity of action (deliberately or unavoidably), introduce a cutaway shot. But try to ensure that this relates meaningfully to the main action. During a boxing match, a cutaway to an excited spectator helps the tension. A cutaway to a bored spectator (just because you happen to have the unused shot) would be meaningless, although it can be used as a comment on the main action.

• Avoid cutting to shots that make a person or object jump from one side of the screen to the other.

Anticipating Editing

It does not matter how good the video images are; if they are inappropriate, they may be unusable. As you plan your shots, keep in mind that the transition from one image to another has to work smoothly. Following are some of the issues to think about when shooting.

MULTICAMERA AND POSTPRODUCTION EDITING

• Edits should be motivated. There should be a reason for the edit.

• Avoid reverse-angle shots (shots from the other side of the axis of action) unless needed for a specific reason (such as slow-motion shots of a sports event), or if it is unavoidable (such as when crossing the road to shoot a parade from the other side). Include head-on shots (frontal shots) of the same action. These shots can work as transitional shots.

• Keep “cute shots” to a minimum, unless they can really be integrated into the program. These include subjects like reflections, silhouettes against the sunset, animals or children at play, footsteps in the sand, and so on. They take up valuable time and may have minimal use. However, there are times when beauty shots have their place, such as an establishing shot.

• Where possible, include features in shots that will help provide the audience with the context of the event. This helps them identify the specific location (such as landmarks).

• Always check what is happening in the background behind the talent or subject. Distractions, such as people waving, trash cans, and signs, can take the audience’s attention away from the main subject. When shooting multiple takes of a scene, watch the background for significant changes that will make editing the takes together difficult.

POSTPRODUCTION EDITING

• Include cover shots (long shots) of action wherever possible to show the overall view of the action.

• Always leave several seconds of run-in and run-out (sometimes called heads and tails) at the start and finish of each shot. Do not begin recording just as the action is beginning or the talent is about to speak, or stop immediately when the action or speech finishes. Spare footage at the beginning and end of each shot will allow more flexible editing.

• Include potential cutaway shots that can be used to cover edits when any sequence is shortened or lengthened. These could include crowd shots, long shots, and people walking by.

• Try to anticipate continuity. If there are only a few shots taken in daylight and others at night, it may not be practical to edit them together to provide a continuous sequence.

• Where there is going to be commentary over the video (voiceover), allow for this in the length and pace of takes. For example, avoid inappropriately choppy editing due to shots being too brief. (Editors sometimes have to slow-motion or still-frame a very short shot to make it usable.)

• Plan to include long shots and close-up shots of action, to provide additional editing options. Where the action shows people crossing a bridge, for example, a variety of angles can make a mundane subject visually interesting: an LS—walking away from camera toward the bridge; MS—walking on the bridge, looking over; XLS—shooting up at the bridge from the river below; LS—walking from the bridge to the camera on the far side; and so on.

• Remember that environmental noises can provide valuable bridging sound between shots when editing. They can be recorded as a wild track (unsynced sound).

• Wherever possible, use an identifying board or slate at the start of each shot. Otherwise, the talent or camera operator can state the shot number so that the editor knows where it goes in the final production.

DIRECTING/EDITING ETHICS

Editing is a powerful tool. And we cannot forget that the way a sequence is directed or edited can and should strongly influence an audience’s interpretations of what is happening. Editing can manipulate—sometimes unwittingly—and, particularly in factual programs (newscasts, documentaries), one needs to be aware of the underlying ethics of certain treatment. A sequence of pictures can be selective, misleading the audience:

• Deliberately avoiding significant reactions: cutting out enthusiastic applause or heckling during a speech.

• Omitting important action or dialogue: When a person rises, turns to another, bows reverently, and slowly leaves the room, this action could be edited so that we see the person rise, open the door, and exit—apparently departing abruptly and unceremoniously, and giving a very different impression of events.

• Introducing misleading or ambiguous action: During a speech, cutting between shots of people leaving or of a person in the audience yawning or sleeping.

• Introducing false material: Showing enthusiastic applause that is actually associated with a speech different from the one we have been watching.

REVIEW QUESTIONS

1. Describe the two main types of editing (live and postproduction) and explain how they are alike.

2. How does the act of editing two clips together affect the audience?

3. What is the difference between linear and nonlinear editing?

4. Explain the nonlinear editing process.

5. Why log video footage?

6. How do you determine the order of shots?

7. What are the basic switcher transitions and when are they used?

8. Why is ethics an issue when editing?

INTERVIEW WITH A PROFESSIONAL: SCOTT POWELL

Briefly define your job: My job is to take images and sounds and mold them into a coherent story.

What do you like about your job? One of the things I like about my job is that I get to treat people to the fruits of their labor. After many battles have been fought over the script, the casting, the production design, and all of the many frustrations of production, I get to (hopefully) show them that it all turned out well.

What are the types of challenges that you face in your position? The types of challenges that face me have to do with taking whatever is dropped in my lap and weaving it into something that stirs the senses. Whether it’s exciting, emotional, suspenseful, or sad, the editor’s job is to make the most out of each moment. We shape an actor’s performance, one line, sometimes one word at a time.

Are there specific things that you do to prepare to work on a production? What I do to prepare is read the script and then just dig in. As the scenes come in I cut them. I use the lined script as a road map to my material and I let the material guide what I do.

What suggestions or advice do you have for someone interested in a position like yours? My best advice is to go with your gut. Use your own instincts to find the emotional rhythm of a scene. Learn the rules of editing and make them second nature before you start breaking them.

FIGURE 16.21
Scott Powell, Editor

Scott Powell is an editor who has worked on almost 30 different television shows including: Hawaii Five-O, 24, The Lot, and Lost City Raiders.
