Chapter Five
When to Cut and Why: Factors that Lead to Making an Edit

 

Information

Motivation

Shot Composition

Camera Angle

Continuity

Sound

Editing a motion picture is more than just assembling a bunch of shots one after the other. The editor is tasked with creatively arranging various picture and sound elements so that the audience who view the finished piece will be moved, entertained, informed, or even inspired. This last statement really highlights a fundamental aspect of any motion media production (whether it is a feature film, a commercial, a situation comedy, a corporate video, etc.). The main purpose behind almost any edited project is for it to be shown to an audience. As an editor, you are challenged with crafting an experience that will affect a viewer in some way – hopefully in a way the producers of the project intend.

The material for the project was written with an audience in mind, the footage was recorded with an audience in mind, and you must edit it together with an audience in mind. And this is not just any audience, but the specific audience that the project is targeting. The same people who would watch a documentary about late nineteenth-century North American textile mills may not care to see a movie about the pranks at a college fraternity. Understanding audience expectations and their degree of involvement in the program will be an excellent skill to develop during your editing career. Anticipating the needs of the viewer will go a long way toward aiding your approach to editing the material.

Of course, different genres of motion pictures, television/episodic programming, and web videos, etc. may all require different editorial approaches. The unique content, exhibition avenues, and specific target audiences will necessitate the use of different editing styles, techniques, effects, etc. As you may be just starting out on your filmmaking journey and your editing career path, you should be watching as many of these motion media products as possible. You will begin to see how they are each treated, the common approaches taken, the presence or lack of certain shared aspects or elements, etc. Really watching and getting a feel for the work of other editors is a great way to help to train your brain. Over time, you will most likely develop a solid interest and rather strong skill set in just one of the programming formats (commercials, documentaries, feature films, news, etc.) and you may spend the majority of your editing career in that genre.

But before we get lost in such specifics about the future job you might have, let us return to the goal of our book, which is to discuss the basic grammar of the edit. Although it is true that different editing jobs will call for different editing techniques, it is also true that there are some attributes common to most styles of editing. These common attributes are the elements that your viewing audience (including yourself) have been trained to expect and comprehend when watching a motion picture. People are rarely conscious of these elements, but, through viewing film and television imagery over their lifetime, they subconsciously know how to “read” certain edits and they can easily decipher meaning in the flow of images across the screen. Just as the basic shot types have meaning in the language of film, how an editor strings those shots together in the motion picture also has meaning to the viewer. There is grammar in the edits.

In the previous chapters, we have outlined the stages of the editing process, reviewed the basic shot types of film language, listed the sources of sound files, and examined criteria that will help an editor to assess the best and most usable qualities of the production material. We now need to focus more on the decision-making processes involved with assembling the picture and sound tracks. What factors or elements compel an editor to want to make an edit in the first place? Why cut from one shot to another very particular shot at that very precise moment in time?

The following list is meant to serve as a jumping-off point. These criteria are some of the major reasons for considering a cut when dealing with most material, but, as with many things in motion media production, other factors not mentioned here might come into play. However, using this list will put you in very good shape when editing decisions need to be made.

Information

Motivation

Shot Composition

Camera Angle

Continuity

Sound

Information

A new shot should always present some new information to the viewer. In a motion picture, this may primarily be visual information (a new character entering a scene, a different location shown, an event whose meaning is not quite known yet, etc.), but it may also be aural (voice-over narration, the clatter of horse hooves, a speech, etc.). A smart editor will ask herself or himself several questions regarding the “showing” of the story: What would the audience like to see next? What should the audience see next? What can’t the audience see next? What do I want the audience to see next?

One of the many tasks set up for the editor is to engage the audience both emotionally (to make them laugh, cry, scream in fright, etc.) and mentally (to make them think, guess, anticipate, etc.). Asking the above questions can generate clever and inspired ways of showing the same story. In a mystery, you may purposefully show misleading information, and in a romantic melodrama, you may show the audience information that the characters do not yet know. Regardless of the kind of information presented, the fact that it is there to engage the audience, get them involved, and get them thinking helps to keep them interested in the story. When audience members are thinking and feeling, they are not paying attention to the physical act of the edit, and this engagement helps to keep the motion media pieces running strong and smooth regardless of the genre, content, or storytelling style. It also means that the editor has done his or her job well.

It must be understood that this element of new information is basic to all editing choices. Whenever one shot has exhausted its information content, it may be time to cut to another shot. As you construct a scene in a narrative film or a segment within non-fiction programming, is there new information in the next shot to be added to your sequence? Is there a better choice? Is there another shot, perhaps, from the same scene or from some other source, that does provide new information and fits into the story to make it flow? No matter how beautiful, cool, or expensive a shot may be, if it does not add new information to the progression of the story, then it may not belong in the final edit. Keep in mind, however, that some shots may not possess observable, “physical” information, but they may provide significant tonal value to the scene or the overall mood of the story. These kinds of shots add a different type of “sensory” information to the audience experience.


FIGURE 5.1 Each shot presented to the audience should provide new information about the story to keep them engaged and attentive. The MLS of this cartoon detective cuts to the CU of the nameplate and door buzzer. We learn where, who, and why just in these two juxtaposed images.

Motivation

The new shot that you cut to should provide new information, but what about the shot that you are cutting away from? What is the reason to leave that shot? When is a good time to leave the current shot? There should always be a motivation for making a transition away from a shot. This motivation can be visual, aural, or temporal.

PICTURE – In picture terms, the motivating element is often some kind of movement by a subject or an object in the current shot. It could be as big as a car jumping over a river, or as simple as a slight wink of an eye. What if you are shown a close-up of a young woman reading a book? During the shot, she glances to her left as if looking at something off screen. Editing logic would hold that you might then want to cut away from her close-up and cut to the shot that reveals the object of her interest – what she might be looking at in the film space. The motivation to cut away comes from the movement of the actor’s eyes and the desire that this action initiates in the viewer to see what it is that the character is seeing.

A cut to the next shot, of a wolf in Grandma’s clothing, provides the audience with new information. It may show them what the woman is looking at, and, in this case, it may keep them wondering. Is this wolf character in her reality – actually across the room? Is it simply an unrelated daydream? Is it a flashback based on an entry in the journal that she is reading? Or could the wolf character shot be a representation of her imagining what the content of the novel she is reading actually looks like (Figure 5.2)? The story, genre, concept, and purpose of this motion media piece will help the audience to understand the broader meaning behind why these two video clips are united, but the important thing to remember is that the outgoing clip content (woman, book, eye movement) provides simple yet ample motivation to cut to the incoming clip of the Grandma-wolf.


FIGURE 5.2 The picture content and subtle character movement of the woman directing her eyes away from the book and onto some unseen object provide the motivation for the editor to cut to the reveal shot.


FIGURE 5.3 The picture content and subtle character movement of this horrified young man looking off screen at some unseen object motivate the cut to a new shot. It is obvious that the snow-covered cabin is not the object seen by the man. The reveal shot has been denied or at least delayed. (Photo credit: A – Anthony Martel)

Another tactic might be that you choose to not show the audience what the character is looking at off screen, but, instead, you cut to some other scene altogether for entirely different information and story progression. Not fulfilling the audience’s curiosity keeps them wondering, guessing, and in suspense as to what the person was looking at and what happened to that person after we cut away (Figure 5.3). This approach of delayed gratification can work in drama, horror, comedy – just about any genre – provided you eventually return to that initial character and show something that is the “logical” result of what he was looking at earlier in the narrative. The audience will be pleased that you closed the loop on that particular plot point by finally showing them the object of interest. Either delaying this reveal for too long or never answering the question at all can be a gamble.

SOUND – If you wish to use sound as a motivating element, then you would need to address the editing of both picture and sound tracks more precisely near that transition. The sound that motivates the cut could be generated by something visible in the shot currently on the screen. As a simple example, a medium long shot shows a stoic farmer standing in his kitchen. A teakettle, visible on the stove in this wider shot, begins to whistle. The whistle on the audio track of the MLS can motivate an informational cut to a detailed close-up of the teakettle on the stovetop. Steam shoots up from the spout and the whistling seems louder (Figure 5.4). It should be noted that because the close-up shot actually magnifies the visual size and importance of the teakettle, it could be appropriate to raise the level or volume of the whistle in your audio track as well. This lets the sound produced by that object match the size of the visual object on screen and reflects the “perspective” or proximity of the close-up shot.


FIGURE 5.4 The sound of the teakettle whistling in the wide shot is ample reason to cut to the close-up shot of the kettle steaming away on the stovetop. As a visual metaphor, the boiling kettle can represent the undercurrent of anger being experienced by the frustrated farmer.
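If it helps to see that level adjustment in concrete terms, here is a small illustrative sketch. It uses the common inverse-square rule of thumb (roughly 6 dB louder for every halving of the distance to the sound source) to estimate a “perspective” boost when cutting from the wide shot to the close-up. The distances and the resulting number are invented purely for illustration; in practice, the mix is set by ear.

```python
# Hedged sketch: approximate the "perspective" level change when cutting
# from a wide shot to a close-up, using the inverse-square rule of thumb.
# The distances below are hypothetical; a real mix is judged by ear.

import math

def perspective_gain_db(wide_distance_m, close_distance_m):
    """Approximate dB boost for a sound source when the shot moves closer."""
    return 20 * math.log10(wide_distance_m / close_distance_m)

# Teakettle roughly 4 m from camera in the MLS, 1 m in the CU:
print(round(perspective_gain_db(4.0, 1.0), 1))  # ~12.0 dB louder in the close-up
```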

Now imagine that this story of the stoic farmer and the teakettle gets a bit more involved. Let’s say that this farmer has just found out that if he doesn’t pay his substantial back taxes, a railroad development company is going to take control over his ranch and evict him. The same simple cut to the steaming teakettle can take on symbolic meaning: it acts as a visual metaphor for the farmer’s frustration and anger boiling just under the surface of his stoic façade. This is an example of a concept edit, discussed in more detail in Chapter Six.

A third and more advanced way of using audio as a transition motivator becomes rather complex in its design. An editor may create what is called a sound bridge. A sound, seemingly generated by something not seen or known to the audience, begins underneath Shot A. It motivates the transition into Shot B, where the origin of the strange sound is revealed. To modify our teakettle example slightly, let us say that we are seeing the same frustrated farmer in the kitchen with the teakettle in the MLS. The audience see the steam rise out of the kettle spout and begin to hear what they may interpret as the kettle whistling. This motivates the cut to a new shot of an old-time steam engine train’s whistle blowing.

The sound of the train whistle, laid under the picture track of the kitchen MLS, acts as a motivator to leave that shot – even though the audience may not have initially caught on that the sound of the “teakettle” is, in fact, the sound of a train whistle. The incoming audio, starting earlier than the picture track edit, acts as a bridge leading the viewer into the new location and the new information of Shot B. The audience follow the unexpected transition because the new image of the train whistling gives them information to process and provides them with an unexpected surprise. They quickly figure out that the harsh whistle sound represents the farmer’s building anger toward the manipulative railroad company (Figure 5.5). As you will learn later, this type of sound bridge edit is often referred to as a J-cut.


FIGURE 5.5 The incoming audio track for Shot B (the train whistle) starts playing underneath the outgoing video track of Shot A with the farmer and teakettle apparently whistling. The audience believe the train whistle is the kettle until we cut to the shot of the train. This is an example of a sound bridge – or sound leading picture.
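For readers who like to see the mechanics spelled out, here is a minimal sketch of how a sound bridge like this sits on a timeline. The clip names, durations, and the length of the audio lead are all hypothetical; the only point being illustrated is that Shot B’s audio begins underneath Shot A’s picture, ahead of the picture cut.

```python
# Minimal sketch of a J-cut (sound bridge), expressed as timeline events.
# Clip names, durations, and the 2-second audio lead are hypothetical.

VIDEO_CUT = 10.0   # seconds into the sequence where the picture cuts to Shot B
AUDIO_LEAD = 2.0   # how early Shot B's audio (the train whistle) sneaks in

timeline = [
    # (track, clip, start_time, end_time) in sequence seconds
    ("V1", "shot_A_farmer_kitchen", 0.0, VIDEO_CUT),
    ("V1", "shot_B_train_whistle", VIDEO_CUT, 16.0),
    ("A1", "shot_A_kitchen_ambience", 0.0, VIDEO_CUT),
    ("A2", "shot_B_train_whistle_audio", VIDEO_CUT - AUDIO_LEAD, 16.0),
]

def events_at(t, tl):
    """Return which clips are active at time t."""
    return [(track, clip) for track, clip, start, end in tl if start <= t < end]

# Just before the picture cut, the viewer still sees Shot A but already
# hears Shot B's whistle -- the audio bridges the transition.
print(events_at(VIDEO_CUT - 1.0, timeline))
# [('V1', 'shot_A_farmer_kitchen'), ('A1', 'shot_A_kitchen_ambience'),
#  ('A2', 'shot_B_train_whistle_audio')]
```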

TIME – An editing choice may often come down to the timing. Feeling when to transition from one clip to another may be motivated by the overall pace of the project combined with the rhythm of each particular scene or segment of the video.

If your project is a fictional narrative film, then the pace of a scene – the duration of each shot on the screen – has a lot to do with the energy of the content, emotion, and purpose of the scene. Does your story involve a dramatic bank robbery gone wrong? Is there a tango-like argument between coworkers that shows unexpected romantic tension? Is there a somber scene where a parent must pack up the belongings of her deceased child? Of course, non-fiction programs also require an attention to timing and pacing. A segment for a kids’ show highlighting young teen skateboarders may have a frenetic pace to the edits as opposed to the slower changeover of clips found in a documentary about elderly victims of predatory mortgage lending practices.

The motion picture example of the bank robbery may benefit from quick cutting of different and perhaps odd or canted angles on the scene indicating the confusion, fear, and unpredictable danger brought on by the disruptive and violent act. Both the participants in the scene and the audience are equally disoriented.

The flirtatious argument between unlikely romantic partners may emulate the rhythm of a tango where you cut back and forth and back and forth in equal time, as each character tries to outdo the other. Perhaps the rhythm quickens, as their argument escalates and the shots get tighter and closer until the characters are united in an “intimate” profile two-shot. Are they about to kiss? Then you choose to cut back out to a wide shot that lasts longer on screen to show them regain their composure and walk away in separate directions, having resisted their unanticipated romantic urges.

The grieving parent may be shown in wide shots of very long duration to indicate how still time is now that her child is gone and how isolated she feels – lost in this new world, sad and alone. Perhaps the few transitions in the scene are elliptical in nature (moving forward in time), showing the mother in different stages of grief while handling different objects in her child’s room, as the lighting from the window changes to show the passage of time. Here, “film time” – different from real time – is absolutely under the control of the editor.

The pacing of your overall motion media piece can be much like a traditional roller coaster. There could be slow scenes, moderately fast scenes, and fast scenes combined at different moments of the story’s progression. As on a coaster, in order to go fast, you must first go slow: a long, slow ride up the first big hill, then a fast race down, then some little bumps, then a big hill with an unexpected turn, then racing down again, and finally a few minor bumps and it’s all over before you know it. If it’s all too slow, it may feel like you are not going anywhere and become uninteresting. If it’s all too fast, it may just become an annoying sensory overload without any breaks. Again, the pacing decisions can be motivated by genre, content, and intent, but varying the pace appropriately can give the audience a more engaging ride.
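There is no prescribed yardstick for pacing, but if you want a rough number to compare scenes, average shot length (the mean time between cuts) is one simple measure. The cut times in the sketch below are invented purely to illustrate the contrast between a frantic scene and a slow, somber one.

```python
# Illustrative only: average shot length (ASL) as a crude pacing measure.
# Cut times are hypothetical, in seconds from the start of each scene.

bank_robbery_cuts = [0.0, 1.2, 2.0, 2.9, 3.5, 4.6, 5.2, 6.1, 7.0]
grieving_parent_cuts = [0.0, 14.5, 31.0, 52.5, 78.0]

def average_shot_length(cut_times):
    """Average duration between consecutive cut points."""
    durations = [b - a for a, b in zip(cut_times, cut_times[1:])]
    return sum(durations) / len(durations)

print(round(average_shot_length(bank_robbery_cuts), 2))     # ~0.88 s per shot
print(round(average_shot_length(grieving_parent_cuts), 2))  # ~19.5 s per shot
```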

Shot Composition

Traditionally, the film editor could not significantly alter the composition of visual elements in the footage that he or she was given to edit. The relatively high resolution of 35mm film negative did allow for a certain amount of blow-up or frame enlargement. Standard-definition NTSC video (720 × 486) fell apart rather quickly when you tried to scale the frame. HD (1920 × 1080), being of higher resolution, allows for some blow-up and re-composition. It is only now, with high-end digital video imagers that can achieve an effective 4K, 5K, or 8K image resolution, that more substantial and sometimes requisite reframing can be done. However, none of these scaling options for the frame can significantly reposition subjects or objects within the depth (3D space) of the film space of the shots; that was the job of the director and director of photography during production.
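The arithmetic behind that reframing headroom is straightforward. The sketch below assumes an HD (1920 × 1080) delivery frame, which is an assumption for illustration, and compares how far each source format can be “punched in” before it must be scaled up past 1:1 pixels; real projects would also weigh codec, sensor quality, and lens sharpness.

```python
# Rough arithmetic sketch of the "blow-up" headroom described above.
# The delivery resolution and source formats are assumptions for illustration.

DELIVERY = (1920, 1080)  # assume an HD delivery frame

sources = {
    "SD NTSC": (720, 486),
    "HD": (1920, 1080),
    "UHD 4K": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (w, h) in sources.items():
    # How many times the source can be cropped ("punched in") and still fill
    # the delivery frame; values under 1.0 mean the source must be scaled up.
    headroom = min(w / DELIVERY[0], h / DELIVERY[1])
    print(f"{name}: {headroom:.2f}x punch-in available")
```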

The editor can certainly choose which two shots get joined together at a transition. Provided the correct composition is in the visual material, the editor can help to make the viewing of these images more engaging for the audience.

The easiest or most straightforward option for an editor can be to simply edit in all of the footage from one, long, beautifully composed shot – be it simple, complex, or developing. The beautiful, well-balanced shot was designed by the filmmakers to be a showpiece: it looks great and plays well once cut into the sequence. The audience are given the time to appreciate the shot for what it is as it unfolds in its entirety. Everybody is happy. However, do not be afraid to cut into this showpiece if the pacing, story, or characterizations can benefit from an added visual interruption.

Another simple technique is to take two basic but properly composed shots and transition them one after the other. A two-person dialogue presents the perfect scenario to demonstrate this. Your scene starts with a wide shot of two people sitting across the table from one another having a discussion. As Character A speaks, you cut to a close-up of Character A. He is sitting frame left with his look room opening across to frame right. Audiences have grown to expect that you will cut over to Character B listening in a matching close-up. She is sitting over on frame right with her look room opening across to frame left (Figure 5.6).

Using the alternating placement of characters frame left and frame right generates interest in the audience and causes them to stay engaged with the progression of the motion picture. As you cut from CU to CU, the audience get to experience eye-line match and eye trace across the screen and across the edit.

When the viewers are watching Character A speak over on frame left, their attention is over on frame left. They are aware, however, that Character A’s attention is actually across the screen over on frame right. When the cut to Character B comes, the audience trace Character A’s eye-line across the empty screen and rest upon the new face of Character B over on frame right. The compositional placement of Character B should be in such a screen location as to match the eye-line from Character A, so the audience are rewarded for following the eye trace across the screen and across the edit.


FIGURE 5.6 Simple, traditional compositions like these engage the viewers by asking them to trace the matching eye-line across the empty look room on screen.

Like a tennis ball bouncing back and forth across the court, the eyes of the viewing audience will travel back and forth across the screen seeking the new information from each character as you cut from one shot composition to the other. You want to engage the audience’s eye trace without making them search too hard for the next bit of information. Subtle searches around the screen’s compositional elements will keep the viewing experience interesting, and more elaborate searches can make it more surprising. Complex and multi-layered shot compositions can look great on screen, but be aware of how you cut into and out of them. Think of how the audience will locate the new, important visual information within the more complex arrangements of on-screen elements. The more time it takes to find the new area of interest, the more likely the audience are to get frustrated and momentarily pull themselves out of the viewing experience.

Camera Angle

In Chapter Four, we discussed how to review your footage and watch for shots that may have been taken from two positions on set less than 30 degrees apart around the 180-degree arc of the action line. This is one of the key elements of a shot that will help you to determine if it should be cut next to another shot from the same coverage. There has to be a reasonable difference in the camera angle on action for two shots to be “comfortably” edited together.

When the coverage is planned for a scene, certain camera placements or camera angles are considered to be the most advantageous, and they are the shots eventually recorded by the filmmakers. Due to factors of time and money, only certain shot types from certain angles will be recorded and the production team try to fit the most information into the fewest, but best-looking, shots that they can. But an editor will never know from where around the 180-degree arc the camera was placed to record the actions of the scene until he or she reviews the footage. The editor can only do his or her best to place shots of differing horizontal angles (greater than 30 degrees apart) next to one another in the edit – particularly with dialogue scenes covered in the traditional master scene method.

The reason for this is simple. If two shots are recorded with similar framing from two, almost identical angles on action, then their resulting images will look too similar to one another, even though they are slightly different. This similarity will register with the viewer as he or she watches the program and it may appear to the eye as if there is a glitch or a jump in the image at the cut point.

The expression jump cut is used frequently in the analysis of editing for motion pictures. In this case, as in most, it simply refers to the fact that while watching the motion images, the viewer perceives a jump, a glitch, or an extremely brief interruption or alteration to the pictures being shown. In our current example of these clean single shots with angles on action that are too close, we will find that the images of Shot A and Shot B are too similar in their appearance (Figure 5.7).


FIGURE 5.7 Editing together two shots from similar camera angles will cause a jump at the cut point. Differing camera angles and framing will help to prevent the jump cut in the mind of the viewer.
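As a concrete, if simplified, way of stating the guideline, the sketch below checks whether two camera setups differ by at least 30 degrees around the arc. The angle values are hypothetical; no real camera report records setups this neatly, and the 30-degree figure is a rule of thumb rather than a hard law.

```python
# A minimal sketch of the 30-degree guideline discussed above.
# Camera angles are hypothetical positions (in degrees) around the 180-degree arc.

MIN_ANGLE_DIFF = 30.0  # rule of thumb: angles on action should differ by 30+ degrees

def risks_jump_cut(angle_a, angle_b, min_diff=MIN_ANGLE_DIFF):
    """True if two setups are so close in angle that a cut may read as a jump."""
    return abs(angle_a - angle_b) < min_diff

print(risks_jump_cut(45.0, 60.0))   # True  -- only 15 degrees apart: too similar
print(risks_jump_cut(45.0, 110.0))  # False -- 65 degrees apart: reads as a new view
```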

The audience will not see these shots as providing sufficiently different views of the same information. In their eyes, the image will merely jump, which they will consciously notice, and as a result it will pull them out of the viewing experience, which is something that the editor should try to prevent if such a treatment is not part of the project’s established visual style. Jump cuts have become popularized in recent times via music videos, film trailers, and with certain feature film directors, but that does not mean that they are always excusable. To learn more about the purposeful usage of jump cuts in filmmaking, you should research the French New Wave director, Jean-Luc Godard.

Continuity

In traditional editing methodologies, providing smooth, seamless continuity across transitions is a very important element to keeping edits unnoticed by the viewer. This is called continuity editing or invisible editing. Unlike the purposeful use of jump cuts skipping across time chunks, the story of a traditionally paced, continuity-style edit is supposed to move forward, uninterrupted, where the shots of differing magnification (LS, MS, CU, etc.) flow from one to the next as if presenting continuous events. Experimental films, and the French New Wave movement of the early 1960s, established that visual continuity was not absolutely required. Today, many filmmakers disregard the strictness of continuity concerns in favor of best performance editing. Jump cuts, time shifts, and repeated and alternate action and line delivery have become part of an accepted style of editing and storytelling. Starting with the traditional continuity style, however, is a good way to learn the basic approach to visual story editing.

Once again, editors are not responsible for the quality of the footage that they are given, but they are responsible for assembling that material into the best motion media piece possible. If the production team and subjects have not provided visual material with compositional or performance-based continuity, it will be the editor’s job to make up for that deficiency in some way in the editing. And to make matters more interesting, there are actually several different forms of continuity that need to be addressed at various points throughout the editing process. Let us take a look.

Continuity of Content

Actions performed by the on-camera talent should ideally match from one shot to the next. Because actors are obliged to perform the same actions take after take for each shot covered in the scene, you hope that the overlapping actions are consistent. This is not necessarily always the case. The continuity of content should be watched for but may not be so easily fixed.

As an example, if the family dog is sitting in a chair during the wide shot of a dinner table scene, then the dog should also be seen in the tighter shots used to show the remainder of the scene. If the dog had been taken off set and there are no other shots with the dog sitting at the table with the family, then, as the editor, you get to make a choice. Do you begin the family dinner scene without the wide establishing shot that shows the dog? Perhaps you start the scene on a close-up of the character speaking the first line. Perhaps you start with a close-up of a plate of food, then move out to a two- or three-shot. Additionally, you have the option of showing the dog in the wide shot and then laying in the sound effect of the dog walking away on the hardwood or tiled flooring while you show the tighter shots of the family without the dog at the table. Perhaps you cut in a shot of the dog lying on the floor in a different part of the house. Regardless of your approach, you are searching for a solution to a continuity problem.

If a man picks up a telephone in an MLS using his right hand, then the telephone should still be in his right hand when you cut to an MCU of him speaking on the phone. If, for whatever reason, the production team have not provided any MCU shots of the man with the phone in his right hand, but only in his left, then you could cut away to some other shot after the MLS and before the phone-in-left-hand MCU. This will give the audience enough of a break from continuous action so that they can either forget which hand the phone was in, or believe that the man had time to transfer the telephone from his right hand to his left while he was off screen. In this case, the cutaway is any brief shot that will provide the appropriate distraction and time filler to allow the audience to make the leap in logic of object continuity adjustment (Figure 5.8).

So either the footage already contains the proper material to edit with the correct continuity of content, or the editor must create some means of hiding, masking, or “explaining” the visual incongruity. If the audience can be “tricked” into seeing something else, or if the performance presented is sufficiently strong, then the questionable content will most likely be overlooked. No matter the approach taken, the editor is like a sleight-of-hand magician purposefully distracting the eyes of the audience to cover the glitch in the picture.


FIGURE 5.8 Using a cutaway shot may provide the needed break from inconsistent content so that the audience do not consciously notice the discontinuity of the telephone switching in the man’s hands.

Continuity of Movement

Screen direction is the movement of subjects or objects toward the edges of the frame. This should be maintained as you transition from one shot to the next, if that following shot still covers the same movement of the same subjects or objects. The rules of three-dimensional space (left, right, up, down, near, far) still apply inside the fictional, pretend space of the film world. The production team should have respected the scene’s screen direction and the 180-degree rule during the shooting of coverage. If they did not, and the new shot that you would like to use continues your talent’s movement contrary to the established screen direction, then you may have to insert a neutral shot that will continue the narrative and still provide a visual break from the discontinuity of movement. This other inserted shot, of whatever material you have that fits the narrative flow, will offer the audience a visual break that allows the character time to reverse his direction in the third shot continuing the previous action (Figure 5.9).


FIGURE 5.9 Subject movements should maintain screen direction across the edit point. If you wish to cut together two shots that reverse screen direction, then it may be advisable to use an insert shot to break the audience’s attention on the direction of movement.

Continuity of Position

The film space itself has direction and also a sense of place. Subjects or objects within the shot occupy a certain space within the film world as well. It is important for the editor to string together shots where that subject or object placement is maintained continuously. If an actor is shown frame right in the first shot, then he should be somewhere on frame right in any subsequent shots during that scene. Of course, if the actor physically moves during the shot to a different location within the film space, then it is logical to show him on a new side of the frame. A moving camera may also cause the orientation of subjects to shift during a shot. Cutting together two shots that cause the subject or object to jump from one side of the screen to the other will distract the viewer and the illusion of “invisible” editing will be broken (Figure 5.10).


FIGURE 5.10 The physical position of subjects and objects within the film space and the shot composition should stay consistent across edit points. This woman appears to jump from screen right to screen left after the cut to the other character.

Sound

The continuity of sound and its perspective is of critical importance to an edited motion picture. If the scene depicts action that is happening in the same place and at the same moments in time, then the sound should continue with relative consistency from one shot to the next. If there is an airplane in the sky in the first shot, and it can be seen and heard by the viewer, then the sound of that airplane should carry over across the transition into the next shot from that scene. Even if the airplane were not physically seen in the next shot of this sequence, the sound of it would still be audible to the characters and therefore it should still be audible to the audience. An abrupt end to the airplane sound at the cut point would stand out, so the editor would need to add it back in underneath the picture and sound tracks of the second shot – although it should probably have its levels lowered in the mix.

Sound levels for voices, subjects, and objects should be mixed to be consistent with their proximity to the main area of action and with their narrative importance throughout an edited scene. Changes in object distances from camera, either through shot choices or talent movements within the film space, should also be accounted for by raising or lowering volume and panning levels in the audio mix for those shots. The increase or drop-off of perspective, proximity, and presence should be appropriately represented.

Additionally, all spaces have a background noise level. It may be soft, deep, high, or loud, depending on the environment depicted on screen. As we saw in Chapter Three, this ever-present layer of sound is commonly called ambience, but it may also be referred to as atmosphere or natural sound (NATS for short). It is responsible for creating a bed of consistent audio tone over which the dialogue and other more prominent sound effects etc. are placed. This extra, uninterrupted sound bed is either lifted from the production audio recordings (sometimes called room tone), or generated by an editor or sound designer from other sources. This ambience track adds a mood or feeling and generates a layer of believability to the location of the scene for the audience. Its presence should be regulated in the mix so that it is minimized under dialogue, etc., but may become more prominent when it does not compete with other, more important sounds.

Chapter Five – Final Thoughts: Is There a Right or Wrong Reason for a Cut?

Yes and no. As with anything that involves a craft, there are the technical methods and reasons for doing things certain ways, but then there are the aesthetic or creative reasons for doing other things in other ways. How you make an edit and why you make an edit are two different aspects of the process, but they are always interrelated. You can edit a project as you see fit, but in the end, it will be the viewing audience who decide whether your choices were right or wrong. Did the edits work or not? Did the audience notice them or not? As long as you have reasons why you made each edit, you are on the right path. Keeping the various elements mentioned in this chapter in mind and anticipating what your audience would appreciate will keep you thinking about why you choose to edit picture and sound elements when you do.

Related Material Found in Chapter Eight – Working Practices

#9, 12, 14, 15, 16, 21, 22, 23, 25, 27, 28, 31, 32, 33, 35, 36, 43, 48, 51

Chapter Five – Review

  1. Know your audience and remember that you are really editing a story for them to experience.

  2. Each shot you transition into should provide the viewer with new information, or define a tone or mood that progresses the story of the project.

  3. Each transition you create should be motivated by some visual or aural element within the shot you are leaving.

  4. The timing of the shots in each scene and the overall pacing of the entire motion picture should reflect the energy of the content, mood, and emotional levels of the characters and the development of the story.

  5. Juxtaposing appropriately dissimilar shot compositions across the transition leads the viewers’ eyes around the frame as they seek new visual information, and therefore keeps them engaged.

  6. Present differing camera angles and shot types to the viewers within a given scene or sequence so that they will not experience a temporal or spatial jump cut – unless that is a deliberate visual style in your project.

  7. Ensure, as best as possible, that your transitions conform to the appropriate continuity of content, movement, position, and sound if you are going for the “invisible” editing style.

Chapter Five – Exercises

  1. Watch a scene or section from any motion media project. Get a feeling for the frequency of shot transitions and see whether it remains relatively constant or if it increases or decreases at particular moments. Does the frequency change with changes in the emotional content of the program?

  2. Using the same scene or section, determine what factor(s) led to the edits occurring when they do. Is it information, motivation, composition, camera angle, continuity, or a combination of several? Is it something entirely separate from these factors?

  3. Edit together a brief sequence (maybe ten clips) of any source video that you have available. It helps if the material is of all different subject matter. When you edit it together, take notes on what object or portion of the screen you look at just after each cut point. Show this same sequence to a friend or classmate and have him or her tell you the first thing he or she looks at after each cut. Screen the sequence several more times to other individuals and record their objects of interest or the areas of the frame that they look at. Compare what you, the editor, chose to look at in each new clip with the responses of your selected viewers. Do they match? Is there a trend? Are they all different? What might these results mean?

  4. Create a quick scenario where a sound bridge across the cut could be applied. You only need two shots: the end of one segment, A, and the beginning of the next segment, B. Cut two versions of the transition: one with a straight cut from A to B, and one with the same straight cut but with your bridging sound clip across the cut underneath both the end of A and the start of B. Play the two versions back to back. Which do you prefer? Which do others prefer?

Chapter Five – Quiz Yourself

  1. You are given five shots: 1. a WS of a high school cafeteria; 2. an ECU of a mobile phone text message saying, “Duck!”; 3. an MS of a boy being hit by a volley of green peas; 4. a CU of the same boy looking off screen at something; 5. an MLS of the same boy seated at his table taking his phone out to look at it. In what order would you edit these five clips and what factors play into your decisions?

  2. What significance does the shot composition have when you cut from one shot to the next? How can these compositions engage the audience?

  3. How can a mismatch in screen direction or screen position from two different coverage shots challenge an editor cutting a dialogue scene?

  4. The background sound of an environment or location within a film has several names. List as many of the names as you remember.

  5. How can the pacing of a motion picture be like an amusement park ride?

  6. What is a “visual metaphor” and how can it be used in motion media pieces, both fiction and non-fiction?

  7. How could a cutaway shot help to join together two other shots that display performance or action discontinuities?

  8. What was at least one trait popularized by films made under the French New Wave movement of the early 1960s?

  9. True or false: pacing is not a significant aspect of editing videos or a major concern for editors fine cutting a sequence.

10. Create two fiction film scenarios: one where a simple, visible action motivates a cut to a new shot, and one where a sound element within the scene motivates a cut to a new shot.
