CHAPTER

2   The Production Process: Analog and Digital Technologies

•  What are the three stages of production?

•  What are the differences between digital and analog production techniques?

•  Why is production terminology different?

•  Who makes up the production team?

•  What is production aesthetics?

•  How has the history of production developed?

Introduction

Media production requires both analog and digital technologies. The advent of digital technologies stimulated a number of important changes in media production, including the convergence of technologies as well as corporate integration. This chapter explores significant developments encouraged by digital media at the same time that it confirms the continuing value of some analog technologies and provides an overview of the media production process.

The digital revolution describes a process that started several decades ago. Technicians developed uses for the technology based on a two-value or binary system of “1” and “0” (“on” and “off”) instead of a multiple continuous-value analog system of recording and processing audio and video signals. Rather than a revolution, it has been an evolution, as digital equipment and techniques have replaced analog equipment and processes where practical and efficient. Digital equipment may be manufactured smaller, requiring less power and producing higher-quality signals for recording and processing. As a result, reasonably priced equipment, within the reach of consumers, now produces video and audio productions that exceed the quality of those created by professional equipment of two decades ago. But every electronic signal begins as an analog signal and ends as an analog signal, as the human eye and ear cannot directly translate a digital signal (Figure 2.1). Nonetheless, digital technologies have increased the efficiency, flexibility, duplicability, and, in some cases, the reproductive quality of media work in all three stages of production: preproduction, production, and postproduction.

FIGURE 2.1 (A–D) Digital equipment and technologies have entered all phases of audio, video, film, and graphics production. Although digital production techniques are basically the same as analog techniques, there have been marked increases in efficiency, flexibility, lossless duplicability, and, in some cases, reproductive quality. (Courtesy KCTS 9, Seattle, ElectroVoice, Audio-technica, Canon, Vinten, Ross, and Sony Corporations.)

image

STAGES OF PRODUCTION

The production process can be organized into three consecutive stages: preproduction, production, and postproduction. Everything from the inception of the project idea to setting up for actual recording is part of the preproduction stage. This includes the writing of a proposal, treatment, and script, and the breakdown of the script in terms of production scheduling and budgeting. The second major phase of production is the production stage. Everything involved in the setup and recording of visual images and sounds, from performer, camera, and microphone placement and movement to lighting and set design, makes up part of the production stage. Postproduction consists of the editing of the recorded images and sounds, and all of the procedures needed to complete a project in preparation for distribution on various media.

Preproduction

Preproduction consists of the preparation of project proposals, premises, synopses, treatments, scripts, script breakdowns, production schedules, budgets, and storyboards. A proposal is a market summary used to promote or sell a project. A premise is a concise statement or assertion that sums up the story or subject matter. For example, the basic premise of The Panic in Needle Park (1971), whose screenplay was written by Joan Didion and John Gregory Dunne, is “Romeo and Juliet on drugs in New York’s ‘Needle Park.’”

A synopsis is a short paragraph that describes the basic story line. Treatments are longer plot or subject-matter summaries in short-story form, which often accompany oral pitches of a premise or concept, and scripts are virtually complete production guides on paper, specifying what will be seen and heard in the finished product. One can break down a script by listing all equipment and personnel needs for each scene so that a production can be scheduled and budgeted. A budget describes how funds will be spent in each production category. A storyboard provides a graphic visualization of important shots that the camera will eventually record.

Production

Production begins with setup and rehearsal. The film, video, or multimedia director stages and plots the action by rehearsing scenes in preparation for actual recording. Charting the movement of talent on the set is known as performer blocking, while charting the movements of the cameras is called camera blocking. Every camera placement and movement of the talent must be carefully worked out before recording. If the action cannot be controlled, as in the live transmission of a sporting event or the production of a documentary, the director must be able to anticipate exactly where the action is likely to go and place the camera or cameras accordingly. During actual production, the entire project is essentially in the hands of the director. In multiple-camera studio or location production, for example, the director often selects the shots by commanding the technical director (TD) to press certain buttons on a device called a switcher, which makes instantaneous changes from one camera to another. In single-camera production, the director remains on the set and communicates directly with the talent and crew. The script supervisor or continuity person watches the actual recording session with a sharp eye to ensure that every segment in the script has been recorded. Perfect continuity between shots, in such details as a consistent left-to-right or right-to-left direction and identical flow of performer movements (matched action) from one shot to the next, must be maintained so that these shots can be properly combined during editing.

In an audio production or recording session, the producer maintains the same authority and responsibilities as a video or film director: rehearsing the musicians, instructing the engineer, and supervising the actual recording session. In a digital multimedia or interactive production, whether for a computer game, CD-ROM, blue-laser disc, or DVD recording, the producer’s authority and responsibilities are the same, except that the producer may work entirely with digital material rather than with people. In multimedia and interactive production sessions, the producer may well perform all aspects of the production alone: writing, supervising the entire process from preproduction through postproduction, creating the graphics, entering the code that gives the program its digital form, and performing the final editing functions.

Postproduction

Postproduction begins after the visual images and sounds have been recorded (although in live television, the production and postproduction stages occur simultaneously). Possible edit points can be determined during the preview stage, when the recorded images and sounds are initially viewed. Pictures and accompanying sounds are examined and reexamined to find exact edit points before various shots are combined. Separate soundtracks can be added later to the edited images, or the sounds can be edited at the same time as the pictures. The postproduction stage ties together the audio and visual elements of production and smooths out all the rough edges. The visual and audio elements must be properly balanced and controlled. Sophisticated digital devices help editors and technical specialists mold sounds and images into their final form.

In audio postproduction, the emphasis is placed on choosing the best of the many sound takes and combining the various tracks onto one or, in the case of stereo, two finished tracks, or, as in the case of audio for high-definition television (HDTV) for theaters and home theaters, as many as six or more tracks. In motion picture production, the sound editor may use as many as 64 or more tracks to complete the production. Games and other interactive and animated productions also require multichannel audio tracks. Signal processing, including equalization, adding effects, and balancing tracks against each other, is often performed during the sound mix, that is, during the final process of combining various soundtracks. Such processing operations may be performed either in an analog or in a digital format. The tendency is to manipulate audio in a digital format to avoid any degeneration or degradation of the signal.
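The balancing performed during a sound mix can be pictured as a weighted sum of sample values. The following Python sketch is purely illustrative (the function name, the unity-gain convention, and the hard clipping are assumptions made for the example, not the algorithm of any actual mixing console or digital audio workstation):

```python
def mix_down(tracks, gains):
    """Mix several mono tracks into one by a weighted sum of samples.

    tracks: list of equal-length sample lists (values in [-1.0, 1.0])
    gains:  per-track level (1.0 = unity) used to balance tracks
    """
    length = len(tracks[0])
    mixed = []
    for i in range(length):
        total = sum(gain * track[i] for track, gain in zip(tracks, gains))
        # Clip to the legal sample range, as a limiter would
        mixed.append(max(-1.0, min(1.0, total)))
    return mixed

dialogue = [0.5, 0.5, 0.5, 0.5]
music = [0.8, 0.8, 0.8, 0.8]
# Balance: dialogue at full level, music pulled down underneath it
final = mix_down([dialogue, music], gains=[1.0, 0.25])
```

The same weighted-sum idea extends to any number of tracks, which is why a 64-track film mix and a two-track stereo mix differ in scale but not in principle.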

The three stages of production are separate only in a chronological sense. Proficiency in one stage of the production process necessarily requires some knowledge of all other stages. A director or writer cannot visualize the possibilities for recording a particular scene without having some awareness of how images can be combined during editing. In short, although the overall organization of this text into three stages (preproduction, production, and postproduction) follows a logical progression, mastery of any one stage demands some familiarity with other stages as well.

DIGITAL VERSUS ANALOG TECHNOLOGY

Although all three stages of media production have been affected by the advent of digital technologies, analog technologies continue to play important roles in each stage as well. For many years, equipment used in media production was exclusively analog, and many analog technologies, including motion picture film, are still widely used today. In fact, the size and quality of images recorded by some film technologies have never been surpassed. The potential screen size and image detail, or resolution, of projected large-format film images, such as IMAX and even standard 35 mm film, are still superior to those of video projection systems, including digital HDTV projection systems, and they are likely to remain so for some time. The look of film, the softness of the film image, the intense saturation of colors, and film’s superior reflectance contrast range (from bright white to dark black) over electronic media (300+:1 versus 100:1) translate into a sophisticated and subtle visual medium. As Nicholas Negroponte, MIT Media Lab founder and author of Being Digital, said about 10 years ago, “The subtlety and nuance of film is so far out of reach right now. Film is still by far the highest, best-resolution medium we have” (American Cinematographer, May 1995, p. 79). But advances in digital equipment and technology have made the differences between the two media less discernible to audiences. The increased efficiency in production, wider creative options, and greater choice of distribution methods make digital techniques much more attractive to producers of feature films and commercials.

New media technologies rarely eliminate older technologies, although they often make the use of older technologies more specialized. The use of film has become more and more specialized with every advance in electronic imaging technology. For example, advances in videotape recording and editing made it less advantageous for television news operations to use news film during the 1970s, but many prime-time dramatic programs used film and continue to do so. Today, digital editing systems offer a number of advantages over conventional film editing, and digital technologies have virtually replaced analog audiotape and videotape recording and editing technologies in most situations as well.

Digital systems encode audio and video information as a series of ones and zeros, or “on” and “off” signals. A full range of sounds (from loud to soft and from high pitch to low) and images (from bright to dark and from high color saturation to low) can be digitally encoded as a series of zeros and ones. Analog audio and video information, on the other hand, contains a vast range of incremental electrical or photochemical values that are analogous to the sound and image spectrum.

Digital recordings are more permanent and much less likely than analog recordings to lose quality when they are copied from one generation to the next, because a copy needs only to distinguish two values: one or zero, “on” or “off.” Digital encoding also offers increased flexibility and efficiency in manipulating and shaping recorded sounds and visual images during postproduction editing and special effects, because it is easier to manipulate a two-value, or binary, system than a vast range of incremental values (Figure 2.2).
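This two-value encoding can be illustrated with a short sketch. The Python example below (the function name and the 3-bit resolution are illustrative choices for the example, not features of any real recorder) samples a continuous “analog” signal at regular intervals and quantizes each sample to one of a fixed number of binary-coded levels:

```python
import math

def sample_and_quantize(signal, sample_rate, duration, bits):
    """Convert a continuous (analog) signal into discrete digital samples.

    signal:      a function of time returning a value in [-1.0, 1.0]
    sample_rate: samples per second
    bits:        resolution; each sample becomes one of 2**bits levels
    """
    levels = 2 ** bits
    samples = []
    for i in range(int(sample_rate * duration)):
        t = i / sample_rate
        value = signal(t)  # continuous analog value at time t
        # Map [-1.0, 1.0] onto the integer codes 0 .. levels - 1
        code = round((value + 1.0) / 2.0 * (levels - 1))
        samples.append(code)
    return samples

# A 1 Hz "analog" sine wave sampled 8 times per second at 3-bit resolution
wave = lambda t: math.sin(2 * math.pi * t)
digital = sample_and_quantize(wave, sample_rate=8, duration=1.0, bits=3)
```

Each integer code can be written as a string of ones and zeros, which is all a digital recorder ultimately stores; raising the sample rate or the bit depth narrows the gap between the stair-step codes and the original continuous signal.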

FIGURE 2.2 Digital signals are determined by periodically sampling the values of comparable analog signals: the greater the number of samples per second, the higher the quality of the digital signal.

image

Despite the many advantages of digital signals in the production process, there are some disadvantages. Converting an analog signal to a digital signal requires sampling, that is, rapidly and sequentially encoding incremental values as a series of ones and zeros, and sampling drops a portion of the original analog signal in turning the constantly changing analog signal into a stair-step digital signal. In audio, these omissions are small enough that they are seldom missed. In video, the missing portions are details too small for the human eye to detect easily. A greater problem with digital signals is the need to compress them to save bandwidth, or transmission and storage space. Compression removes portions of each digital signal that, ideally, are not missed when the signal is reconstructed during decompression.
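The trade-off involved in lossy compression can be illustrated with a deliberately crude sketch, which should not be mistaken for a real codec: it “compresses” a run of digital samples by discarding every other one, then reconstructs the signal by holding each kept value, losing small details while preserving the overall shape:

```python
def compress(samples, keep_every=2):
    """Toy lossy 'compression': keep only every Nth sample."""
    return samples[::keep_every]

def decompress(compressed, keep_every=2):
    """Reconstruct by repeating each kept sample (a stair-step hold)."""
    out = []
    for s in compressed:
        out.extend([s] * keep_every)
    return out

original = [10, 11, 12, 12, 13, 13, 14, 15]
restored = decompress(compress(original))
# The odd-indexed details are gone, but the signal's overall
# shape survives reconstruction at half the storage cost.
```

Practical codecs are far more selective about which portions they discard, concentrating the loss in details the eye or ear is least likely to miss, but the underlying bargain of storage space against fine detail is the same.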

Digital technologies have increased the speed, efficiency, duplicability, and flexibility of film and TV production. It is important to recognize some of the contributions that digital technologies have made to each stage of media production, including preproduction writing, producing, and storyboarding; production recording and lighting; and postproduction editing and special effects. Overall, digital technologies have significantly increased production efficiency in each of these areas, and they have begun to alter conventional notions of reality and history through a proliferation of imaginative and realistic special effects.

Digital Technologies Used in Preproduction

Film and TV preproduction stages consistently use digital computers. Scriptwriting and word-processing computer software programs help writers efficiently format and revise scripts. Producers and production managers use scheduling and budgeting software programs to quickly break down scripts and preplan their productions. Breakdown sheets list all equipment and personnel needs for each scene in a film or TV script. The cost of each of these items can quickly be totaled to produce an overall budget, whereas the duration and sequence of recording each scene can be used to create an overall production schedule. Computers quickly and efficiently make changes in a script, a budget, or a schedule.

Computer graphics software facilitates the creation of storyboards, which can provide visual guidelines for camera shots, editing, and overall storytelling. A storyboard consists of a series of graphic images that indicate the camera framing and composition for each shot in a film or TV program. Previsualization (pre-viz) has extended the use of preproduction storyboards by creating digital storyboards that may be full-motion, full-color scenes created in a graphics computer. These carefully planned computer scenes help the director share with the key members of the creative staff (the director of photography, the scenic designers, and the actors) a vision of what a scene should look like and how it is to be blocked and shot. The pre-viz storyboard artist acts as writer, director, camera operator, lighting director, and art director by converting what is written in the script into a digital visual form. The computer scene can then be easily manipulated as each member of the creative staff works with the director, offering suggestions and modifications until the group reaches a shared understanding of the production’s goal. It is much easier and less expensive to make changes in the computer before sets are built and time-consuming rehearsals have begun. Other graphics programs allow sets and costumes to be visualized and coordinated before they are actually made. Lighting plots can be revised quickly when computer programs offer the potential to visualize the lighting effects on simulated characters and settings before the lights are actually hung.

Computerized casting networks and performer databases help talent agencies promote the actors they represent and help casting directors find them. Location scouting has been facilitated by computer databases, and the World Wide Web’s ability to deliver pictures of possible locations via computer networks offers the potential both to cut down on travel expenses and to shorten preproduction schedules. The ability to send images, sounds, and text around the world via digital computer networks, such as the Internet and the World Wide Web, offers tremendous potential for the international flow of information. The Internet and its developing capacity for video streaming also offer a new means of marketing motion pictures, and as Negroponte has suggested, “the Net will perhaps be the primary form of world commerce. … And the cinematography community will enjoy an extraordinary new marketplace” (American Cinematographer, May 1995, p. 80). Web sites such as Netflix and Web 2.0 sites such as YouTube have already demonstrated the importance and viability of the Internet.

Digital Technologies Used in Production

New digital recording devices for video cameras offer a number of advantages for news recording. For example, a computer hard disk, digital disc, solid-state RAM chip, or digital videotape recorder can be built into a portable video camera to record news stories for television. Dockable (camera-attachable) hard disks, flash drives, digital discs, and memory cards, such as P2, as well as digital videotape recorders, allow two hours or more of professional-quality video to be recorded. Digital images and sounds can be edited immediately on a digital nonlinear editing system, greatly speeding up the production of news stories. Just as analog videotape recording and editing offered significant advantages over news film in the 1970s, digital recording and editing devices offer potential advantages over conventional analog videotape recording and editing of news stories today.

Computerized digital lighting boards facilitate production by allowing a cinematographer or lighting director to preprogram a lighting setup and hang several lighting setups simultaneously using different computer files on the same lighting board program. Special lighting effects, such as projecting images and graphics on the walls of sets to add atmosphere or create laser light shows, can also be preprogrammed into a computerized lighting board and software program. Virtual sets created in a computer program and inserted behind performers bypass the time-consuming and expensive process of set construction, assembly, and lighting. For example, the lighting director for a popular American film, Batman Forever (1995), made extensive use of digital lighting techniques and equipment during production in order to control more complicated lighting setups and changes than would be possible using conventional analog technology. This film returned to the original 1939 comic-book source of the mythical crime fighter, the Caped Crusader, to create a more active, action-oriented hero than the 1989 version of Batman, as well as active comic-book villains. In one scene at a circus, where the villain, Two-Face (played by Tommy Lee Jones), staged a deadly threat to Batman (played by Val Kilmer), more than 225 Xenon lamps were controlled by a computerized lighting board so that they could perform color changes and chase effects. In another “pan-Asian” sequence within Gotham City, the lighting director used computer-controlled lighting to project saturated colors and Chinese motifs onto the sides of the buildings on Figueroa Street in downtown Los Angeles, where filming was done. The lighting director’s extensive experience with rock-and-roll concerts and theatrical shows greatly facilitated his use of computerized lighting equipment in the film.

Sound recording has been greatly facilitated by digital audiotape (DAT), direct-to-MiniDisc, CD-R, solid-state memory (flash drive), and audio-DVD recording processes and equipment.

Flash Drives

USB flash drives are flash memory data storage devices integrated with a universal serial bus (USB) connector. Originally known as thumb drives or jump drives, they are now more commonly called USB drives, memory drives, memory data storage drives, or compact flash drives. The units are small enough to be attached to a key chain or carried in a pocket, and sturdy enough to stand a fair amount of physical abuse because there are no moving parts within the case. Instead, a series of microchips on a circuit board processes, stores, and distributes the data in a form recognized by virtually all operating systems, whether PC, Mac, or Linux based. The connector is a standard USB male connector. Some USB 2.0 drives require more power than the standard model, but all derive their operating power directly from the computer they are plugged into.

Flash drives and memory cards are superior to portable hard drives because they have no moving parts and operate with most equipment without special drivers. They are quite handy for passing files back and forth between computers without wires or complex connectors, a practice sometimes called “sneakernet,” a reference to walking between computers in sneakers. They are easily rewritable without destroying other data on the drive or card. Unless they have special circuits, however, most are not copy protected and can easily be overwritten if they are not carefully handled.

Other solid-state storage devices include memory cards such as miniSD and microSD cards, SmartMedia, and MultiMediaCards (MMC). Each of these must be matched to a device with compatible circuitry, such as a specific camera or recorder.

Digitally recorded sounds can be filtered more effectively and efficiently on location than analog recordings, for example, to remove unwanted background sounds. Digital sound recordings also minimize hiss as well as generational loss in sound quality when they are copied and dubbed for editing purposes, and they blend well with digitally recorded sound effects, music, and automatic dialogue replacement (ADR) recorded in a sound studio during postproduction. Multimedia audio stays in the digital domain throughout the production process.

Digital Technologies Used in Postproduction

Some of the most significant contributions of digital technology to film and TV production have come in the postproduction area. Digital videotape recorders and direct-to-digital servers facilitate the creation of special effects during final or online editing by allowing images to be layered on top of one another in successive recordings without loss of quality.

Digital editing systems make editing and revising a film or video as simple and quick as operating a computer word processor. In addition to increasing postproduction efficiency, digital editing systems allow an editor to visualize a final edited program. Effects may be previewed before the final stages of film or video postproduction, such as film conforming and answer printing or online video editing. Remarkably versatile and sophisticated three-way color correction, audio mixing, and sweetening features are becoming standard in basic digital video editing software systems.

Both film and TV postproduction use digital editing systems throughout the processes. Most digital editing systems employ computer hardware that is capable of processing and storing vast amounts of visual and audio information. A digital editing system may include a central processing unit (CPU) that has a processing speed of more than 2 gigahertz (GHz), eight or more gigabytes (GB) of random access memory (RAM), a keyboard, a mouse, one or two computer monitors, a digital recorder, an amplifier and loudspeakers, and one or more hard disk drives designed for large memory storage (in excess of 1 terabyte or 1,000 gigabytes).

Digital editing software offers several advantages over conventional means of editing film, audiotape, and videotape, including increased flexibility, as well as potential time and cost savings. A common cliché is that digital editing is the equivalent of word processing and desktop publishing for audio, film, and video postproduction. The analogy holds for many aspects of editing that are shared by word processing and various digital editing software programs. For example, most word processing software programs allow a writer to cut, copy, paste, and delete words, paragraphs, and pages of text. Digital editing affords an editor similar flexibility in terms of instantaneously changing the order and duration of sounds and images. For example, clips of video or audio information can be cut, trimmed, copied, pasted, inserted, and deleted along a timeline (Figure 2.3).

FIGURE 2.3 (A–D) Digital equipment may be used to record, control, or edit audio, video, or lighting signals. (Courtesy of KCTS 9, Seattle, Yamaha, Fairfield, ETC, and Ross Corporations.)

image

A clip is usually the smallest unit of digital video (or audio) information that can be stored and manipulated during editing. It can range in duration from just one frame to an entire movie, but it often consists of a single shot, that is, a continuous camera recording or take. Digitized clips are usually imported (or copied) into a particular editing project file where they are edited along a timeline with other images and sounds. Images and sounds from each clip are often displayed as a series of representative still frames along the timeline. Clips can be copied and inserted at various points along the timeline, and they can also be deleted from the timeline and the remaining images and sounds attached to one another.

Every edit made using a digital software program is usually a virtual edit. No digitized material is discarded when clips are trimmed, cut, or deleted along an editing timeline, because each clip is usually stored separately outside the timeline window. Every clip stored on a disk drive is instantaneously accessible in its entirety and can be grabbed in the project or clip window and reinserted at any point along the timeline. Many alternative versions of a scene or sequence can thus be quickly edited and examined without prematurely eliminating material that may be needed later. Transitions from one shot to another can be previewed, as can the superimposition of titles and various digital video effects without ever actually cutting, discarding, eliminating, or deleting any originally digitized video or audio.
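The notion of a virtual edit can be made concrete with a small data-structure sketch. In the Python example below (the class and method names are hypothetical, not drawn from any actual editing program), the timeline stores only references into clips, so trimming or deleting an event never discards the stored media:

```python
class Clip:
    """A stored clip; its source frames are never altered by editing."""
    def __init__(self, name, frames):
        self.name = name
        self.frames = list(frames)

class Timeline:
    """Edits are 'virtual': the timeline holds (clip, in, out) references,
    so trimming or deleting never destroys the underlying media."""
    def __init__(self):
        self.events = []  # list of (clip, in_frame, out_frame)

    def insert(self, position, clip, in_frame=0, out_frame=None):
        if out_frame is None:
            out_frame = len(clip.frames)
        self.events.insert(position, (clip, in_frame, out_frame))

    def delete(self, position):
        # Removes the event only; the clip itself stays stored untouched
        self.events.pop(position)

    def render(self):
        # Flatten the timeline into the final frame sequence
        out = []
        for clip, i, o in self.events:
            out.extend(clip.frames[i:o])
        return out

shot1 = Clip("wide", ["w1", "w2", "w3", "w4"])
shot2 = Clip("closeup", ["c1", "c2"])
tl = Timeline()
tl.insert(0, shot1, in_frame=1, out_frame=3)  # trimmed: uses only w2, w3
tl.insert(1, shot2)
# shot1.frames still holds all four frames; the trim was virtual
```

Because the full clips remain intact, a rejected trim can be undone simply by pointing the timeline event at a wider frame range, which is precisely why alternative versions of a scene can be tried without risk.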

The ability to manipulate clips of video and sound along a timeline not only adds flexibility to the editing process; it can also make editing more efficient and cost-effective. Clips can be rapidly trimmed, cut, inserted, and deleted. Digital editing is extremely fast compared with physically cutting and splicing a conventional feature film, and the time it takes to find and insert videotape images and sounds from a source onto a master videotape can be dramatically reduced by using instantaneously accessible digital clips along a timeline. The amount of time scheduled for postproduction editing can be significantly diminished, facilitating the editing of projects that require a short turnaround time, such as topical news magazine segments and minidocumentaries. Increased editing efficiency can also translate into cost savings that affect the overall budget of longer-term projects when an editor’s time and salary can be reduced. Clearly, digital editing offers a number of advantages in terms of flexibility and efficiency over conventional videotape and film editing.

Digital editing systems can be used at many different production levels. Some low-end digital editing systems are designed for use on home computers, and the range of graphics and special effects that are available on relatively inexpensive consumer programs to edit home videos is truly remarkable. Inexpensive digital editing systems also provide an excellent means of teaching video and sound editing, graphics, and special effects to students at a variety of educational levels. High-end professional systems can be used to efficiently edit feature films, relying on Kodak film KeyKode numbers and SMPTE timecode numbers (discussed in Chapter 6, Audio/Sound) as references for film conforming and online editing. Large corporations whose video production units use high-end digital editing systems sometimes finish their programs in digital form, avoiding the added time and cost of online videotape editing. A large number of audio tracks can usually be edited initially using digital editing software, and additional editing, mixing, and sound “sweetening” can be done later using a compatible digital audio workstation (DAW).

Special effects techniques have been greatly expanded and enhanced by digital technologies, and the Academy of Motion Picture Arts and Sciences recently granted full-branch status (similar to the acting and cinematography branches of the academy) to visual effects supervisors and artists. Digital effects are often combined with miniatures (smaller copies of objects) and models (full-size mockups) to produce startlingly realistic special effects in Hollywood feature films, such as the Harry Potter series, based on J. K. Rowling’s novels. (Miniatures and models are discussed more fully in Chapter 11, Graphics, Animation, and Special Effects.) Computer graphics hardware and software have played an important role in the creation of special effects.

Many of the special effects used in the Hollywood film Apollo 13 (1995), for example, were achieved by compositing (combining digital images) miniatures and models using computer graphics and digital video effects. Apollo 13 focuses on the nearly tragic story of the Apollo 13 astronauts: an accident occurred on-board their spacecraft in April of 1970 that forced their moon landing to be aborted and nearly left them stranded in space. These events took place at the height of the space race and the Cold War competition between the United States and the Soviet Union.

The ability to manipulate and control individual pixels (the single dots of colored light that make up a two-dimensional graphic image) has led to a proliferation of effects that challenge conventional conceptions of history and reality. For example, the NASA astronaut who advised the producers of Apollo 13 on the authenticity of various space-launch procedures and restagings asked the producers where they had obtained actual documentary footage of the launch of the Apollo 13 spacecraft, when in fact the images were digital special effects created on a computer.

The Hollywood film Forrest Gump (1994) placed its fictitious main character inside the frame of an actual documentary recording of former President Lyndon B. Johnson, whose lips were animated so that he appeared to speak to Forrest Gump. These examples illustrate the power of digital special effects to rewrite history and to create artificial worlds that sometimes seem more real than authentic documentary recordings. Film history has itself been revised and manipulated through the digital colorization of old black-and-white feature films. Colorizing old (and sometimes new) Hollywood films has clearly distorted the original artists’ intentions and has altered film history. In the hands of media moguls such as Ted Turner, who acquired the MGM library for use on Turner Network Television, it has also significantly expanded the television markets and viewing audiences for older films. Digital film and TV technologies have had a significant impact on conventional notions of history and reality, and they have challenged traditional legal, ethical, and aesthetic conceptions as well.

Digital technologies that offer potential connections to film and video include CD-ROMs, DVDs, Blu-ray discs, high-definition versatile multilayer discs (HD-VMD), holographic video discs (HVD), computer games, and other interactive software. Many Hollywood film companies work with CD-ROM producers to create interactive computer games in conjunction with the release of feature films to theaters. The same settings or locations used in a feature film can be recorded in virtual 360-degree space using several film cameras. These images can then be mapped three-dimensionally via various computer programs so that a viewer using a DVD player can move throughout the space, interacting with characters and situations from the film. DVDs that incorporate film and video material also offer tremendous potential in education, including interactive film and TV production training, such as learning how to operate specific pieces of equipment. Films and television programs released on DVDs include many extra features and supplementary materials, which expand the information on how and why a production was completed. Digital technologies have clearly increased the speed, efficiency, and flexibility of media production in all three stages of film and TV production: preproduction, production, and postproduction. Although they have not yet eliminated superior analog technologies, such as film recording and theater screenings, digital devices have made the use of traditional analog technologies more specialized. In addition, digital technologies have begun to alter conventional notions of history and reality through the use of sophisticated computer graphics and special effects. Finally, as we move through the twenty-first century, new digital technologies, such as various forms of digital discs, hard drives, the Internet, and the World Wide Web (WWW), will undoubtedly provide new markets for films and videos and new educational opportunities for film and TV students and scholars.

The advent of digital technology has blurred many traditional distinctions in media production, such as offline versus online editing, film versus video production, and (active) artists versus (passive) viewers. Originally, videotape editing occurred in two stages, often referred to as offline and online editing. During offline editing, the sequential order of visual images and sounds was arranged and rearranged using small-format dubs or copies of the original videotape recordings. During online editing, the final decisions resulting from offline editing were performed again using the original, high-quality videotape or digital recordings and more sophisticated equipment. Today a high-quality digital editing system can perform both functions; that is, it can be used for both (preliminary) offline editing and for (final) online editing, making the passage from one stage to the next less distinct. In addition, the flexibility afforded by digital editing systems is similar to the flexibility of traditional film-cutting techniques but much more efficient. For example, a digital editing system allows an editor to reduce the overall length or duration of a program by deleting or removing images and sounds along a timeline and simply bringing the remaining images and sounds together so that they directly precede and follow one another. This technique is similar to the removal of frames of film or a shot and accompanying sound from the middle of a film. The potential to change the overall program duration at will is often referred to as a nonlinear approach to editing because the order of shots and sounds can be reordered at any time. Digital systems can be used to edit productions that were originally recorded on either film or videotape using a nonlinear approach. In so doing, digital editing equipment has brought film and video editing closer together.
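The timeline deletion just described can be sketched in a few lines of Python. This is only an illustration of the idea, not any editing system's actual interface; the shot names and durations are invented for the example.

```python
# A nonlinear edit modeled as list manipulation: a program is an ordered
# timeline of shots, and deleting a shot simply closes the gap, so the
# remaining shots directly precede and follow one another.

def total_duration(timeline):
    """Sum the durations (in seconds) of all shots on the timeline."""
    return sum(duration for _, duration in timeline)

def delete_shot(timeline, name):
    """Remove a shot by name; the surrounding shots now join directly."""
    return [(n, d) for n, d in timeline if n != name]

timeline = [("wide shot", 12), ("close-up", 5), ("reaction", 4)]
shorter = delete_shot(timeline, "close-up")

print(total_duration(timeline))  # 21 seconds before the edit
print(total_duration(shorter))   # 16 seconds after the shot is removed
```

Because nothing is physically cut, the deleted shot could just as easily be reinserted or reordered at any time, which is what makes the approach nonlinear.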

Digital HDTV cameras capture electronic images that have an aspect ratio (width-to-height ratio) and resolution (clarity and amount of detail in the image) that more closely approximate some wide-screen theatrical film formats than they do traditional television or video images. Professional as well as consumer HDTV cameras are designed to operate in either the 16:9 or the 4:3 aspect ratio (consumer and prosumer HDV cameras, for example, can often record in non-HDTV formats, such as DVSP or DVCAM) and in a variety of scan formats (see Chapter 9, Recording). Finally, computers and interactive multimedia software allow traditionally passive media viewers and listeners to analyze and manipulate audio, video, and film productions and to actively recreate their own versions of existing media texts, as well as to learn new skills interactively, play computer games, and control where they find and use media productions. Traditional distinctions between (active) artists versus (passive) viewers or listeners, film versus video production, and offline versus online editing are becoming less meaningful as digital technologies erase differences between traditional forms of media production.
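An aspect ratio is simply the frame width divided by the frame height, so the difference between 16:9 and 4:3 can be checked with basic arithmetic. The pixel dimensions below (1920 × 1080 and 640 × 480) are common example frame sizes, not requirements of any particular camera.

```python
# Aspect ratio = width / height. 16:9 (HDTV) is noticeably wider than
# 4:3 (traditional television), which is why HDTV frames more closely
# approximate wide-screen theatrical film formats.

def aspect_ratio(width, height):
    return width / height

print(round(aspect_ratio(1920, 1080), 3))  # 1.778, i.e., 16:9 HDTV
print(round(aspect_ratio(640, 480), 3))    # 1.333, i.e., 4:3 traditional TV
```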

PRODUCTION TERMINOLOGY

Acquiring basic media production terminology is crucial to understanding the entire production process. The use of production technology and techniques requires a rather specialized vocabulary. As key words are introduced in this text, they are usually defined. When chapters are read out of sequence, the reader can refer to the glossary and index at the end of the book to find a specific definition or the initial mention of a term. We almost intuitively understand the meaning of such words as television, video, audio, and film, but it is important to be as precise as possible when using these and other terms in a production context. Television refers to the electronic transmission and reception of visual images of moving and stationary objects, usually with accompanying sound. The term television has traditionally referred to images and sounds that are broadcast through the airwaves. Electronic television signals can be broadcast by impressing them on a carrier wave of electromagnetic energy, which radiates in all directions from a television tower and is picked up by home receivers. Electrical energy travels in waves, much like ocean waves that crash on the sand at the beach. A carrier wave is usually rated in thousands of watts of power channeled through a television transmitter (Figure 2.4).

FIGURE 2.4 Broadcast carrier frequencies share space on the international electromagnetic spectrum with a variety of other signals and broadcast users.

image

Electromagnetic energy ranges from sound to long radio waves to very short radio waves and on to light and gamma rays. Radio waves can travel through the atmosphere, the sea, and the earth, and they can be picked up by receivers. Television signals can also be sent through closed-circuit or cable television systems, that is, along electrical wires rather than through the airwaves. Before the 1930s, experimental television was primarily closed circuit, but the commercial exploitation of this technology as a mass medium and as a means of distributing television to large numbers of private homes, known as cable television, did not occur until much later. Since the 1960s, it has been possible to transmit television signals via satellites across continents and around the world. Satellites are communications relay stations that orbit the globe. Line-of-sight microwave (i.e., high-frequency) transmissions of television signals are frequently used for live, nondelayed, real-time news reports in the field and for sending signals to outlying areas where broadcast signals are not well received. Satellite program distribution systems now compete with cable program distribution systems. Telephone companies, through the installation of fiber-optic cables directly to the home, are also capable of distributing programming in competition with cable, satellite, and off-the-air broadcasting. All of these except off-the-air broadcasting and satellites are also capable of offering Internet service.

The terms television and video are sometimes used interchangeably, but it is generally agreed that television is a means of distributing and exhibiting video signals, usually over the air. Video, on the other hand, is a production term. The term video is also used narrowly to refer to the visual portion of the television signal, as distinguished from the audio or sound. The more general definition of video as a production term refers to all forms of electronic production of moving images and sounds. This is the preferred use in this text. The term video can refer to a three- to five-minute popular song with accompanying visuals on a videotape, DVD, or Blu-ray disc; in this sense it is actually a shortened form of the term music video. A video can also refer to a videotape copy of a feature film available at a video rental store. Videotape refers to magnetic tape, which is used to record both the video and the audio portions of the television signal. Videotape, digital servers, and other digital media allow television signals to be transmitted on a time-delayed basis, rather than live, and, when used with various electronic editing devices, they allow recorded images and sounds to be changed and rearranged after they have been recorded. A videotape and a DVD or Blu-ray disc also allow feature films to be played at home on a VCR (video cassette recorder) or a DVD/Blu-ray disc player, respectively. A videotape recorder (VTR) traditionally uses tape mounted on and played from an open reel, whereas a VCR uses tape encased in a closed cassette from which the tape is withdrawn as it is played. DVDs are discs containing video and audio recorded using a laser beam to embed the digital information in the surface of the disc. New forms of DVDs continue to be developed, offering longer recording time, higher-quality signals, and increased interactive features; Blu-ray, HD-VMD, and HVD are all marketed with the expectation of replacing the original DVD.

Film has a variety of meanings in production. Defined narrowly, it simply refers to the light-sensitive material that runs through a motion picture camera and records visual images. When film is properly exposed to light, developed chemically, and run through a motion picture projector, moving objects recorded on the film appear to move across a movie screen. In a more general sense, the term film can be used interchangeably with such words as motion picture(s), movie(s), and cinema. The first two words in the singular refer to specific products or works of art that have been recorded on film, whereas in the plural they can also refer to the whole process of recording, distributing, and viewing visual images produced by photochemical and mechanical, that is, nonelectronic, means.

Audiotape refers to magnetic tape that is used to record sounds. Digital audiotape (DAT) is used to record audio signals in digital form on high-density magnetic tape in a DAT recorder. Compact discs (CDs) are digital audio recordings that are “read” by a laser in a CD player for high-quality audio reproduction.

Making clear-cut distinctions between video and film is becoming increasingly difficult, especially in the context of new digital technologies. For example, when a feature film, a television series, a music video, or a commercial advertisement is initially recorded on film but edited in digital form and then distributed on digital tape, broadcast on television, transmitted by satellite, or sold or rented as a DVD or Blu-ray disc, is this a video, a film, or a new type of digital hybrid? Using a single video camera to record a program in segments, rather than using multiple video cameras to transmit it live or to record it live on tape, has frequently been called film-style video production because the techniques used in single-camera video production are often closer to traditional film practice than to those of multiple-camera studio or remote video production. On the other hand, the techniques of multiple-camera film production used to record stunts in feature films are often closer to traditional multiple-camera studio television practice than to traditional single-camera film practice. As mentioned earlier, digital editing techniques often combine traditional film and videotape editing techniques at the same time that they mimic computer word processing. All of these developments make it difficult to make firm distinctions between film, video, and digital media today. Too often the term film is inaccurately applied to video and digital media productions when the term motion pictures would be more appropriate. The term digital motion pictures refers to distributing films to a theater via digital means rather than physically shipping the film to each theater.

Multimedia refers to the creation of works that combine video, audio, graphics, and written text. Combining these media involves digitizing all of the various elements so that they can be computer controlled and stored in a variety of forms, such as on hard disk drives, flash memory drives, and DVDs. Interactive media refers to various forms of viewer/reader/listener manipulation of and interaction with computer-controlled multimedia forms. Multimedia and interactive media have both been widely used in training and education, as well as computer games, but they are also developing into important new art forms and means of personal expression, especially as distributed on the Internet and the World Wide Web. These terms are often used in combination, as in interactive multimedia, referring to works that are interactive and involve the use of multimedia.

New forms of media are constantly being developed and existing media forms are constantly changing. Thus, although it is important to be as precise as possible in the use of media terms, it is equally important to realize that the meanings of these terms can change over time, reflecting changes in the technology on which these media are based and the ways in which that technology is used.

SINGLE-CAMERA VERSUS MULTIPLE-CAMERA PRODUCTION, AND STUDIO VERSUS LOCATION PRODUCTION

A producer or director of a live-action production must make two basic decisions before production begins. First, she must decide whether one or more than one camera should be used to record or transmit images. Using one camera is called single-camera production, whereas using more than one camera is referred to as multiple-camera production. Second, a decision must be made about whether the images should be recorded inside or outside the studio. Recording inside the studio is known as studio production, whereas recording outside the studio is called location production in film and remote production (involving cable/microwave links to the studio) or field production in video.

Multiple-camera production techniques are used to record continuous action quickly and efficiently without interruption. Such techniques are the basis for television news programs, entertainment programs involving a studio audience, as well as much corporate, educational, and religious programming. Remote coverage of sporting events almost always requires multiple cameras. Multiple film cameras are frequently used to record dangerous stunts simultaneously from a variety of angles for feature films. Multiple film cameras also are used to film some television situation comedies.

In single-camera production, each separate shot is set up and recorded individually. The main artistic advantage of single-camera production is that few compromises have to be made in lighting or microphone placement to accommodate the viewing requirements of several different cameras. Logistically, only one camera needs to be set up or carried into the field at a time. Single-camera production of dramatic fiction usually begins with the recording of a master shot, which covers as much of the action in a scene as possible from a single camera position. Then the same actions are repeated and recorded again with the camera placed closer to the action. The resulting material is combined during postproduction editing. Single-camera production techniques are used to record feature films, documentaries, and television commercials, as well as in news recording. Except for live coverage of sports events, single-camera production is the norm for location and remote production situations.

In some production situations, it is simply impossible to record events inside a studio, even though studio production facilities and techniques are usually more efficient and economical. Lighting and sound recording are more easily controlled in a studio than at a remote location. Most production studios are designed to provide ideal recording conditions by insulating the recording space from outside sounds, reducing the echo of interior sounds, and allowing easy overhead or floor positioning of lights and access to power supplies. Location production can give a film or television production a greater sense of realism or an illusion of reality.

Exterior locations often create a sense of authenticity and actuality. But location settings rarely provide ideal lighting and acoustical environments. Extraneous sounds can easily disrupt a production. Confined settings often create sound echo and make it difficult to position lights and to control the shadows they create. Inclement weather conditions outdoors can delay the completion of a project. Because location production sometimes increases production risks and costs, a producer must have strong justification for recording outside the studio. Of course, the construction of sets inside a studio can also be extremely expensive, in addition to creating an inappropriate atmosphere, and location production in this case is easily justified on the basis of both costs and aesthetics.

Planning for Positive Production Experiences

Everyone wants to have positive production experiences. Although no secret formula for success exists, a thorough understanding of production principles and a positive attitude toward the overall production process is certainly helpful. Exuding confidence in a project enlists the support of others. This requires knowing what is needed and how to get it. Making good creative choices demands careful advance planning of every logistical and conceptual aspect of production.

Many production techniques can be mastered through practice exercises, such as those recommended at the end of each chapter in this book, and through actual production experience. Truly benefiting from these experiences requires taking risks and learning from one’s mistakes. Learning to work within present levels of ability, avoiding unnecessary or repeated errors through careful planning, and developing strong conceptualization skills are also essential.

Avoiding Negative Production Experiences

The first law of production is Murphy’s law: Anything that can go wrong, will go wrong.

Every production person has vivid memories of his or her first encounter with Murphy’s law, such as an essential piece of equipment that the camera crew forgot to take on location or one that failed to work properly. The second law of production is an antidote to Murphy’s law: Proper prior planning prevents poor productions.

Many production problems are preventable. Ignoring conceptual and aesthetic considerations, failing to learn how to operate a camera properly, forgetting to bring necessary equipment, and having no backup or replacement equipment are preventable mistakes. No one is beyond the point of needing to think carefully about what he or she is doing or learning how to use new equipment. Everyone should use detailed equipment checklists, specifying every necessary piece of equipment, which are checked and rechecked before going into the field. Every production needs some backup equipment and a contingency plan to turn to when things do not go as planned.

Some production problems are not preventable. No one can precisely predict the weather or when a camera will stop working. But everyone must have an alternative or contingency plan if such a problem occurs. Equipment should be properly maintained, but not everyone can or should try to repair equipment in the field. Certainly, the option to record another day, if major problems should occur, must be available. Good quality productions are rarely made in a panic atmosphere, and careful planning is the best antidote to panic, Murphy’s law, and negative production experiences (Figure 2.5).

FIGURE 2.5 Each of the three stages of media production fulfills critical facets of the production process. Each of the three stages relies on the professional completion of the other two stages. No one stage is more important than any other.

image

Quality productions are shaped and reshaped many times on paper before they are recorded and edited. Preproduction planning is extremely important. It is always cheaper and easier to modify a project before actual recording takes place than to do so after production is under way. The organization of this text reflects the importance of preproduction planning and the development of conceptualization skills. The first section is devoted entirely to preproduction planning. Some degree of advance planning and conceptualization is implicit in later stages of production and postproduction as well.

THE PRODUCTION TEAM IN AUDIO, VIDEO, FILM, AND MULTIMEDIA PRODUCTION

The production team can be organized hierarchically or cooperatively. In a hierarchical situation, the commands flow downward from the producer to the director, and from her to the rest of the creative staff or production crew. In a cooperatively organized production, every member of the production team has equal authority and control, and decisions are made collectively. Most production situations combine aspects of both the hierarchical and the cooperative models, although the former approach is clearly dominant in the commercial world. Combining approaches, the producer or director makes most of the important decisions, but the help, support, guidance, and input of all the creative staff and some of the technical crew is actively sought and obtained (Figure 2.6).

FIGURE 2.6 Parallels exist between the organization of production teams, depending on the media used, but each medium has unique personnel categories for each level depending on whether the classification is above the line or below the line.

image

Production is rarely a purely democratic process, but it is almost always a collective process that requires the support and cooperation of large numbers of people.

The members of any media production team usually can be divided into two distinct groups: the creative staff and the technical crew. This basic division is often used for budgeting purposes. Dividing the team and costs into above-the-line creative aspects and below-the-line technical aspects allows for a quick financial comparison between investments in the creative and technical sides of a production. The costs of paying the producer, director, scriptwriter, and performers are considered above the line, whereas those for equipment and the crew are below the line. The two should be roughly equivalent in terms of the allocation of financial support to ensure that neither the creative nor the technical side of the production is being overemphasized (Figure 2.6).
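The above-the-line/below-the-line comparison described above amounts to summing the two sides of the budget and checking that neither dominates. The line items and dollar amounts below are invented purely for illustration; the text specifies no figures.

```python
# Hypothetical budget split into above-the-line (creative) and
# below-the-line (technical) costs, with a quick check that the two
# sides are roughly equivalent.

above_the_line = {"producer": 40_000, "director": 35_000,
                  "scriptwriter": 15_000, "performers": 30_000}
below_the_line = {"equipment": 55_000, "crew": 60_000}

above = sum(above_the_line.values())  # total creative costs
below = sum(below_the_line.values())  # total technical costs

print(above, below)                   # 120000 115000
print(round(below / above, 2))        # 0.96 -- roughly equivalent
```

A ratio far from 1.0 in either direction would suggest that the creative or the technical side of the production is being overemphasized.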

Creative Staff in Media Production

The creative staff in audio, video, multimedia, and film production includes the producer, director, assistant director, scriptwriter, designers, and the talent or performers.

Producer

There are many different types of television and film producers: executive producers, independent producers, staff producers, line producers, and producer hyphenates (e.g., producer-writer-directors). The exact responsibilities of the producer vary greatly between different commercial and noncommercial production categories and levels. In general, the producer is responsible for turning creative ideas into practical or marketable concepts. The producer secures financial backing for a television or film production and manages the entire production process, including budgeting and scheduling, although production managers often handle many of these tasks at major studios. Some producers become directly involved in day-to-day production decisions, whereas others function as executive managers who largely delegate production responsibilities to others. The producer ensures that the financial support for a production is maintained and usually represents the views of his or her clients, investors, or superiors, as well as those of prospective audiences, throughout the production process. The producer in radio or in an audio recording session also may hire the musicians, arrange for facilities and studio time, and in some cases actually operate the audio board or other recording medium (Figure 2.7).

FIGURE 2.7 The organization of motion picture companies varies with the number of films in production at any one time. Some company services are shared between productions, and others are unique to each individual production.

image

Director

The director creatively translates the written word or script into specific sounds and images. He or she visualizes the script by giving abstract concepts concrete form. The director establishes a point of view on the action that helps to determine the selection of shots, camera placements and movements, and the staging of the action. The director is responsible for the dramatic structure, pace, and directional flow of the sounds and visual images. He or she must maintain viewer interest. The director works with the talent and crew, staging and plotting action, refining the master shooting script, supervising setups and rehearsals, as well as giving commands and suggestions throughout the recording and editing.

The director’s role changes with different types of production situations. In live, multiple-camera video, the director usually is separated from the talent and crew during actual production, remaining inside a control room. In the control room, the director supervises the operation of the switcher, a live-television editing device that controls which picture and sound sources are being recorded or transmitted. The director also gives commands to the camera operators from the control room. A stage manager or floor manager (FM) acts as the live television director’s representative in the studio, cueing the talent and relaying a director’s commands. In single-camera production, the director remains in the studio or on the set or location and works closely with the talent and the director of photography (DP) (Figure 2.8).

FIGURE 2.8 The organization of a television station varies considerably depending on the size of the market and whether the station is a network affiliate, network owned and operated, or a totally independent operation.

image

Assistant/Associate Director

The assistant or associate director helps the television or film director concentrate on his or her major function, controlling the creative aspects of production. In feature film production, the assistant director (AD) helps break down the script into its component parts before the actual production for scheduling and budgeting purposes. The AD then reports to the production manager, who supervises the use of studio facilities and personnel. During actual production, the AD becomes involved in the day-to-day paperwork and record keeping, sometimes actually taking charge of a shooting unit, but always making sure that the talent and the crew are confident, well-informed, and generally happy. In studio video production the associate director (AD) keeps track of the time, alerts the crew members and performers of upcoming events, and sometimes relays the director’s commands to the camera operators, video switcher, and other crew members.

Scriptwriter

The scriptwriter is a key member of the production team, particularly during preproduction. A scriptwriter outlines and, in large part, determines the overall structural form of a production project. He or she writes a preliminary summary of a production project, called a treatment. A treatment lays the groundwork for the script and is written in the third person, present tense, much like a short story. The script provides a scene-by-scene description of settings, actions, and dialogue or narration, and functions as a blueprint that guides the actual production.

The Production Crew in Media Production

The production crew in media production includes the director of photography, camera operator, lighting director, art director or scenic designer, editors, and perhaps a number of specialized engineers and technicians, depending on the size and sophistication of the production. Figures 2.7 and 2.8 illustrate a more complete breakdown of the organization of a motion picture company and a television station.

Director of Photography

The overall control of film lighting and cinematography, or the creative use of a movie camera, is usually given to one individual, the director of photography (DP). A DP supervises the camera crew, who are sometimes called cameramen (referred to as camera operators in this text), assistant camera operators, grips, and the electrical crew, who are sometimes called engineers or gaffers and who actually control the lighting setup. The DP works closely with the director to create the proper lighting mood and camera coverage for each shot.

The DP is considered an artist who paints with light. He or she is intimately familiar with composition, as well as all technical aspects of camera control and is frequently called on to solve many of the technical and aesthetic problems that arise during film recording. The DP rarely, if ever, actually operates the camera.

Lighting Director

In video production, the camera operation and lighting functions are usually kept separate. The lighting director is responsible for arranging and adjusting the lights in the studio or on location according to the director’s wishes or specifications. The lighting director supervises the lighting crew, who hang and adjust the various lighting instruments.

Camera Operator

The camera operator controls the operation of the film or video camera. Many adjustments of the video camera must be made instantaneously in response to movements of the subject or commands from the director, such as changing the positioning of the camera or the focus and field of view of the image. The director’s commands come to the camera operator in the studio via an intercom system connected to the camera operator’s headset. The camera operator must smoothly, quietly, and efficiently control the movement of the support to which the camera is attached in the studio and avoid any problems with the cable, which connects the camera to the switcher or videotape recorder. A film camera operator works much more independently than a video camera operator, following the directions that the DP and the director give before the camera rolls. While shooting, it is the operator’s responsibility to maintain framing and follow the action. In 2D animation, the camera operator shoots each cel or set of cels following the director’s instructions.

Art Director or Scenic Designer

The art director (film or graphics) or scenic designer (video) supervises the overall production design. He or she determines the color and shape of sets, props, and backgrounds. Art directors frequently work closely with costume designers and carpenters to ensure that costumes and sets properly harmonize or contrast with each other. In feature film, the art director delegates the supervision of set construction and carpentry to the set designer, and in video the scenic designer often supervises both the abstract design of a set on paper and its actual construction.

Technical Director

The technical director (TD) operates the switcher, a multiple-video-camera editing device, in the control room. At the director’s command, the TD presses the buttons that change the television picture from one camera or playback device to another. In some television studios, the technical director supervises the entire technical crew, including relaying the director’s commands to the camera operators, while also operating the switcher.

Editor

In video postproduction, the editor operates an editing system that electronically connects the individually recorded segments into a sequential order. A film editor physically cuts together various pieces of film into a single visual track and an accompanying soundtrack. The sound editor is a specialist who constructs and organizes all the various sound elements so that they can be properly blended or mixed together into a final soundtrack. In film, the sound segments can be physically spliced together, but in video they are edited electronically. Film and film audio can also be transferred to digital formats for digital editing.

Audio Engineer or Mixer

In video production the individual responsible for all aspects of initial audio recording is called the audio engineer. In film production, this person is referred to as the mixer or audio (or sound) recordist. In studio video production, the audio engineer sits behind a large audio console in the control room, where he or she controls the sound from the microphones and playback units. The audio engineer also supervises the placement of microphones in the studio. The film mixer or audio recordist, like the audio engineer in video, adjusts and controls the various audio recording devices, but unlike the audio engineer, he or she remains on the set rather than in the control room. The film mixer usually operates an audiotape recorder that runs synchronously with the film camera. The mixer tries to record a consistent, balanced audio signal throughout all the different single-camera setups so that a smooth, even soundtrack can be created during subsequent editing and mixing.

Video Engineer or Laboratory Color Timer

The quality of video and film images depends on technical specialists who can control image, color, brightness, and contrast levels. In video production, a video engineer usually controls the setting and adjustment (shading) of camera recording and transmission levels. The engineer is responsible for ensuring that all cameras are functioning properly and that multiple cameras all have comparable image qualities. A video engineer can also make color corrections to individual shots during postproduction. The color timer at a film laboratory performs a similar role, but does so after the film has been edited and before copies are made. In video postproduction and in film that has been transferred to video, the color in each shot can be adjusted using special digital equipment. Color can also be adjusted within individual frames using computer-controlled colorizing equipment, which digitizes images and allows a colorist to control individual pixels in the frame. Many digital editing programs contain a wide range of image-control devices, allowing precise adjustments of the color, brightness, and contrast of scenes, sequences, shots, and individual frames during postproduction. These include three-way color correction controls, which allow an editor to separately adjust the hue and saturation of shadow areas, midtones, and highlights to ensure proper color balance.

The Production Team in the Recording Industry

The employees involved with the actual production of audio programs in the recording industry may be as few as one: the producer/operator working with the musician or performer. Or the team may be a more complex group of arrangers, producers, engineers, and operators, as well as the musicians or performers (Figure 2.9).

FIGURE 2.9 The organization of a recording studio varies depending on the size of the studio, the type of recordings that are made in that facility, and the number of artists using the facility on a regular basis.

image

Producer and Operator

The producer and operator function the same as in a video production, except that they are concerned only with the sound of the program. The operator spends much time and effort adjusting the equalization, levels, and effects of each input channel, sometimes several times for a single microphone. Overall equalization, levels, and effects must then be set relative to all inputs to achieve the best balanced sound.

Arranger

Arrangers work with the musicians to assemble the best possible musical composition. Often the musicians will arrange their own music or arrive at a recording session with the composition ready to record.

The Production Team on an Interactive Multimedia Production

The interactive multimedia creative staff and production team includes the developer, publisher, producer, designer, writer, video director, graphic artist/animator, and programmer.

Developer

The developer is the individual or corporation that creates an interactive multimedia product. The developer oversees program content and programming and delivers the product to the publisher. Developers are analogous to the production side of the film and video business. Production companies are developers.

Publisher

The publisher provides financial backing and ensures that the product will be successfully distributed. Publishers are analogous to the distribution side of the film and video business.

Producer

The producer manages and oversees a project, interacts with the marketing people and executives in the publishing company, and coordinates the production team. The producer is sometimes referred to as a project director, project leader, project manager, or a director.

Designer

The designer is basically a writer who visualizes the overall interactive multimedia experience and then creates the design document that specifies its structure, that is, a product’s necessary and unique attributes. In an interactive computer game, for example, the design document may include the basic elements of the story, locations, and problems to be solved by the person playing the game. Animation, music, and sound effects are included in the design document, which is then handed over to a producer, who coordinates the creation of the design elements into a product.

Writer

The writer may consult with the designer who initially envisioned a particular product but is usually hired by the producer to help create and develop design elements and to shape them into their final form. The writer may flesh out the characters, dialogue (which may appear as text in one platform but may be spoken in another), music, sound effects, and possible scenarios envisioned by the designer, or she may invent entirely new material. Interactive multimedia writers are generally very skilled at nonlinear storytelling.

Video Director

If a video recording is needed to add live-action material to an interactive multimedia experience, a video director is brought in to handle staging the action, actors, and crew.

Art Director

The art director is responsible for converting the written script into a visual production. Usually the art director draws or supervises the production of the storyboards, especially in animation and games, that guide the production from conception to completion. The art director develops the basic concept of color, style, and character design before assigning the work to individual artists.

Graphic Artist/Animator

The graphic artist or animator draws and animates characters, backgrounds, and computer environments, which make up the visual elements of an interactive multimedia product.

Programmer

The programmer is the person who develops computer programs that integrate the various aspects of a multimedia product and facilitate interaction with it on various computer platforms, such as a Macintosh computer. The programmer is responsible for writing the computer code that controls each line of dialogue and each movement of a character. Several computer programmers may be involved in the creation of interactive multimedia that will be used on different platforms or computer environments, such as Windows, Mac, and NT. Programmers are sometimes referred to as engineers (Figure 2.10).

FIGURE 2.10 The organization of a multimedia operation varies widely depending on the type, number, and budgets of the projects the individual studio specializes in.

image

VISUALIZATION: IMAGES, SOUNDS, AND THE CREATIVE PROCESS

Visualization can be defined as the creative process of translating abstract ideas, thoughts, and feelings into concrete sounds and images. This demands strong conceptualization skills and a thorough understanding of media production methods and techniques. Scriptwriters and directors must have something significant to say and the technical means to say it. Quality production work requires an ability to organize one’s creative thoughts and to select and control many devices that record, edit, and transmit visual images and sounds. Scriptwriters and directors must acquire a basic understanding of the overall production process before they can fully develop their visualization skills. A knowledge of production principles and practices stimulates the search for innovative ways to translate abstract ideas into concrete sounds and images. It also sets limits on a writer’s creative imagination. A scriptwriter must be practical and realistic about production costs and logistics. An imaginative script may be too difficult or expensive to produce. A scriptwriter must also have some knowledge of camera placement, graphics design, composition, timing, and editing, even though his or her work is basically completed during the preproduction stage.

To visualize, that is, to utilize the full potential of audio, video, film, and multimedia for creative expression, communicators must be constantly open to new ideas, technologies, and techniques, because these media are constantly changing. But they cannot ignore traditional communicative practices and ways of structuring messages. Other media and older forms of communication provide a wealth of information about the communication process.

In a sense, the attempt to use visual images and sounds to communicate with others is as old as the human species. Early human beings, for example, drew pictures of animals on the walls of caves. Cave drawings may have been created out of a desire to record a successful hunt for posterity, to magically influence the outcome of future hunts by controlling symbolic images, or to express the feelings and thoughts of an artist toward an animal or hunt. These three purposes of communication can be summarized as conveying information, rhetorical persuasion, and artistic expression. To some extent, these explanations are also applicable to contemporary uses of video, film, audio, and multimedia.

Conveying Information

Communicating with pictures and sounds may have a single purpose, to convey information. What is communicated, the specific content or meaning of the message, consists of informative signs and symbols, images and sounds, which are transmitted from one person to another. We tend to think of certain types of films, television, and multimedia programs, such as documentaries, educational films, videotapes, audio recordings, news programs, and interactive programs, as primarily intended to convey information. Few media messages are exclusively informational, however. Other types of communication are needed to arouse and maintain audience interest and to enliven an otherwise dull recitation of facts.

Rhetorical Persuasion

Rhetoric is the art of waging a successful argument. Persuasive devices and strategies are designed to shape opinions, change attitudes, or modify behavior. The term rhetoric has been applied to the use of stylistic as well as persuasive devices and techniques in artistic works such as novels. An artist can select a rhetorical device, such as the point of view of a specific character, to tell or stage a story, so that the reader or audience becomes more emotionally involved. Rhetorical devices often stimulate emotions. They can make a logical argument more persuasive and a work of fiction more engaging and emotionally effective. In television, radio, film, and multimedia we tend to think of editorials, commercials, political documentaries, and propaganda as rhetorical forms of communication. Many fictional dramas can also be thought of as rhetorical in structure and intent.

Artistic Expression

Artistic works often communicate an artist’s feelings and thoughts toward a person, object, event, or idea. Sometimes artistic expressions are extremely personal, and it is difficult for general audiences to understand them. At other times, the artist’s thoughts and feelings are widely shared within or even between cultures. An artistically expressive film, graphic, television, audio, or multimedia program can convey a culture’s ethos and ideology, its shared values and common experiences, in unequaled and innovative ways. Works of art can communicate an artist’s unique insight into his or her own self, culture, and medium of expression. Artists often experiment with new expressive techniques and devices, presenting ordinary experiences in new and provocative ways. They can challenge a viewer’s preconceptions and stimulate a serious and profound reexamination of personal or cultural goals and values. They can also reinforce traditional conceptions and cultural values.

PRODUCTION AESTHETICS

Media production requires more than a mastery of technology and techniques. It is also an artistic process that demands creative thinking and the ability to make sound aesthetic judgments. How should you approach a specific topic? What techniques should you use? Important aesthetic choices have to be made. To make these decisions, you must be aware of many different possibilities and approaches. Every production choice implicitly or explicitly involves aesthetics. Some production techniques go unnoticed and enhance an illusion of reality, for example, whereas others are devices that call attention to themselves and the production process. The aesthetic alternatives from which you must choose at each stage of the production process can be divided into three basic categories: realism, modernism, and postmodernism.

Realism

A realist approach to production creates and sustains an illusion of reality. Realist techniques rarely call attention to themselves. Spaces seem to be contiguous to one another within a specific scene, and time seems to flow continuously and without interruption, similar to our experience of the everyday world.

Many Hollywood films, for example, sustain an illusion of reality and make us forget that we are watching a movie. This illusion of reality in Hollywood films is based, however, on stylistic conventions that audiences readily accept as real. Conventional editing techniques, such as matching a character’s action over a cut from one shot to the next shot in traditional Hollywood films, help to sustain an illusion of reality, and are often referred to as realist conventions of classical Hollywood cinema (Figure 2.11).

FIGURE 2.11 A realist production strives to create an illusion of reality through spatial and temporal continuities that mirror real-life experiences, as Tom Hanks and his platoon are depicted in the Dreamworks production of Saving Private Ryan. (Courtesy of Dreamworks Pictures.)

image

As William Earle has suggested in his cogent description of modernism as a revolt against realism in films, realist films, which include many classical Hollywood films, define reality as familiar, recognizable, and comprehensible. Realist art feels most at home living among familiar things in their familiar places, or among persons with recognizable characters acting or suffering in comprehensible ways (Earle, “Revolt Against Realism in the Films,” Journal of Aesthetics and Art Criticism, 27(2), Winter 1968).

Modernism

A modernist approach to production, which is reflected by many avant-garde works of video and film, often calls attention to forms and techniques themselves. Modernist works make no attempt to create a realistic world that is familiar, recognizable, and comprehensible. A modernist media artist instead feels free to explore the possibilities and limitations of the audio, video, or film medium itself without sustaining an illusion of reality (Figure 2.12).

FIGURE 2.12 A modernist production goes beyond realism to create an artist’s symbolic or imaginative world, as is depicted in Tim Burton’s production of Big Fish, in which Ewan McGregor tries to come to an understanding of his past life and relationship with his father. (Courtesy of Columbia Pictures.)

image

As viewers of modernist works, we often see familiar objects and events portrayed in a new light through the use of innovative techniques. Modernist art often appears less objective than realist art. Modernist works sometimes probe the subjectivity or inner psychological world of the individual artist/creator. In addition to self-expression, modernist productions often reflect feelings of ambiguity, as opposed to objective certainty. Time is not always continuous, and space is not always contiguous. The surrealist, dreamlike images in paintings by Salvador Dali, and in early avant-garde films such as Dali and Luis Bunuel’s Un Chien Andalou (An Andalusian Dog, 1929), illustrate these aspects of modernism, as do some experimental dramatic films, such as the beginning of Swedish director Ingmar Bergman’s Persona (1966). Many other European art films and some contemporary music videos are distinctly modernist in approach as well.

Postmodernism

The emergence of digital technologies coincides with the rise of postmodernist films, videos, and audio art. Postmodernism literally means “after” or “beyond” modernism. Whereas modernist art emphasizes the individual artist’s self-expression and the purity of artistic form, postmodernist art is anything but pure. It often features a collage or grab bag of past styles and techniques, rather than a pure or simple form. What emerges from this menagerie of styles and grab bag of techniques is not an individual artist’s self-expression but rather a hodgepodge of different expressive forms from different periods and artists.

The absence of a single artist as a controlling presence (controlling what a piece of art means or how it should be viewed) encourages the viewer or listener to interact with the artwork, to play with it, and reshape it into another form. The artist doesn’t control the meaning of a postmodernist text; the viewer or listener does. Postmodernist works often question human subjectivity itself. Sometimes they seem to suggest that the world is made of simulations rather than real experiences. Human characters can become indistinguishable from cyborgs in postmodernist films, just as individual artists become less distinguished by their unique styles and somewhat indistinguishable from audiences who create their own texts through viewer/listener free play.

Postmodernist films and television programs often combine popular culture with classical and elite art, mixing a variety of traditionally distinct genres or modes, such as documentary and dramatic fiction, and encouraging viewer and listener interaction with (if not the actual recreation of) art works. Postmodernist art borrows images and sounds from previous popular and classical works of art with which most viewers and listeners are often already familiar. Rather than inventing entirely new and perplexing original forms (modernism) or trying to establish explicit connections to the real world (realism), postmodernist art plays with previously developed images and sounds and recreates a self-contained, playfully simulated world of unoriginal forms, genres, and modes of expression. Interactive multimedia works, such as Peter Gabriel’s CD-ROM music videos, allow viewers and listeners to adjust the volume and remix separate music tracks, creating their own versions of his music. A modernist piece of music would never allow the viewer/listener to freely play in this way with the work of art, because the artwork is presumed to have been perfected and completed by the artist (Figure 2.13).

FIGURE 2.13 A postmodernist production far exceeds realism and modernism by suggesting that the audience must contribute to the production by mixing genres, such as science fiction and hard-boiled detective, as well as other styles. A postmodernist production may question, for example, what it means to be a human by making cyborg simulations indistinguishable from “real” people, such as the character depicted by Harrison Ford in the Ladd Company’s production of Blade Runner. (Courtesy of the Ladd Company.)

image

Postmodernism may be more difficult to define than realism or modernism because it is a more recent development, and it is still evolving. Nonetheless, some of the main characteristics associated with postmodernism are already apparent. These include the production of open-ended works that encourage viewer participation and play, rather than a concern for the human subjectivity of either the individual artist or the main character in a fictional drama or a social actor in a documentary or docudrama. Postmodernist art frequently offers a pastiche or collage of simulated images and sounds drawn from a variety of different modes and genres (both fiction and nonfiction, for example), a feeling of nostalgia for the past, a plundering of old images and sounds from previous works, simulations rather than “real experiences,” and a mixture of classical and contemporary forms as well as popular and elite culture.

Combining Aesthetic Approaches

Obviously the three aesthetic movements and choices that have just been described (realism, modernism, and postmodernism) are neither definitive nor exhaustive. Many projects combine aesthetic approaches in various ways. Modernist sequences can be incorporated into realist movies, such as dream sequences in classical Hollywood cinema. Some Hollywood movies, such as The Lord of the Rings series (2001–2004), The Triplets of Belleville (2003), and The Last Samurai (2003), seem to combine realism with postmodernism. The choice of one aesthetic approach is neither absolute nor irreconcilable with other approaches. But although different approaches can be combined, the decision to combine them should be a matter of conscious choice.

Because aesthetic decisions are basic to different stages of the production process, many of the chapters in this text begin with a discussion of realist, modernist, and postmodernist approaches. This is followed by a discussion of production practices that are relevant to the use of digital and analog technologies. Combined with actual hands-on production experience, this text provides the basic information needed to make valid production decisions with the confidence that many possible alternatives have been explored and the best possible approach and techniques have been selected.

A SHORT HISTORY OF AUDIO, FILM, AND VIDEO PRODUCTION TECHNOLOGY

The basic technology for the eventual development of radio, audio recording, motion pictures, and television was available as early as the beginning of the nineteenth century. The predecessors for motion pictures may be considered to be still photography; for audio systems, the telegraph and telephone; and for television, the electrical discharge of light-sensitive materials. Each of these was discovered or invented before 1840 (Figure 2.14).

FIGURE 2.14 Early media production equipment by today’s standards was primitive and ineffective. But production techniques that were developed to produce shows on that early equipment still provide the basis for today’s digital media production. (A) An early twentieth-century film camera compared with (B) a modern 35 mm camera. (Courtesy of Arri USA, Inc.)

image

From 1839 to the end of the nineteenth century, experiments and practical models of both selenium-based electrical systems and rotating disc-based systems were developed to convert and transmit visual images. In 1877, Thomas Edison produced a primitive mechanical cylindrical audio-recording device called the phonograph. During the last half of the nineteenth century, a variety of toylike machines, such as the Thaumatrope, Phenakistoscope, and Zoetrope, and lantern shows with delightful titles such as Phantasmagoria and Magasin Pittoresque, were used to display projected pictures that appeared to move.

The perceptual mechanism underlying the illusion on which motion pictures depend was identified as an instance of the phi phenomenon by the early twentieth-century perceptual psychologists Wertheimer and Ternus. Gestalt psychologists were fascinated by perceptual tricks and illusions because they provided a convenient means of studying the way our brains process sensory information. The phi phenomenon produces apparent motion out of stationary lights or objects. It occurs when two lights, separated by a short distance, are flashed or strobed very rapidly. Above a certain threshold of flashes per second, the human eye is deluded into thinking that one light is moving, rather than that two stationary lights are flashing. This same phenomenon may help to explain the perception of apparent motion from rapidly flashed still photographs. Some researchers believe the mind fills in the gaps between frames and produces apparent, not real, motion.

A period of invention at the end of the nineteenth century brought about the telephone, the electric telescope designed to convert moving images into an electrical signal, and early carbon, crystal, and ceramic microphones. Before the turn of the century, the disc record player and recorder were improved in France, where motion pictures also had their beginnings. In this country, Edison and W. K. L. Dickson developed a workable motion picture camera and projector, and George Eastman invented the flexible film base that made motion pictures possible. The century ended with the first “wireless” broadcasts, film projected on a screen for an audience, and a working model of a wire recorder.

Television experiments continued into the early twentieth century, alternating between rotating disc and electrical scan systems. Motion picture sound systems early in the century utilized sound recorded on discs, with primitive methods designed to maintain synchronization between picture and sound. Many of the frustrations of workers in all three industries (motion pictures, radio, and television) were partially solved by Lee De Forest’s invention of the triode amplifying vacuum tube. This invention provided the means to send voices over the air for the first time and allowed the motion picture industry to use sound-reinforcing systems in theaters.

Before 1900, the all-electronic television system now in use was described by a variety of experimenters, but it took 17 years before a practical model became operational. Today’s television technology is based on light coming through a camera lens and striking the light-sensitive surface of one or more charge-coupled device (CCD) chips in the camera. Fluctuations in current on the surface of the chip are read by the circuitry of the camera as direct variations in the light striking the surface. These fluctuations in electrical current are then fed to a television screen, which reverses the process. Bright light striking specific points on the camera pickup chip corresponds to bright light emitted by the phosphors of the television receiver’s monitor. A television screen is scanned completely 30 times every second; thus, the images move at a speed of 30 frames per second, rather than the 24 frames per second of sound film. Some believe television, like film, depends on the phi phenomenon to produce apparent motion, but it also relies on persistence of vision to fuse the continuous scanning of the picture tube into complete frames of picture. Persistence of vision refers to the temporary lag in the eye’s retention of an image, which some researchers believe may fuse one image with those that immediately precede or follow. This phenomenon alone does not explain apparent motion, because the fusion of images in the same position within the frame would result in a confused blur, rather than the coherent motion of objects.

The first two-color, two-negative Technicolor film process was developed in 1917. It was followed three years later by the first AM radio stations in the United States receiving licenses and some experimental television stations being licensed to use the spinning disc system. By the early 1920s, the Hollywood motion picture industry had become pervasive enough to dominate foreign screens and to be threatened with domestic censorship. The first sound-on-film system was developed by De Forest in 1923, the same year that Vladimir Zworykin applied for a patent on the iconoscope television camera tube, which opened the way for an all-electronic television system. Philo Farnsworth first demonstrated an all-electronic system in the United States in 1927; John Logie Baird had demonstrated a mechanical system in Britain in 1925. Within that same decade, the recording industry moved from acoustical recording to electronic methods, and AT&T started the first radio network. Fox first used the Movietone sound-on-film system for newsreels, and Warner Bros. used the Vitaphone disc system for its first sound features.

During the 1930s, modern dynamic and ribbon microphones were invented, and both British and American inventors continued to experiment with audio wire recorders. By 1932, Technicolor had introduced its three-color, three-negative film process, and FM radio continued to be developed. German scientists perfected an audiotape recording system based on paper coated with iron oxide, and Eastman Kodak’s 16 mm film, originally introduced as an amateur format, quickly became popular with professional military, educational, and industrial filmmakers, as well as documentary producers.

Immediately preceding the entry of the United States into World War II, RCA promoted its all-electronic television system with the Federal Communications Commission (FCC), which later, in 1953, approved RCA’s compatible color system over the CBS rotating-disc color system.

Although World War II interrupted the rapidly expanding field of electronics, many developments in communication technology came from the war effort. The higher frequencies, miniaturization of equipment and circuits, and advances in radar all were perfected and later used in television and, eventually, computers. Following the war, magnetic tape became a standard for recording audio, television stations and receivers increased rapidly in number, and motion picture studios experimented with theater TV and close relationships with television stations and networks. The 1948 Paramount decree by the U.S. Supreme Court forced the film industry to divorce motion picture production and distribution from theater exhibition, bringing an end to the major studio era and stimulating greater independent production. The transistor was invented, CBS developed the 33⅓ rpm long-playing (LP) record, and RCA followed with the 45 rpm disc.

Television saved the record business by forcing radio stations to turn to all-music formats, and the motion picture industry felt compelled to turn to widescreen, 3D, and all-color films to compete with the small-screen, black-and-white television systems of the 1950s. Eventually, greater interaction occurred between film and television as film studios produced television series and feature films that were shown on television. By the middle of the decade, the FCC had approved the National Television System Committee (NTSC) color-TV standard, and stereo recordings on tape were marketed, leading to the development of multitrack recording techniques. Within the next two years, all three industries moved forward: television with the invention of the quadruplex videotape recorder, motion pictures with the Panavision camera and lens systems, and audio with the perfection of stereo discs.

The beginning of the rapid acceleration of technical developments occurred in 1959, when the integrated circuit was invented, leading to the development of computer chips. For the next 20 years, computers moved from room-sized operations that could perform limited calculations (by today’s standards) to pocket-sized computers and a variety of other applications priced for small companies and individuals. Within the next 10 years, professional helical videotape recorders and electronic editing of videotape were developed; satellites were launched to permit transmission of audio, visual, and digital information instantaneously worldwide; the FCC approved a stereo standard for FM; quadraphonic and digital sound systems were developed; and cable moved from the country to the cities. During the period of these great advances in the electronic communication fields, motion pictures also utilized the same inventions to improve sound recording, lighting and editing systems, and theater exhibition systems. The expansion of cable television brought television to many rural areas that were out of reach of the TV stations of the time.

During the 1970s, miniaturization produced smaller cameras, recorders, and receivers, leading to new production techniques in both radio and television. Videotape formats began to proliferate, with systems both for the home (Betamax and VHS) and for the professional (U-matic and 1-inch). Cable became a major player in distributing both films and video productions as pay channels took to the satellites. HBO provided movies, ESPN provided sports, and CNN provided 24-hour news.

Technical advances continued through the 1980s, with two events setting the stage for massive changes in all communication fields: In 1981, HDTV was first demonstrated, and in 1982, a consent decree between the Department of Justice and American Telephone and Telegraph (AT&T) separated the long-distance and equipment-supply portions of the corporation from the individual local telephone systems. Less earth-shattering but still important developments were the authorization of low-power TV (LPTV) stations, direct broadcast satellite (DBS) systems, the invention and rapid spread of compact discs (CDs), and the agreement on a Musical Instrument Digital Interface (MIDI) standard. The FCC approved a stereo TV standard, and RCA introduced the charge-coupled device (CCD) camera, which used computer chips in place of camera tubes. By the middle of the 1980s, digital systems were used in new videotape formats, motion picture editing and synchronizing systems, and digital audio decks and editing systems (Figure 2.15).

FIGURE 2.15 (A) Television camera tubes passed through a series of modifications in shape and size from the 1930s until chips replaced tubes. From left to right: an Image Orthicon (IO) tube, the first practical camera tube, used until the mid-1960s. The original color cameras required four IOs, one each for red, green, and blue and a fourth for luminance. Later cameras used the green tube for luminance. Next to the IO are a series of vidicon, saticon, and newvicon tubes, each smaller and offering higher resolution, requiring lower power, and allowing a smaller camera. (B) The Image Orthicon tube’s two-inch, light-sensitive surface compared to the one-half-inch or smaller, light-sensitive surface of a camera chip.

Fox, United Paramount Network (UPN), and Warner Bros. (WB) television networks began operations as the other three networks—American Broadcasting Company (ABC), Columbia Broadcasting System (CBS), and National Broadcasting Company (NBC)—changed ownership. Later, UPN and WB merged to form the CW, which stands for CBS plus Warners. Experiments with teletext and videotex found limited use, and a once-failed system, the videodisc, returned and began making inroads in the home market. Professional videotape formats shrank in size as the half-inch Betacam and Recam were followed by Betacam SP and MII, which became the standards of production and broadcast studios before digital cameras and recording formats were developed.

In the 1990s, computer workstations and digital audiotape (DAT) integrated audio production into a complete digital world, and nonlinear digital editing systems for video programs became the standard. The motion picture industry turned to digitized video for postproduction and special effects, as the two visual industries began to share many more technologies. Black-and-white movies were colorized, and graphics were created through the expanded use of digital systems. The computer steadily came to encompass virtually the entire field of communications through a rapid sequence of developing technologies. Interactive multimedia production on CDs incorporated audio, video, text, and graphics into interactive computer programs.

At the beginning of the twenty-first century, the production areas of audio, video, and motion pictures continued to merge, overlap, and grow closer together through the use of digital technology and equipment. A fourth area of production, multimedia, emerged during the last decade of the twentieth century and became a dominant force in media production of the twenty-first century. By FCC ruling, broadcasters were required to replace analog NTSC broadcasting with digital TV, in one of 18 different formats, by February 2009. By the end of the first quarter of 2004, over 99 percent of all television homes had access to digital television broadcasts over the air, but fewer than 16 percent actually owned and used full digital HDTV receivers and monitors. More than 1,000 TV stations were broadcasting some programs in digital format. Whether a TV signal can be viewed and heard in its full digital form depends on the HD capability of the receiving equipment in each individual home. This transition from analog to digital, one portion of the audience at a time, is comparable to the conversion from monochrome to full-color television in the 1950s, except that the new signal is not compatible with NTSC, and there will no longer be any analog broadcasts or reception. The availability of digital TV and HDTV equipment, along with consumer interest, increased during the first decade of the century.

Distribution of home video on the digital video disc, or digital versatile disc (DVD), quickly surpassed that of VHS videocassettes. The high quality of the audio and video information contained on DVDs, as opposed to VHS cassettes, including Dolby Digital five- and six-track audio for home theater sound and component video for various widescreen and HDTV formats, stimulated the proliferation of DVDs. The vast number of VHS decks in homes kept VHS relatively popular as a home video recording system, but the ability to record (burn) directly onto DVD or CD-ROM discs on many personal computers made the end of VHS a certainty. Newer forms of DVD, including the Blu-ray disc, have been developed and probably will continue to be invented in order to pack more high-quality information on each side of the disc. By late 2003, high-quality, easy-to-operate digital recording and editing equipment and software had become a reality on most home computers. Radio stations, graphics, animation, and postproduction techniques all came to rely heavily on digital technology.

The problem of copyright violations through the use of MP3 technology for downloading and distributing music reached a legal stalemate in 2003. As music producers lowered their prices and the Recording Industry Association of America (RIAA) filed hundreds of civil lawsuits, the number of illegally downloaded recordings began to decline. Apple and other companies introduced paid download services that provided a practical and affordable alternative to illegal downloading of music. Comparable systems are in development for video programs, but the battle over copyright infringement will continue.

The wars fought in the first five years of the century saw the development of miniaturized cameras, direct-to-satellite video and audio feeds from cell phones, and other digital technologies used by both the military and news-gathering organizations. Handheld or camera-mounted satellite transmission equipment allowed live footage to be fed to the world from the battlefield. Cell phones allow reporters to feed live sound and low-resolution video from the field without the need for any base operation. Cell phones also serve as video receivers via the web, and more and more short-form, small-screen video programming is being produced for cell phones and personal digital assistants (PDAs). Miniature and night-vision cameras provide footage not attainable before the conflicts in the Middle East. Satellite maps show accurate locations and relationships of battles, cities, and areas of conflict as background for stories. New high-definition cameras mounted on guns show targets and the results of weapons fire. High-frequency radio signals allow for smaller and more portable communication equipment. Many of the newer digital systems developed for the military will reach both consumer and professional communication operations by the time these conflicts end or shortly thereafter.

Summary

Production is divided into three stages: preproduction, production, and postproduction. Preproduction designates all forms of planning that take place before actual recording, including producing, production management, and writing. Production begins with the director’s preparations to record sounds and images. It includes all aspects of sound and image recording. Postproduction refers to the last stage of production, when the editing of recorded images and sounds begins and the completed project is distributed and exhibited.

Digital technology has revolutionized media production and is replacing analog technology in a number of media production areas, but analog technologies, such as film, continue to play important roles in each stage of production. Digital technology has opened up a wide range of fascinating production and postproduction techniques, such as special effects that have begun to alter conventional notions of history and reality. Digital technology significantly reduces, if not eliminates, the degradation of sounds and images when copies are made. It has also blurred traditional distinctions and has brought media technologies closer together.

Careful advance planning during the preproduction stage is the best way of avoiding negative production experiences. A producer initiates a project by drafting a proposal, obtaining financial support, and attempting to circumvent the operation of Murphy’s law: Anything that can go wrong, will go wrong. Production can take place either in a studio or on location, depending on the nature of the events to be recorded.

The production team is usually organized somewhat hierarchically, in the sense that a producer or director is in charge, and everyone is accountable to a staff head who specializes in a particular area. But to work together effectively, a production team should also be cooperatively organized, so that individual specialists function collectively as a team.

Visualization is the creative process of image and sound construction. Video, film, and multimedia record moving images and sounds. These recordings can be edited. Writers and directors must be skilled at visualization. They must understand the relation between abstract words in a script and the concrete sounds and images that are recorded and edited.

There are three basic aesthetic approaches to media production: realism, modernism, and postmodernism. A realist approach relies on techniques that enhance an illusion of reality, a modernist approach emphasizes the artist’s active shaping and manipulation of his or her material, and a postmodernist approach offers a pastiche of simulated images and sounds, questioning human subjectivity and the centrality of the individual artist. The choice of an aesthetic approach guides the selection of specific production techniques.

The histories of film, television, and audio technology are interrelated and overlap. Film is a nineteenth-century technology based on photochemical means. Television and video technology, developed commercially somewhat later, reproduces images by electronic means. Audio technology developed simultaneously with film and television because both visual media eventually required sound to match their pictures. All three media underwent substantial changes during the twentieth century. During the twenty-first century, all media production technology used in video, film, and audio productions will continue to converge within the realm of digital formats. Computers will continue to play a greater role in all aspects of media production from preproduction, to production, on through postproduction stages. Solid-state digital equipment will replace all equipment requiring moving parts.

EXERCISES

1.  Find examples of realistic, modernistic, and postmodernistic films. Compare how each tells its story, and describe why each fits in the category you have chosen.

2.  Find examples of television/cable programs that fit the three aesthetic categories. Compare how each tells its story, and describe why each fits in the category you have chosen.

3.  Using both the Internet and your library, find references to the early development of technologies that led to modern-day motion picture, television, and audio production techniques. Arrange your findings in chronological order.

4.  Watch an evening of network programming on one network. Make a list of the programs, and determine by watching whether each program was originally produced on film or video (or totally digitally) and whether the production used a single camera or multiple cameras.

5.  Call a local television station and ask to visit the station for a tour. While there, ask if an organizational chart of the station is available or if someone would explain while you take notes how the station is organized by departments and what each department is responsible for.

6.  Follow the same steps as requested in Exercise 5 but for a recording studio, a film studio, or a graphics studio.

Additional Readings

Badal, Sharon. 2008. Swimming Upstream: A Lifesaving Guide to Short Film Distribution, Focal Press, Boston.

Benedetti, Robert. 2002. From Concept to Screen: An Overview of Film and TV Production, Allyn & Bacon, Boston.

Block, Bruce. 2008. The Visual Story: Creating the Visual Structure of Film, TV and Digital Media, second ed. Focal Press, Boston.

Bordwell, David, Thompson, Kristin. 2006. Film Art, eighth ed. McGraw-Hill, New York.

Braudy, Leo, Cohen, Marshall. 2004. Film Theory and Criticism, sixth ed. Oxford University Press, New York.

Cook, David A. 2004. A History of Narrative Film, fourth ed. W.W. Norton & Company, New York.

Earle, William. 1968. Revolt against realism in films. Journal of Aesthetics and Art Criticism, 27(2), Winter.

Ellis, Jack C, McLane, Betsy A. 2005. A New History of Documentary Film, Continuum, New York, NY.

Everett, Anna, Caldwell, John T. 2003. New Media: Theories and Practices of Digitextuality, Routledge, New York.

Grant, August, Meadows, Jennifer. 2008. Communication Technology Update and Fundamentals, eleventh ed. Focal Press, Boston.

Gross, Lynne, et al. 2005. Programming for TV, Radio & The Internet: Strategy, Development & Evaluation, Focal Press, Boston.

Harrington, Richard, Weiser, Mark. 2008. Producing Video Podcasts: A Guide for Media Professionals, Focal Press, Boston.

Hofstetter, Fred. 2005. Internet Literacy, McGraw-Hill, New York.

Hutchison, Tom. 2008. Web Music Marketing and Promotion, Focal Press, Boston.

Levison, Louise. 2007. Filmmakers and Financing: Business Plans for Independents, fifth ed. Focal Press, Boston.

Orlik, Peter B. 2001. Electronic Media Criticism: Applied Perspectives, second ed. Erlbaum, Mahwah, NJ.

Perebinossoff, Phillipe. 2008. Real-World Media Ethics: Inside the Broadcast and Entertainment Industries, Focal Press, Boston.

Rayburn, Dan. 2007. Streaming and Digital Media: Understanding the Business and Technology, Focal Press, Boston.

Roberts-Breslin, Jan. 2007. Making Media: Foundations of Sound and Image Production, second ed. Focal Press, Boston.

Sterling, Christopher H, Kittross, John Michael. 2002. Stay Tuned: A History of American Broadcasting, third ed. Erlbaum, Mahwah, NJ.

Udelson, Joseph H. 1982. The Great Television Race: A History of the American Television Industry, 1925–1941, University of Alabama Press, Tuscaloosa, AL.

Wasko, Janet, MacDonald, Paul. 2008. Contemporary Hollywood Film Industry, Wiley & Sons, Hoboken, NJ.
