Chapter 12

Sound Design

Myths and Realities

As my partner Ken Sweet and I immersed ourselves in sound effect development chores for John Carpenter's The Thing, we decided to try using a sound designer for the first time. The term had hardly been born, and few pictures presented the sound crew with the time and budget to allow experimentation. We called some colleagues to get recommendations for sound design talent.

The first place we contracted used a 24-track machine, something we did not use much in feature film sound work. The concept of layering sounds over themselves for design purposes appeared promising. The film-based Moviola assembly at sound-editorial facilities like ours just could not layer and combine prior to the predub rerecording phase. Ken and I sat patiently for half the day as the multitrack operator ran the machine back and forth, repeatedly layering up the same cicada shimmer wave. The owner of the facility and the operator kept telling us how “cool” it would sound, but when they played their compilation back for us, we knew the emperor had no clothes on, at least at this facility.

We called around again, and it was recommended that we try a young man who used something called a Fairlight. We arranged for a design session and arrived with our 1/4” source material. We told the eager young man exactly what kinds of sounds we needed to develop. Ken and I could hear the concepts in our heads, but we did not know how to run the fancy gear to expose the sounds within us. We explained the shots we wanted, along with the specially designed cues of sound desired at those particular moments. The young man turned to his computer and keyboard in a room stacked with various equipment rented for the session and dozens of wire feeds spread all over the floor.

Once again, Ken and I sat patiently. An hour passed as the young man dabbled at this and fiddled with that. A second hour passed. By the end of the third hour, I had not heard one thing anywhere near the concepts we had articulated. Finally, I asked the “sound designer” if he had an inkling of how to make the particular sound we so clearly had defined.

He said that he actually did not know how to make it; that his technique was just to play around and adjust things until he made something that sounded kind of cool, and then he would lay it down on sound tape. His previous clients would just pick whatever cues they wanted to use from his “creations.”

Deciding it was going to be a long day, I turned to our assistant and asked him to run to the nearest convenience store to pick up some snacks and soft drinks. While I was digging in my pocket for some money, a quarter slipped out, falling onto the vocoder unit. Suddenly, a magnificent metal shring-shimmer ripped through the speakers. Ken and I looked up with a renewed hope. “Wow! Now that would be great for the opening title ripping effect! What did you do?”

The young man lifted his hands from the keyboard. “I didn't do anything.”

I glanced down at the vocoder, seeing the quarter. It was then that I understood what had happened. I dropped another quarter, and again an eerie metal ripping shring resounded. “That's it! Lace up some tape; we're going to record this.”

Ken and I began dropping coins as our sound designer sat helpless, just rolling tape to record. Ken started banging on the vocoder, delivering whole new variants of shimmer rips. “Don't do that!” barked the young man. “You'll break it!”

“The studio will buy you a new one!” Ken snapped back. “At least we're finally designing some sound!”

Ken and I knew that to successfully extract the sounds we could hear within our own heads, we would have to learn and master the use of the new signal-processing equipment. It was one of the most important lessons we learned.

I always have had a love-hate relationship with the term sound designer. While it suggests someone with a serious mindset for the development of a soundtrack, it also rubs me the wrong way because so many misunderstand and misuse what sound design truly is, cheapening what it has been, what it should be, and what it could be. This abuse and ignorance led the Academy of Motion Picture Arts and Sciences to decide that the job title sound designer would not be eligible for any Academy Award nominations or subsequent awards.

I have worked on megamillion-dollar features that have had a sound designer contractually listed in the credits, whose work was not used. One project was captured by a sound-editorial company only because it promised to contract the services of a particular sound designer, yet during the critical period of developing the concept sound effects for the crucial sequences, the contracted sound designer was on a beach in Tahiti. (Contrary to what you might think, he was not neglecting his work. Actually, he had made an agreement with the supervising sound editor, who knew he had been burned out from the previous picture. They both knew that, to the client, sound design was a perceived concept—a concept nonetheless that would make the difference between contracting the sound job or losing the picture to another sound-editorial firm.)

By the time the sound designer returned from vacation, the crucial temp dub had just been mixed, with the critical sound design already completed. Remember, they were not sound designing for a final mix. They were designing for the temp dub, which in this case was more important politically than the final mix, because it instilled confidence and comfort for the director and studio. Because of the politics, the temp dub would indelibly set the design concept, with little room for change.

Regardless of what you may think, the contracted sound designer is one of the best in the business. At that moment in time and schedule, the supervising sound editor only needed to use his name and title to secure the show; he knew that several of us on his editorial staff were more than capable of accomplishing the sound design chores for the picture.

THE “BIG SOUND”

In July 1989 two men from Finland came to my studio: Antti Hytti was a music composer, and Paul Jyrälä was a sound supervisor/mixer. They were interested in a tour of my facility, and the transfer bay in particular. I proudly showed them through the sound-editorial rooms as I brought them to the heart of our studio: transfer. I thought it odd that Paul simply glanced over the Magna-Tech and Stellavox, only giving a passing acknowledgment to the rack of processing gear. He turned his attention to studying the room's wraparound shelves of tapes and odds and ends.

Paul spoke only broken English, so I turned to the composer with curiosity. “Antti, what is he looking for?”

Antti shrugged and then asked Paul. After a short interchange Antti turned back to me. “He says that he is looking for the device that makes the Big Sound.”

I was amused. “There is no device that makes the Big Sound. It's a philosophy, an art—an understanding of what sounds good together to make a bigger sound.”

Antti interpreted to Paul, who in turn nodded with understanding as he approached me. “You must come to Finland so we make this Big Sound.”

I resisted the desire to chuckle, as I was up to my hips in three motion pictures simultaneously. I shook their hands as I wished the two men well, assuming I would not see them again. Several weeks later, I received a work-in-progress video of the picture Paul had asked me to take part in. The film, still in the throes of picture editorial, was falling increasingly further behind schedule. My wife and I watched the NTSC (National Television System Committee) transfer of the PAL (phase alternating line) video as we sat down to dinner. I became transfixed as I watched images of thousands of troops in this 1930s-era epic, with T-26 Russian armor charging across snow-covered battlefields.

The production recordings were extremely good, but like most production tracks, focused on spoken dialog. In a picture filled with men, tanks, airplanes, steam trains, and weaponry, much sound effect work still had to be done. The potential sound design grew within my head, and my imagination started filling the gaps and action sequences. If any picture cried out for the Big Sound, this was the one. Talvisota (The Winter War) was the true-life story of the war between Finland and the Soviet Union in 1939. It proved to be one of the most important audio involvements of my professional career. It was not a question of money. It was an issue of passion. The heart and soul of a nation beckoned from the rough work-in-progress video.

I made arrangements to go to Helsinki and work with Paul Jyrälä to co-supervise this awesome challenge. I put together a bag filled with sound effect DAT tapes and a catalog printout to add to the sound design lexicon that Paul and I would have to use.

This project was especially challenging, as this was before nonlinear workstations. The film was shot in 35 mm at a 1.66:1 aspect ratio. They did not use 35 mm stripe or fullcoat for sound editing. But they did use 17.5 mm fullcoat, and had one of the best 2-track stereo transfer bays I have had the privilege to work in. To add to the challenge, we had only nine 17.5 mm Perfectone playback machines on the rerecording stage, which meant that we had to be extremely frugal about how wide our “A,” “B,” and “C” predub groups could be built out. We had no Moviolas, no synchronizers, no Acmade coding machines, no cutting tables with rewinds, and no Rivas splicers. We did have two flatbed Steenbecks, a lot of veteran know-how, and a sauna.

AMERICAN SOUND DESIGN

Properly designed sound has a timbre all its own. Timbre is the vibration that resonates emotionally with the audience, the way a violin's voice does. Timbre sets good sound apart from a pedestrian soundtrack.

Sound as we have seen it grow in the United States has distinguished American pictures on a worldwide market. Style, content, slickness of production, and rapidity of storytelling all made the sound of movies produced in the United States a benchmark and inspiration for sound craftsmanship throughout the rest of the world. For a long time, many foreign crews considered sound only a background behind the actors. They have only in recent years developed a Big Sound concept of their own.

For those clients who understand that they need a theatrical soundtrack for their pictures, the first hurdle is not understanding what a theatrical soundtrack sounds like but how to achieve it. When they hear soundtrack, the vast majority of people think only of the music score. The general audience believes that almost all nonmusical sounds are actually recorded on the set when the film is shot. It does not occur to them that far more time and effort goes into the nonmusical portion of the audio experience of the storytelling than into composing and orchestrating the theme of the music score itself.

Some producers and filmmakers fail to realize that theatrical sound is not a format. It is not a commitment to spend gigadollars or hire a crew the size of a combat battalion. It is a philosophy and an art form that only years of experience can help you understand.

The key to a great soundtrack is its dynamics and variation, with occasional introductions of subtle, unexpected things: the hint of hot gasses on a close-up of a recently fired gun barrel, or an unusual spatial inversion, such as a delicate sucking-up sound juxtaposed against a well-oiled metallic movement for a shot of a high-tech device being snapped open.

The Sound Design Legacy

When you ask a film enthusiast about sound design, the tendency is to recall legendary pictures with memorable sound, such as Apocalypse Now and the Star Wars series. I could not agree more. Many of us were greatly influenced by the work of Walter Murch and Ben Burtt (who worked on the above films, respectively). They not only had great product opportunities to practice their art form, but they also had producer-directors who provided the latitude and financial means to achieve exceptional accomplishments.

Without taking any praise away from Walter or Ben, let us remember that sound design did not begin in the 1970s. Did you ever study the soundtracks to George Pal's The War of the Worlds or The Naked Jungle? Have you considered the low-budget constrictions that director Robert Wise faced while making The Day the Earth Stood Still or the challenges confronting his sound-editorial team in creating both the flying saucer and alien ray weapons? Who dreamed up using soda fizz as the base sound effect for the marabunta, the army ants that terrorized Charlton Heston's South American plantation in The Naked Jungle? Speaking of ants, imagine thinking up the brilliant idea of looping a squeaky pick-up truck fan belt for the shrieks of giant ants in Them! Kids in the theater wanted to hide for safety when that incredible sound came off the screen.

When these fine craftspeople labored to make such memorable sound events for your entertainment pleasure, they made them without the help of today's high-tech digital tools—without harmonizers or vocoders, without the hundreds, if not thousands, of plug-in variants to shape and sculpt audio cues. They designed these sounds with their own imaginations, working with the tools they had at hand and being extraordinarily innovative: understanding what sounds to put together to create new sound events, how to play them backward, slow them down, cut, clip, and scrape them with a razor blade (when magnetic soundtrack became available in 1953), or paint them with blooping ink (when they still cut sound effects on optical track 35 mm film).

DO YOU DO SPECIAL EFFECTS TOO?

In the late summer of 1980 I had completed Roger Corman's Battle Beyond the Stars. I was enthusiastic about the picture, mainly because I had survived the film's frugal sound-editorial budget as well as all the daily changes due to the myriad special effect shots that came in extremely late in the process.

I had to come up with seven different-sounding spacecraft, each with unique results, such as the Nestar ship, which we created from human voices—the Community Choir from my hometown college of Coalinga. Choral director Bernice Isham had conducted her sopranos, altos, tenors, and basses through a whole maze of interesting vocal gymnastics, which were later processed to turn 40 voices into million-pound-thrust engines for the Nestar ship, manned by clone humanoids. You can listen to several examples of the sound effect creations developed from the choral voices, which continue to be used in many films to this day.

We had developed Robert Vaughn's ship from the root recordings of a dragster car and then processed it heavily to give it a menacing and powerful “magnetic-flux” force—just the kind of quick-draw space chariot a space-opera gunslinger would drive.

The day after the cast and crew screening, I showed up at Roger's office to discuss another project. As was the custom, I was met by his personal secretary. I could not help but beam with pride regarding my work on Battle, so I asked her if she had attended the screening, and if so, what did she think of the sound effects?

She had gone to the screening, but she struggled to remember the soundtrack. “The sound effects were okay, for what few you had.”

“The few I had?”

The secretary shrugged. “Well, you know, there were so many special effects in the picture.”

“Special effects! Where do you think the sound for all of those special effects comes from?” I snapped back.

She brightened up. “Oh, do you do that too?”

“Do that too?” I was dumbfounded. “Who do you think makes those little plastic models with the twinkly-lights sound like powerful juggernauts? Sound editors do, not model builders!”

It became obvious to me that the viewing audience either cannot separate visual special effects from sound effects or has a hard time understanding where one ends and the other begins.

A good friend of mine got into hot water once with the Special Effects Committee when, in a heated argument, he had the temerity to suggest that, to have a truly fair appraisal of their work in award-evaluation competition, they needed to turn the soundtrack off. After all, the work of the sound designer and the sound editors was vastly affecting the perception of visual special effects. The committee did not appreciate or heed my friend's suggestion, even though he had over 30 years and 400 feature credits of experience behind his statement.

I have had instances where visual special effects artists would drop by to hear what I was doing with their work-in-progress “animatic” special effect shots, only to be inspired by something that we were doing that they had not thought of. In turn, they would go back to continue work on these shots, factoring in new thinking that had been born out of our informal get-together.

SOUND DESIGN MISINFORMATION

A couple of years ago I read an article in a popular audio periodical in which a new “flavor-of-the-month” sound designer had been interviewed. He proudly boasted of something supposedly no one else had done: he had synthesized Clint Eastwood's famous .44 Magnum gunshot from Dirty Harry into a laser shot. I sighed. We had done the same thing nearly 20 years before for the Roger Corman space opera Battle Beyond the Stars. For nearly two decades Robert Vaughn's futuristic hand weapon had boldly fired the sharpest, most penetrating laser shot imaginable, which we developed from none other than Clint Eastwood's famous pistol.

During the mid-1980s, I had hired a young enthusiastic graduate from a prestigious Southern California university film school. He told me he felt very honored to start his film career at my facility, as one of his professors had lectured about my sound design techniques for Carpenter's The Thing.

Momentarily venerated, I felt a rush of pride, which was swiftly displaced by curious suspicion. I asked the young man what his professor had said. He joyously recounted his professor's explanation of how I had deliberately and painstakingly designed the heartbeat throughout the blood test sequence in reel 10, a sound that subconsciously drew the audience into a rhythmic pulse and brought them to a moment of terror.

I stood staring at my new employee with bewilderment. What heartbeat? My partner Ken Sweet and I had discussed the sound design for the project very thoroughly, and one sound we absolutely had stayed away from because it screamed of cliché was any kind of heartbeat.

I could not take it any longer. “Heartbeat?! Horse hockey! The studio was too cheap to buy fresh fullcoat stock for the stereo sound effect transfers! They used reclaim that had been sitting on a steel film rack in the hallway across from the entrance to Dubbing 3. Nobody knew it at the time, but the steel rack was magnetized, which caused a spike throughout the stock. A magnetized spike can't be removed by bulk degaussing. Every time the roll of magnetic stock turns 360° there is a very low frequency ‘whomp.’ You can't hear it on a Moviola. We didn't discover it until we were on the dubbing stage, and by then it was too late!”

The young man shrugged. “It still sounded pretty neat.”

Pretty neat? I guess I'm not as upset about the perceived sound design where none was intended as I am about the fact that the professor had not researched the subject on which he lectured. He certainly had never called to ask about the truth, let alone to inquire about anything of consequence that would have empowered his classroom teachings. He just made up a fiction, harming the impressionable minds of students with misinformation.
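As an aside, the rhythm of that magnetized “whomp” follows directly from film mechanics: the spike prints once per revolution of the roll, and 35 mm-family magnetic stock travels at 18 inches (457.2 mm) per second at 24 fps. A rough sketch of the math, with roll diameters that are illustrative guesses rather than measurements from the actual reels:

```python
import math

FILM_SPEED_MM_S = 457.2  # 24 fps x 16 frames/ft = 90 ft/min = 18 in/s

def whomp_period(roll_diameter_mm: float) -> float:
    """Seconds between 'whomps': the magnetized spike passes the
    playback head once per revolution of the feeding roll."""
    return math.pi * roll_diameter_mm / FILM_SPEED_MM_S

# Near a small hub the pulses come quickly; at the outside of a
# full roll they slow to a sub-audible throb.
print(round(whomp_period(50), 2))   # 0.34 s between pulses
print(round(whomp_period(300), 2))  # 2.06 s between pulses
```

Either way, the pulse sits far below anything a Moviola's small speaker reproduces, which is consistent with the defect surfacing only on the dubbing stage.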

In that same sequence of The Thing I can point out real sound design. As John Carpenter headed north to Alaska to shoot the Antarctica base-camp sequences, he announced that one could not go too far in designing the voice of The Thing. To that end, I tried several ideas, with mixed results. Then one morning I was taking a shower when I ran my fingers over the soap-encrusted fiberglass wall. It made the strangest unearthly sound. I was inspired. Turning off the shower, I grabbed my tape recorder, and dangled microphones from a broomstick taped in place from wall to wall at the top of the shower enclosure.

I carefully performed various “vocalities” with my fingertips on the fiberglass—moans, cries, attack shrieks, painful yelps, and other creature movements. When John Carpenter returned from Alaska, I could hardly wait to play the new concept for him. I set the tape recorder on the desk in front of him and depressed the Play button.

I learned a valuable lesson that day. What a client requests is not necessarily what he or she really wants, or particularly what he or she means. Rather than developing a vocal characterization for a creature never truly heard before, both director and studio executives actually meant, “We want the same old thing everyone expects to hear and what has worked in the past—only give it a little different spin, you know.” The only place I could sneak a hint of my original concept for The Thing's vocal chords was in the blood-test scene where Kurt Russell stuck the hot wire into the petri dish and the blood leaped out in agony onto the floor.

The point at which the blood turned and scurried away was where I cut those sound cues of my fingertips on the shower wall, called “Tentacles tk-2.” I wonder what eerie and horrifying moments we could have conjured up instead of the traditional cliché lion growls and bear roars we were compelled to use in the final confrontation between Kurt Russell and the mutant Thing.

BOB GRIEVE AND THE WESTERN

Bob Grieve is one of the most creative and versatile supervising sound editors in Hollywood. He created dynamic and fulfilling soundtracks for movies such as Body Heat, The Big Chill, Wolfen, and Ghosts of Mississippi. But he has also created action-packed sounds for Robin Hood: Prince of Thieves, Wyatt Earp, Silverado, and Turbulence. Here he comments on two westerns that he supervised, his fabulous work on Silverado and Wyatt Earp:

Silverado

Silverado was a joy to work on. This was in the early days of temp dubs. The first temp pointed out to the director that they were killing far too many people in the movie. This came about from the number of gunshots that were cut into the movie, gunshots they had never heard before, and it made them come to that conclusion. In the days before complete temp dubs you had more time to customize the movie with all original sound. So all the guns are original. All the horses were recorded for the movie. I mapped out every move a horse made in the movie and went out and shot all of it. There were horses on every surface: horses running, jumping, trotting, walking, cantering, you name it, up hill, down hill, milling, everything.

Then we picked the most in-sync part of, say, a horse trot-by, and using a VariSpeed 1/4” deck, we tailored the horse hooves to sync as closely as our eyes could detect; then we went in and tightened up the sync during sound editorial. In the old days the editors just picked the best sync points, which are usually the start and finish.

Figure 12.1 Supervising sound editor Bob Grieve.

At one time when we were recording the Henry rifle, one of the ejected shells made a fabulous whistling, whizzing sound as it left the chamber. We must have recorded 500 ejections looking for that perfect whizz; you hear it especially in the beginning of the movie when Buddy is surrounded in the old shack. I remember we decided to shoot some very drunk cowboys for one of the bar scenes. Of course, we decided to become method actors. By the end of the session it was a miracle that I could still remember how to turn on the recorder. Later, when the mixers and I were predubbing the track, the assistant film editor, Mia Goldman, said that the track could never be used because it was too outrageous. With some mixing it's nicely embedded with the other tracks and still exists in the movie today.

One of the things that really helped was getting permission to record SFX on-set. One example of this was the stampede sounds. There is very little recorded sound of cattle stampedes, and in the case of Silverado I found out why. In order to get the cattle to stampede, the wranglers were shooting off their guns, and there were explosions and a lot of yelling. It was impossible to get the sound of the hooves, and I was looking at cobbling together SFX of horses to simulate a stampede. An aside: cattle are very light on their feet and really don't make a huge bass thump. I was talking to a cowboy and he said that, even though they were driving the cattle, this was nothing like a real stampede. In a real stampede the cattle move like a school of fish, and instantaneously.

In any case, I happened to have the recorder on as the cowboys drove the cattle walking back through town to set up for the next take. All of a sudden the cattle went into a real stampede, and I was rolling. That was very exciting, to say the least. To prepare the tracks I needed to add some of that bigger, larger-than-life sound, and the Foley was very excellent for adding that over-the-top thunderous excitement. John Roesch was the Foley artist; at that time the studios chose not to credit Foley artists.

One night I decided to go out and record the wind in the rocks in Santa Fe. These are still among my favorite wind tracks. I record the sound of the wind every chance I get. Every scene I cut with wind in it has as many tracks of wind as I feel fit. Many times I'll have at least four or five winds all doing what they do to strengthen the action on the film.

I've often heard from other sound editors that the guns in Silverado were very excellent. We had a hell of a problem getting those guns into the film without overloading them. Kevin O'Connell worked a long time on those guns, and it shows. Remember that this was before the subchannel and 5.1 as well.

Back in the cutting room my thrust is to immerse the viewer in the world of the film. In the case of a western, I feel that it is extremely important to make sure that every sound is appropriate to the scene you're watching. Sound is subliminal: it enters your ears without your full awareness, but it completes and enhances the film that you're looking at. In a film that is dealing with a real environment it's important for me to do everything I can to immerse the viewer in the experience.
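Grieve's VariSpeed tailoring of the horse hooves, described above, reduces to simple arithmetic: a playback-speed ratio maps the recorded hoofbeat interval onto the stride interval visible on screen, at the cost of a proportional pitch shift. A minimal sketch; the function names and the example numbers are illustrative, not taken from the actual session:

```python
import math

def varispeed_ratio(recorded_beat_s: float, picture_beat_s: float) -> float:
    """Playback-speed ratio that stretches or squeezes recorded
    hoofbeats to land on the stride interval seen on screen."""
    return recorded_beat_s / picture_beat_s

def pitch_shift_semitones(ratio: float) -> float:
    """Analog varispeed shifts pitch along with speed."""
    return 12 * math.log2(ratio)

# Hooves recorded at 0.50 s per beat; on screen the horse strides every 0.45 s.
r = varispeed_ratio(0.50, 0.45)
print(round(r, 3))                         # 1.111 -> play about 11% fast
print(round(pitch_shift_semitones(r), 2))  # 1.82 semitones sharper
```

Past a semitone or two the pitch change starts to color the hooves audibly, which is presumably why the closest-sync take was chosen first and the fine sync was finished by cutting.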

Wyatt Earp

I think for me Wyatt Earp was a much less stressful time because I'd lived it once through Silverado. It was just going to be better. There was some stress that came from the producer Charlie Okun, bless his heart. The movie was 3 1/2 hours long. I would tell him when he complained about the cost of post that it was like doing two movies. We had friendly fights about this on several occasions. This was a concern, though, and the work was daunting at times. Thanks to (editor) Carol Littleton and (director) Lawrence Kasdan (Silverado), the schedule was very realistic and cognizant of the time it would take to finish the film.

Again I got to go out to the set, and that practice is invaluable. They again imported all the old west stuff you could imagine, all vintage. They brought in a real blacksmith. I recorded a real six-horse stagecoach from every conceivable position. But one of the more useful things I did was organize and record the extras. The set had some fully functional buildings and I recorded the group ADR in these buildings. The bar, for example, was a real gift. I was able to record the bar sparse, medium, and busy from every conceivable angle. Upstairs, behind closed doors, etc., and these tracks fit perfectly into the movie without a lot of mixing. I used the extras to walk on the raised boardwalks, real cowboy boots on real wooden boardwalks. I recorded the horses in town so I had the natural reverb of the old wooden buildings.

I recorded the guns at different perspectives throughout town so when the bad guys roll into town shooting their guns off, the reverb and position of the guns are perfect to what the viewer is looking at. This adds a degree of realism that is hard to quantify but is very important for the immersion of the viewer. The extras, by the way, were very cooperative and would do anything I asked of them. You see, extras spend most of their time on-set standing around and they were thankful for anything to do. Another advantage of going to the set is going to dailies. This is a great way to bond early with your director, find out what's shooting the next day, organize your recording shoot, and now in today's world you could conceivably shoot SFX in the day and put them in dailies to enhance the daily viewing experience.

THE DIFFERENCE BETWEEN DESIGN AND MUD

David Stone

Supervising sound editor David Stone, winner of the Academy Award for Best Sound Effects Editing in 1992’s Bram Stoker's Dracula, is a quiet and reserved audio artist. As sound design field commanders go, he is not flamboyant, nor does he use showmanship as a crutch. What he does is quietly and systematically deliver the magic of his craft: his firm grasp of storytelling through sound. Though he loves to work on animated films—having helmed such projects as Beauty and the Beast, A Goofy Movie, Cats Don't Dance, Pooh's Grand Adventure: The Search for Christopher Robin, The Lion King II, as well as Ocean's Twelve—one should never try to typecast him. Stone has time and again shown his various creative facets and the ability to mix his palette of sounds in different ways to suit the projects at hand. He leaves the slick illusionary double-talk to others. For himself, he stands back and studies the situation for a while. He makes no pretense that he is an expert in all the fancy audio software or that he was up half the night trying to learn how to perform a new DSP function. Stone offered the following insight:

We're all becoming isolated by using so many computerized devices. Music and movies that used to play to an audience are now playing to an individual. This makes us think we can rely on tech as a magic genie, to make a good soundtrack for us. But what a great piece of film really needs is the work of a social group. Our best student filmmakers are the ones with more highly developed social skills, because film is the great collaborative art.

Figure 12.2 David Stone, supervising sound editor and Academy Award winner for Best Sound Effects Editing on Bram Stoker's Dracula (1992).

There was a time when Hollywood soundtracks kept getting better because freelance sound people were circulating and integrating into different groups for each film. So skills and strategies were always growing in a fertile environment. It's the social power that makes great films extraordinary... the same thing that audiences enjoy in films that have more ensemble acting, where the art is more than the sum of its parts.

I think if film students could study the history of the soundtracks they admire, they'd see this pattern of expert collaboration, and that's quite the opposite of what happens when too few people rely on too much tech to make movie sound.

Being a computer vidiot is not what creates good sound design—understanding what sounds go together and when to use them does.

Many times I have watched young sound editors simply try to put together two or more sound effects to achieve a larger audio event, often without satisfaction. A common mistake is to simply lay one pistol shot on top of another. They do not necessarily get a bigger pistol shot; in fact, they often only diminish the clarity and character of the weapon because they are putting together two sounds that have too many common frequency dynamics to complement one another—instead the result is what we call mud.
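The arithmetic behind that mistake is easy to demonstrate. In this sketch (the noise burst and the 60 Hz thump are stand-ins for gunshot elements, not real recordings), stacking a sound on itself buys only about 6 dB and no new character, while a spectrally complementary element adds body the first sound lacked at almost no cost in overall level:

```python
import numpy as np

rng = np.random.default_rng(0)
sr = 48_000
n = sr // 10                      # 100 ms burst
t = np.arange(n) / sr

mid = rng.standard_normal(n)                        # broadband "crack"
low = np.sin(2 * np.pi * 60 * t) * np.exp(-30 * t)  # decaying 60 Hz "body"

def rms_db(x):
    """RMS level in decibels."""
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

# Identical layers: +6 dB, same spectrum -- louder, not bigger.
print(round(rms_db(mid + mid) - rms_db(mid), 1))  # 6.0

# Complementary layer: barely louder overall, but new low-end content.
print(round(rms_db(mid + low) - rms_db(mid), 1))
```

In practice the offending case is worse than exact copies: two near-identical gunshots laid a few milliseconds apart comb-filter each other, which is where the smeared, muddy character comes from.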

A good example of this is the Oliver Stone film Any Given Sunday. The sound designer did not pay much attention to the frequency characteristics of his audio elements, nor, for that matter, to overmodulation. As I listened to the presentation at the Academy theatre (just so you will not think that a lesser theatre with a substandard speaker system was the performance venue), I was completely taken out of the film itself by the badly designed football effects. The clarity and “roundness” of the sound were replaced by overdriven effects with an emphasis in the upper midrange that cannot be described as anything other than distortion.

A newly initiated audio jockey may naïvely refer to such sound as befitting the type of film—the “war” of sports needing a “war” of sound. Frankly, such an answer is horse dung. Listen to the sound design of Road to Perdition or Saving Private Ryan. These are also sounds of violence and war in the extreme and yet they are designed with full roundness and clarity, with each audio event having a very tactile linkage with its visual counterpart.

I have harped on this throughout this book and will continue to do so, because it is the keystone of the sound design philosophy: if you take the audience out of the film because of bad sound design, then you have failed as a sound designer. It's just that simple.

I hope you will be patient with me for a few pages as I introduce you (if you do not already know these sound artists) to several individuals whom you may want to use as role models as you develop your own style of work and professional excellence. As you read about them, you will see a common thread, a common factor in all of them. They do not just grind out their projects like sausage through a meat grinder. They structure their projects as realistically as they can within budgetary and schedule constraints. They know how to shift gears and cut corners when and if they have to without sacrificing the ultimate quality of the soundtrack entrusted to them. They custom record, they try new things, and they know how to put together an efficient team to accomplish the mission.

Gary Rydstrom

Sound designer/sound rerecording mixer Gary Rydstrom has developed a stellar career, with credits such as Peter Pan, Minority Report, Backdraft, Terminator 2: Judgment Day, Jurassic Park and The Lost World: Jurassic Park, Strange Days, The Haunting, Artificial Intelligence, and the stunning sound design and mixing efforts of Saving Private Ryan.

Figure 12.3 Supervising sound editor/sound designer Richard King. On the left, Richard and his team are custom recording canvas sail movement for Master and Commander: The Far Side of the World, which earned Richard an Academy Award for Best Sound Editing (2004).

He was nominated for Academy awards for his work on Finding Nemo, Minority Report, Monsters Inc., Star Wars: Episode I—The Phantom Menace, and Backdraft and won Oscars for Saving Private Ryan, Titanic, Jurassic Park, and Terminator 2: Judgment Day.

Richard King

Supervising sound editor/sound designer Richard King is no stranger to challenging and complex sound design. He has won Academy Awards for Best Sound Editing for Inception, The Dark Knight, War of the Worlds, and Master and Commander: The Far Side of the World. Richard has also helmed such diverse feature films as Rob Roy, Gattaca, The Jackal, Unbreakable, Signs, Lemony Snicket's A Series of Unfortunate Events, Steven Spielberg's epic version of War of the Worlds, and Firewall, to name just a few. King says:

One of the requirements of the job is to place myself within the world of the picture, and think about sound as dictated by the style of the film. This can range from “Hollywood” (where you really don't hear any sounds not specifically relating to the scene at hand—I try not to work on these—boring!), to hyper-real, where, as in life, one is set up to accept the odd, off-the-wall sound, though, of course, carefully selected to support/counterpoint or in some way support and contribute to the moment.

Jon Johnson

Supervising sound editor Jon Johnson first learned his craft on numerous television projects in the early 1980s and then designed and audio processed sound effects for Stargate and later for Independence Day. Taking the helm as supervising sound editor, Jon has successfully overseen the sound design and sound editorial on projects such as The Patriot, A Knight's Tale, Joy Ride, The Rookie, The Alamo, The Great Raid, and Amazing Grace, to name a few. In 2000, Jon won the Academy Award for Best Sound Editing for the World War II submarine action-thriller U-571.

Figure 12.4 Supervising sound editor/sound designer Jon Johnson, Academy Award winner for Best Sound Editing in 2000 for the WWII sea action-thriller U-571.

Figure 12.5 John Larsen (left), reviewing material with assistant sound editor Smokey Cloud (right).

John A. Larsen

Supervising sound editor John Larsen is one of the few sound craftspeople-artists who is also a smart and successful businessman. His sound-editorial department on the 20th Century Fox lot continually takes on and successfully completes extremely difficult and challenging projects through the postproduction process. One of his first important sound editing jobs came when he worked on the 1980 movie Popeye, starring Robin Williams. He continued to practice his craft on Personal Best, Table for Five, Brainstorm, and Meatballs Part II. In 1984 he began serving as the supervising sound editor for the television series Miami Vice, where he really learned the often hard lessons of problem solving compressed schedules and tight crews.

After sound editing on 1985’s Fandango and Back to the Future, John continued to supervise feature films. Since then he has handled a wide breadth of style and demands from pictures such as Lord of Illusions, Down Periscope, Chain Reaction, and Jingle All the Way to high-design-demand projects such as Alien: Resurrection; The X Files feature film; The Siege; the 2001 version of Planet of the Apes; I, Robot; Daredevil; Elektra; Fantastic Four; and all four X-Men features. He has literally become the sound voice of Marvel Comics for 20th Century Fox.

John understands exactly how to tailor difficult and often highly politically “hot-to-handle” projects through the ever-changing, never-locked picture realm of high-budget, fast schedules with high-expectation demands. He says:

How can anyone who loves sound not be excited about the potential creation of unknown and otherworldly characters? It is a dream come true! We should all strive to be so lucky.

As a sound supervisor you must be prepared, you must be a step ahead, you must know your crews’ strengths as well as their weaknesses. With talent comes character but out of these character traits comes extreme and fantastic talent. People with real skills sometimes come with the baggage you must learn how to recognize and deal with. None of us are perfect, we are human beings, not G5s!

Figure 12.6 Richard Anderson, sitting in front of the giant server that feeds all of the workstations with instant access to one of the most powerful sound libraries in the business.

Richard Anderson

Supervising sound editor/sound designer Richard Anderson actually started his career as a picture editor, but in the late 1970s he became the supervising sound editor at Gomillion Sound, overseeing numerous projects of all sorts and helming several Roger Corman classics, such as Ron Howard's first directorial effort, Grand Theft Auto. A couple of years later, Richard won the Academy Award for Best Sound Effects Editing, shared with Ben Burtt, as co-supervising sound editor on Raiders of the Lost Ark.

Richard teamed up with Stephen Hunter Flick as they worked together on projects such as Star Trek: The Motion Picture, Final Countdown, Oliver Stone's The Hand, Poltergeist, 48 Hours, Under Fire, and many others.

Richard Anderson is one of the best role models I can think of for those who aspire to enter the field of entertainment sound; his personal work habits, ethics, and superb stage etiquette and leadership are extraordinary. This is especially evident on projects he supervised such as 2010, Predator, Beetle Juice, Edward Scissorhands, The Lion King, Being John Malkovich, Dante's Peak, and more recently Madagascar, Shark Tale, Shrek the Third, Imagine That, Astro Boy, and Over the Hedge.

Perhaps one of Richard's more satisfying efforts was in the field of audio restoration for the director's cut of David Lean's Lawrence of Arabia. Audio restoration has become a very popular process in recent years, whether it involves resurrecting and expanding a film into the version the director originally intended, or cleaning up an existing historic soundtrack so that it sounds the way it did when first mixed: removing optical crackle, hiss, transfer ticks, and anomalies that include, but are not limited to, the nasty hiss that builds up when sloppy copies are made from copies of copies rather than going back to the original source.

Stephen Hunter Flick

Supervising sound editor Stephen Hunter Flick started his career in the late 1970s as a veteran of Roger Corman's Deathsport, which happened to be my first job as a supervising sound editor and where I first met Stephen. He soon teamed up with Richard Anderson and helped tackle numerous robustly sound-designed projects, serving as supervising sound editor on no less than Pulp Fiction, Apollo 13, Twister, Predator 2, Starship Troopers, Spider-Man, Terminator 3: Rise of the Machines, Zathura: A Space Adventure, and many others. In addition, Stephen has won two Academy Awards for Best Sound Effects Editing for two action-driven features, RoboCop and Speed, and was nominated for his work on Die Hard, Total Recall, and Poltergeist.

Figure 12.7 Stephen Hunter Flick, taking a break from organizing a project in his cutting rooms on the Sony lot in Culver City, California.

Having worked with Stephen from time to time on numerous projects, I would also encourage the reader to learn from his leadership techniques. Many supervising sound editors, when faced with conflict or the giant “finger-pointing-of-fault” that often arises during the postproduction process, will sacrifice their editors to save their own skin, as they are first and foremost afraid of losing a client or, worse, having a picture pulled from their hands. Stephen has, time and time again, demonstrated the character to protect his talent. He is the supervising sound editor, which means he is responsible. If his team members make mistakes or have problems, he deals with the problem within “the family,” drawing on a basic and mature wisdom that comes from years of veteran know-how. Rather than hunting for the guilty or throwing a scapegoat to the clients, he simply fixes the problem. It has earned him great respect and loyalty from hundreds of craftspeople who have worked with him and for him over the past 30 or so (let's not admit exactly) years.

BIG GUNS

I was in between pictures at my own editorial shop when Stephen asked if I would come over to Weddington to help cut sound effects on Predator 2 for him. Several days later, John Dunn, another sound effects editor, asked if I could bring some of my weapon effects to the opening shootout in reel 1. I brought a compilation DAT the following day with several gun effects I thought would contribute to the effort. Just after lunch, Flick burst into my room (as is his habit, earning him the affectionate nickname of “Tsunami”) and demanded to know, “How come your guns are bigger than mine?!”

“I never said my guns are bigger than yours, Steve.”

Steve shrugged. “I mean, I know I have big guns, but yours are dangerous!”

With that, he whirled and disappeared down the hall just as suddenly as he had appeared. I sat in stunned aftermath, pondering his use of the word dangerous. After due consideration, I agreed. The way I set up my microphones when I record weapon fire, and the combinations of elements I use when I editorially manufacture it, create audio events that are supposed to scare and frighten; they have a bite to them.

Gunfire is an in-your-face crack! Big guns are not made by pouring tons of low-end frequency into the audio event. Low end does not have any punch or bite. Low end is fun, when used appropriately, but the upper midrange and high-end bite with a low-end underbed will bring the weapon to life. Dangerous sounds are not heard at arm's length where it is safe but in your face. They are almost close enough to reach out and touch you. Distance—what we call “proximity”—triggers your subconscious emotional responses to determine the danger.

I liken blending multiple sound cues together into a new design to being a choir director. Designing a sound event is exactly like mixing and moving the various voices of a choir to create a new vocal event with richness and body. I think of each sound as being soprano, alto, tenor, or bass. Putting basses together will not deliver a cutting punch and a gutsy depth of low end. All I will get is muddy low end with no cutting punch!

The Pro Tools session shown in Figure 12.8 is a rifle combination I put together for Legionnaire, a 1998 action-adventure film set circa 1925. We had several thousand rifle shots in the battle scenes; for each, I painstakingly cut four stereo pairs of sounds that together made up the basic signature of a single 7.62 Mauser rifle shot. The reason is simple: it is impossible to capture, in a single recording, a gunshot with all the spectrum characteristics the supervising sound editor envisions. Choice of microphone, placement of microphone, selection of recording medium: all these factors determine the voice of the rifle report.

Figure 12.8 The “Mauser rifle” Pro Tools session.

Listen to the individual cues and the combination mix of the “designed” 7.62 Mauser rifle shot on the DVD provided with this book.

As an experienced sound designer, I review the recordings at my disposal and choose the cues that comprise the soprano, alto, tenor, and bass of the performance. I may slow down or speed up one or more of the cues. I may pitch-shift or alter the equalization as I deem necessary. I then put them together, very carefully lining up the leading edge of each discharge as accurately as I can (accurate to well within 1/100 of a second). Then I play them together as one new voice. I may need to lower the tenor and/or raise the alto, or I may discover that the soprano is not working and that I must choose a new soprano element.
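For readers who work in code as well as in a workstation, the align-and-balance process just described can be sketched in a few lines of NumPy. This is only an illustrative sketch, not production practice: the threshold-based onset finder, the mono treatment, and the gain values are all simplifying assumptions of mine, not anything dictated by the original session.

```python
import numpy as np

def onset(x, thresh=0.1):
    """Crude leading-edge finder: index of the first sample whose
    magnitude crosses the threshold (a stand-in for eyeballing the
    transient of a gunshot in the waveform display)."""
    return int(np.argmax(np.abs(x) >= thresh))

def layer_voices(voices, gains):
    """Line up each cue's leading edge and sum them into one composite
    'shot'; the soprano/alto/tenor/bass weighting lives in 'gains'."""
    onsets = [onset(v) for v in voices]
    lead = max(onsets)
    length = max(lead - o + len(v) for v, o in zip(voices, onsets))
    mix = np.zeros(length)
    for v, o, g in zip(voices, onsets, gains):
        start = lead - o                     # shift so all onsets coincide
        mix[start:start + len(v)] += g * v
    peak = np.max(np.abs(mix))
    return mix / peak if peak > 0 else mix   # normalize to avoid clipping
```

Rebalancing the “choir” is then just a matter of changing the gain list and listening again.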

If possible, I will not commit the mix of these elements at the sound editorial stage. After consulting with the head rerecording mixer and considering the predubbing schedule, I will decide whether we can afford to cut the elements abreast and leave the actual mixing to the sound effect mixer during predubbing. If the budget of the postproduction sound job is too small and the sound effect predubbing too short (or nonexistent), I will mix the elements together during the sound editorial phase, thereby throwing the die of commitment on how the individual elements are balanced together in a new single stereo pair.

The same goes for rifle bolt actions. I may have the exact recordings of the weapon used in the action sequence in the movie. When I cut them to sync, I play them against the rest of the tracks. The bolt actions played by themselves sound great, but now they are mixed with the multitude of other sounds present. They just do not have the same punch or characterization as when they stood alone. As a sound designer, I must pull other material, creating the illusion and perception of historical audio accuracy, but I must do it by factoring in the involvement of the other sounds of the battle. Their own character and frequency signatures will have a profound influence on the sound I am trying to create.

This issue was demonstrated on Escape from New York. During sound effect predubs, the picture editor nearly dumped the helicopter skid impacts I had made for the Hueys' descent and landing in the streets of New York. Played by themselves, they sounded extremely loud and sharp. The veteran sound effect mixer, Gregg Landaker, advised the picture editor that it was not necessary or wise to dump the effects at that time, but to let them stay in a separate set of channels that could easily be dumped later if not wanted.

Later, as we rehearsed the sequence for the final mix, it became apparent that Gregg's advice not to make a hasty decision paid off. Now that the heavy pounding of the Huey helicopter blades filled the theater along with the haunting siren-like score, what had sounded like huge and ungainly metal impacts in the individual predub now became a gentle, even subtle, helicopter skid touchdown on asphalt.

WHAT MAKES EXPLOSIONS BIG

For years following Christine, people constantly asked about the magic I used to bring such huge explosions to the screen. Remember, Christine was mixed before digital technology, when we still worked to an 85 dB maximum. The simple answer should carry your design thinking into areas well beyond explosions. I see sound editors laying on all kinds of low end or adding shotgun blasts into the explosion combination, and certainly a nice low-end wallop is cool and necessary, but it isn't dangerous. All it does is muddy it up. What makes an explosion big and dangerous is not the boom but the debris that is thrown around.

I first realized this while watching footage of military C4 explosives igniting. Smokeless and seemingly nothing as a visual entity unto themselves, they wreak havoc, tearing apart trees, vehicles, and masonry; it's the debris that makes C4 so visually awesome. The same is true with sound. Go back and listen to the sequence in reel 7 of Christine again, the sequence where the gas station blows up. Listen to the glass debris flying out the window, the variations of metal, oil cans, crowbars, tires, and tools that come flying out of the service bay. Listen to the metal sidings of the gas pumps flying up and impacting the light overhang atop the fueling areas. Debris is the key to danger. Prepare tracks so that the rerecording mixer can pan debris cues into the surround channels, bringing the audience into the action, rather than allowing it to watch the scene from a safe distance.

THE SATISFACTION OF SUBTLETY

Sound design often calls to mind the big, high-profile audio events that serve as landmarks in a film. For every stand-out moment, however, there are dozens of equally important moments that place sound design in the figurative chorus line. These are not solo events but supportive, transparent performances that enhance the storytelling continuity of the film.

Undoubtedly, my reputation is one of action sound effects. I freely admit that I enjoy designing the hardware and firepower and wrath of nature, yet some of my most satisfying creations have been the little things that are hardly noticed: the special gust of wind through the hair of the hero in the night desert, the special seat compression with a taste of spring action as a passenger swings into a car seat and settles in, the delicacy of a slow door latch as a child timidly enters the master bedroom—little golden touches that fortify and sweetly satisfy the idea of design.

REALITY VERSUS ENTERTAINMENT

One of the first requirements for achieving a successful soundtrack is becoming audio-educated about the world. I know that advice sounds naïve and obvious, but it is not. Listen to and observe life around you. Listen to the components of sound and come to understand how things work. Learn the difference between the Rolls Royce Merlin engine of a P-51 and the Pratt & Whitney of an AT-6, the difference between a hammer being cocked on a .38 service revolver and a hammer being cocked on a .357 Smith & Wesson. What precise audio movements separate the actions of a 100-ton metal press? What is the audio difference between a grass fire and an oil fire, between pine burning and oak? What is the difference between a rope swish and a wire or dowel swish? How can one distinguish blade impacts of a fencing foil from those of a cutlass or a saber, and what kind of metallic ring-off would go with each?

A supervising sound editor I knew was thrown off a picture because he did not know what a Ford Cobra was and insisted that his effect editors cut sound cues from an English sports car series. It is not necessary to be a walking encyclopedia that can regurgitate information about the mating call of the Loggerhead Shrike or the rate of fire of a BAR (that is, if you even know what a BAR1 is). What is important is that you diligently do research so that you can walk onto the rerecording stage with the proper material.

I remember predubbing a helicopter warming up. Suddenly the engine wind-up bumped hard with a burst from the turbine. The sound effects mixer quickly tried to duck it out, thinking it was a bad sound cut on my part. The director immediately corrected him: that was absolutely the right sound at exactly the right spot. The mixer asked how I knew where to cut the turbine burst. I told him that, in studying the shot frame by frame, I had noticed two frames of the exhaust that were a shade lighter than the rest. It seemed to me that that was where the turbine had kicked in.

Reality, however, is not necessarily the focus of sound design. There is reality, and there is the perception of reality. We were rehearsing for the final mix of reel 3 of Escape from New York, where the Huey helicopters descend and land in an attempt to find the president. I had been working very long hours and was exhausted. After I had dozed off and fallen out of my chair several nights earlier, Don Rogers had supplied a roll-around couch for me to sleep on during the mixing process. The picture editor paced behind the mixers as they rehearsed the reel. He raised his hand for them to stop and announced that he was missing a “descending” sound.

Gregg Landaker and Bill Varney tried to determine to what element of sound the picture editor was referring. The exact components of the helicopters were all there. From an authenticity point of view, nothing was missing.

I rolled over and raised my hand. “Roll back to 80 feet, take effects 14 off the line. Take the feed and the take-up reels off the machine and switch them; then put on a 3-track head stack. Leaving effects 14 off the line, roll back to this shot, then place effects 14 on the line. I think you will get the desired effect.”

Everybody turned to look at me with disbelief, certain I was simply talking in my sleep. Bill Varney pressed the talkback button on the mixing console so that the recordist in the machine room could hear. “Would you please repeat that, Mr. Yewdall?”

I repeated the instructions. The picture editor had heard enough. “What is that supposed to accomplish?”

I explained. “At 80 feet is where Lee Van Cleef announces he is ‘going in.’ The next shot is the fleet of Hueys warming up and taking off. If you check the cue sheets, I think that you will see that effects 14 has a wind-up from a cold start. Played forward, it is a jet whine ascending with no blade rotation. If we play that track over the third channel position of a 3-channel head stack, which means the track is really playing backward while we are rolling forward, I think we will achieve a jet whine descending.” (Listen to the sound cue of “Huey helicopter descending” on the DVD provided with this book.)

From then on, more movies used helicopter jet whine warm-ups (prior to blade rotation) both forward and reversed to sell the action of helicopters rising or descending, as the visual action dictated. It is not reality, but it is the entertaining perception of reality.
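In today's workstations, the reversed head-stack trick amounts to nothing more than playing the cue's samples tail-to-head. A minimal sketch follows; the synthetic chirp standing in for a turbine wind-up is an invented placeholder, not the actual Huey recording.

```python
import numpy as np

def reverse_cue(x):
    """Play a cue backward: an ascending turbine wind-up becomes a
    descending wind-down, just as the reversed 3-track playback did."""
    return x[::-1].copy()

# A rising "whine" sketched as a chirp whose frequency climbs over time.
sr = 8000
t = np.arange(sr) / sr
ascending = np.sin(2 * np.pi * (200.0 + 400.0 * t) * t)
descending = reverse_cue(ascending)
```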

In today's postproduction evolution, the tasks of equalization and signal processing, once considered sacred ground for the rerecording mixers, have increasingly become the working domain of sound designers and supervising sound editors. Accepting those chores, however, also brings about certain responsibilities and ramifications should your work be inappropriate and cost the client additional budget dollars to unravel what you have done.

If you twist those signal-processing knobs, then know the accompanying burden of responsibility. Experienced supervising sound editors have years of combat on the rerecording mix stage behind them and know what they and their team members should do and what should be left for the rerecording mixers. I liken my work to getting the rough diamond into shape. If I overpolish the material, I risk trapping the mixer with an audio cue that he or she cannot manipulate and appropriately use. Do not polish sound to a point where the mixer has no maneuvering room; you will thwart a collaborative relationship with the mixer.

The best sound design is done with a sable brush, not with a 10-pound sledgehammer. If your sound design distracts the audience from the story, you have failed. If your sound design works in concert with and elevates the action to a new level, you have succeeded. It is just that simple.

Sound design does not mean that you must have a workstation stuffed full of fancy gear and complex software. One of my favorite signal-processing devices is the simple Klark-Teknik DN 30/30, a 30-band graphic equalizer (see Figure 12.9). Taking a common but carefully chosen steady wood-flame recording, I worked the equalizer in real time, undulating the individual sliders wildly in waves to create the “mushrooming” fireball sensation.

Listen to the audio cue on the DVD provided with this book to hear the before and after of “mushrooming flames.”
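For those working entirely in software, the spirit of riding the sliders can be approximated by applying slowly undulating band gains frame by frame over a short-time Fourier transform. This is a rough sketch under assumed frame sizes and arbitrary sine-wave gain curves of my own invention; a designer would perform the gain moves by hand, and this is in no way a model of the Klark-Teknik hardware.

```python
import numpy as np

def undulating_eq(x, n_bands=30, frame=1024, hop=512):
    """Multiply each of n_bands frequency bands by a gain that drifts
    over time, roughly like wiggling the sliders of a 30-band graphic
    EQ while the cue plays. Overlap-add resynthesis with Hann windows."""
    win = np.hanning(frame)
    out = np.zeros(len(x) + frame)
    edges = np.linspace(0, frame // 2 + 1, n_bands + 1).astype(int)
    for i, start in enumerate(range(0, len(x) - frame, hop)):
        spec = np.fft.rfft(x[start:start + frame] * win)
        for b in range(n_bands):
            db = 6.0 * np.sin(0.05 * i + 0.7 * b)  # arbitrary "wave" of slider moves
            spec[edges[b]:edges[b + 1]] *= 10.0 ** (db / 20.0)
        out[start:start + frame] += np.fft.irfft(spec, frame) * win
    return out[:len(x)]
```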

“WORLDIZING” SOUND

Charles Maynes wrote an excellent article on the subject of “worldizing” sound. With permission from Charles Maynes and Editors Guild Magazine, where it appeared in April 2004, we share it with you herewith.

Figure 12.9 The Klark-Teknik DN 30/30 graphic equalizer. If you turn the volume gain full to the right (a +6 dB level) and flip the +6/+3 dB switch up into the +6 position, you get a total of 12 dB of gain or cut. Note how the settings of the frequency faders affect the response curve.

Figure 12.10 Rerecording prerecorded sounds under freeway overpasses in the “concrete jungle” of the city. Photo by Alec Boehm.

“WORLDIZING: Take Studio Recordings into the Field to Make Them Sound Organic”

By: Charles Maynes

For some of us in sound and music circles, “worldizing” has long held a special sense of the exotic. Worldizing is the act of playing back a recording in a real-world environment, allowing the sound to react to that environment, and then rerecording it so that the properties of the environment become part of the newly recorded material. The concept is simple, but its execution can be remarkably complex.

In Walter Murch's superb essay on the reconstruction of the Orson Welles film Touch of Evil, he quotes from a 58-page memo that Welles wrote to Universal to lay out his vision for the movie. At one point, Welles describes how he wants to treat the music during a scene between Janet Leigh and Akim Tamiroff, and he offers as elegant a description of worldizing as I can think of:

“The music itself should be skillfully played but it will not be enough, in doing the final sound mixing, to run this track through an echo chamber with a certain amount of filter. To get the effect we're looking for, it is absolutely vital that this music be played back through a cheap horn in the alley outside the sound building. After this is recorded, it can then be loused up even further in the process of rerecording. But a tinny exterior horn is absolutely necessary, and since it does not represent very much in the way of money, I feel justified in insisting upon this, as the result will be really worth it.”

At the time, Universal did not revise Touch of Evil according to these notes, but the movie's recent reconstruction incorporates these ideas. Worldizing has now been with us for some time and will likely be used and refined for years to come.

Walter Murch and Worldizing

The practice of worldizing—and, I believe, the term itself—started with Walter Murch, who has used the technique masterfully in many films. However, it has received most of its notoriety from his use of it in American Graffiti and in the granddaddy of the modern war film, Apocalypse Now.

In American Graffiti, recordings of the Wolfman Jack radio show were played through practical car radios and rerecorded with both stationary and moving microphones to recreate the ever-changing quality of the multiple moving speaker sources the cars were providing. On the dub stage, certain channels were mechanically delayed to simulate echoes of the sound bouncing off the buildings. All of these channels, in addition to a dry track of the source, were manipulated in the mix to provide the compelling street-cruising ambience of the film.

In Apocalypse Now, the most obvious use of this technique was on the helicopter communications ADR, which was rerecorded through actual military radios in a special isolation box. The groundbreaking result has been copied on many occasions.

Sound Effects

In the previous examples, worldizing was used for dialog or music, but it has also been used very effectively for sound effects. One of my personal favorite applications of the technique was on the film Gattaca, which required dramatically convincing electric car sounds. Supervising sound editor Richard King and his crew devised a novel method to realize these sounds by installing a speaker system on the roof of a car, so that they could play back various sounds created for the vehicles.

According to King, the sounds were made up of recordings that ranged from mechanical sources such as surgical saws and electric motors to propane blasts and animal and human screams. In the studio, King created pads of these sounds, which were then used for playback.

Richard and Patricio Libenson recorded a variety of vehicle maneuvers, with the prepared sounds being played through the car-mounted speakers and rerecorded by microphones. They recorded drive-bys, turns, and other moves to give the sounds a natural acoustic perspective. As King points out, one of the most attractive aspects of worldizing is the way built-in sonic anomalies happen as sound travels through the atmosphere.

Figure 12.11 Charles Maynes found that a squawk box in a trim bin creates a distinctive sound for the doomsday machine in Mystery Men. Photo by Steven Cohen.

King also used worldizing to create a believable sound for a literal rain of frogs in the film Magnolia. To simulate the sound of frogs’ bodies falling from the sky, King and recordist Eric Potter began by taking pieces of chicken and ham into an abandoned house and recording their impacts on a variety of surfaces, including windows, walls, and roofs. Using this source material, King created a continuous bed of impacts that could then be played back and rerecorded. For the recording environment, King chose a canyon, where he and Potter set up some distant mics to provide a somewhat “softened” focus to the source material. A loudspeaker system projected the recordings across the canyon to impart acoustic movement.

In addition to this, King and Potter moved the speakers during the recordings to make the signal go on and off axis. This provided an additional degree of acoustic variation. King and Potter created other tracks by mounting the speakers on a truck and driving around, which provided believable Doppler effects for the shifting perspectives of the sequence.

Another interesting application was Gary Rydstrom's treatment of the ocean sounds during the D-Day landing sequence in Saving Private Ryan, where he used tubes and other acoustic devices to treat the waves during the disorienting Omaha Beach sequence.

Mystery Men

I [Charles Maynes] used worldizing in the film Mystery Men, a superhero comedy that required a distinctive sound for a doomsday device called the “psycho-defraculator.” The device could rend time and space in particularly unpleasant ways, yet it was homemade and had to have a somewhat rickety character. I was after a sound like the famous “inertia starter” (the Tasmanian devil's sound) from Warner Bros. cartoons, but I also wanted to give the sense that the machine was always on the verge of self-destruction.

We started by generating a number of synthesized tones, which conveyed the machine's ever-accelerating character. After finding a satisfying basic sound, we needed to find a suitable way to give the impression of impending collapse. By exhaustively trolling through the sound library, I found various recordings of junky cars and broken machines, and I began to combine them with the synthetic tones. I spent a considerable amount of time varispeeding the elements, but was never really satisfied with the result.

Since this was in 1999, we still had a film bench nearby, with a sync block and squawk box. I remembered how gnarly that squawk box speaker sounded, and thought, worldize the synth, that's the answer! So I took the squawk box into my editing room, plugged the synthesizer into it and was quickly satisfied with the distortion it provided.

Then I realized that using the squawk box inside a trim bin might be even better. So in came the trim bin, which became home to the speaker. As I listened, I noticed that the bin was vibrating at certain pitches and immediately tried to find which were the best ones to work with. After some trial and error, I started to put various bits of metal and split reels into the bin and noticed that it started really making a racket when the volume was turned up. I had arrived at my sound design destination. The compelling thing about this rig was that as I changed the frequency of the synthesizer, different objects would vibrate and bang against one another, creating a symphony of destruction. The sound was simultaneously organic and synthetic and gave the feeling that the machine was about to vibrate itself to pieces, like a washing machine during the spin cycle, with all the fasteners holding it together removed.

In the Future

Traditionally, it has been difficult to impart the acoustic qualities of real-world locations to our sound recordings using signal processors and electronic tone shaping, but this may well be changing. A new wave of processors now appearing on the market use a digital process called “convolution” to precisely simulate natural reverb and ambience. Using an actual recording made in a particular space, they separate out the reverb and other acoustic attributes of the sound, then apply those to a new recording.

The source recordings are generally created with a sine wave sweep or an impulse, typically from a starter's pistol, which is fired in the space being sampled. Hardware devices incorporating this technology are available from both Yamaha and Sony; software convolution reverb for Apple-based digital audio workstations (including Digidesign Pro Tools, Steinberg Nuendo and Cubase, Apple Logic Audio, and Mark of the Unicorn Digital Performer) is available from the Dutch company Audio Ease.
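The convolution process described above can be sketched in a few lines of code. This is a minimal illustration of the underlying idea (assuming NumPy is available), not any particular product's implementation: a dry signal is convolved with a recorded impulse response of a space, and the result carries that space's reverberant character. The decaying-noise impulse response here is a synthetic stand-in for a real room recording.

```python
import numpy as np

def convolution_reverb(dry, impulse_response):
    """Convolve a dry signal with a room impulse response.

    The impulse response is what convolution processors capture
    from a real space (e.g., by recording a sine sweep or a
    starter's pistol shot fired in the room being sampled).
    """
    wet = np.convolve(dry, impulse_response)
    # Normalize so the convolved result cannot clip.
    peak = np.max(np.abs(wet))
    if peak > 0:
        wet = wet / peak
    return wet

# Toy example: a single click played through a decaying "room" tail.
sr = 8000                           # sample rate (Hz)
dry = np.zeros(sr // 10)            # 100 ms of silence...
dry[0] = 1.0                        # ...with one click at the start
t = np.arange(sr // 4) / sr         # 250 ms impulse-response tail
ir = np.exp(-8.0 * t) * np.random.default_rng(0).standard_normal(t.size)
wet = convolution_reverb(dry, ir)
print(len(wet))  # output length is len(dry) + len(ir) - 1
```

This also makes the limitation discussed below concrete: the impulse response is a single fixed recording, so every sound processed with it receives exactly the same reverb.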

While it seems as though this might be a “Rosetta Stone” that could be used for matching ADR to production dialog, there are still limitations to the technology. The main one is that the reverb impulse is being modeled on a single recorded moment in time, so the same reverb is applied to each sound being processed with the particular sample. However, the acoustic character of this process is significantly more natural than the digital reverbs we are accustomed to. This tool was used very effectively to process some of the ADR on The Lord of the Rings to make it sound as though it had been recorded inside a metal helmet.

One of our goals as sound designers is to imbue our recordings with the physical imperfection of the real world, so that they will fit seamlessly into the world of our film. We want the control that the studio offers us, but we aim to create a sound that feels as natural as a location recording. Worldizing is one way to do this, and so are the new digital tools that help to simulate it. But sound editors are infinitely inventive. The fun of our job is to combine natural and artificial sounds in new ways to create things that no one has ever heard before!

Charles Maynes is a sound designer and supervising sound editor. His credits include Spider-Man, Twister, and U-571. Special thanks to Richard King, John Morris, Walter Murch, and Gary Rydstrom for their patient help with this article.

WORLDIZING DAYLIGHT

I think one of the more interesting applications of worldizing was the work done by David Whittaker, MPSE, on the Sylvester Stallone action-adventure film Daylight. In the last part of the film, when the traffic tunnel started to fill up with water, the sound-designing chores beckoned for a radical solution. After the sound editing process and premixing were complete on the scenes to be treated “underwater,” all the channels of the premixes were transferred to a Tascam DA-88 with timecode. Whittaker describes:

We purchased a special underwater speaker, made by Electro-Voice. We took a portable mixing console, an amplifier, a water-proofable microphone, a microphone preamp, and two DA-88 machines—one to play back the edited material and the other one to rerecord the new material onto. We did the work in my backyard where I have a large swimming pool—it's 38 feet long by 18 feet wide.

We chose a Countryman lavalier production mic, because it tends to be amazingly resistant to moisture. To waterproof it we slipped it into an unlubricated condom, sealing the open end with thin stiff wire twisted tight around the lavalier's cable. After some experimentation we settled on submerging the speaker about three feet down in the corner of one end, with the mic dropped into the water about the same distance down at the opposite corner of the pool, thus giving us 42 feet of water between the speaker and the microphone. We found we needed as much water as possible between the speaker and the microphone to get a strong effect. We literally hung the mic off a fishing pole—it looked pretty hilarious.

We then played back and rerecorded each channel one at a time, using the timecode to keep the DA-88 machines in exact sync phase. The worldizing radically modified the original sounds. The low frequencies tended to disappear, and the high frequencies took on a very squirrely character, nothing like the way one hears in air.

The result became a predub pass unto itself; when we ran it at the rerecording mixing stage, we could play it against the “dry” original sound channels and control how “wet” and “murky” we wanted the end result.

Actually, I think it was very effective, and it certainly always made me smile when I would go swimming in my pool after that.

SOME ADVICE TO CONSIDER

I learned more than one valuable lesson on Escape from New York. As with magicians, never tell the client how you made the magic.

Escape from New York’s budget was strained to the limit, and the job was not done yet. The special effect shots had not been completed, and the budget could not bear the cost of having them made at a traditional feature special effect shop. They decided to contract the work to Roger Corman's company, as he had acquired the original computer-tracking special effect camera rig that George Lucas had used on Star Wars. Corman was making a slew of special effect movies to amortize the cost of acquiring the equipment, in addition to offering special effect work to outside productions, Escape from New York being one.

The first shots were delivered. John Carpenter (director), Debra Hill (producer), Todd Ramsay (picture editor), Dean Cundey (director of photography), Bob Kizer (special effects supervisor), and several members of the visual special effects team sat down in Projection A at Goldwyn to view the footage. It has been suggested that the quality expectation from the production team was not very high. After all, this was only Corman stuff—how good could it be?

The first shot flickered onto the screen: the point of view of Air Force One streaking over New York harbor at night, heading into the city just prior to impact. Carpenter, Hill, and Cundey were amazed by the high quality of the shot. After the lights came up, John asked the visual special effects team how it was done. The team members, proud and happy that Carpenter liked their work, blurted out, “Well, first we dumped black paint on the concrete floor, then we let it dry halfway. Then we took paint rollers and roughed it up, to give it the ‘wave’ effect of water at night. Then we made dozens of cardboard cut-outs of the buildings and cut out windows. . . .”

Carpenter's brow furrowed as he halted their explanation. He pressed the talkback button to projection. “Roll it again, please.”

They viewed the footage again, this time with the discerning eye of foreknowledge of the illusion's creation. Now they could see the imperfections, could see how it was done. The reel ran out, and the lights came up.

Carpenter thought a moment, and then turned to the visual special effects team. “No, I am going with what I saw the first time. You fooled me. If I ever ask you how you did something again, don't tell me.”

The lesson of that afternoon was burned indelibly into my brain, and I have referred to it time and time again in dealing with clients who want to know how something was made. Do not destroy the illusion and magic of your creation by revealing the secrets of how you did it, especially to your clients.

THE MAKING OF JOHN CARPENTER'S THE THING

When it was unleashed in theaters back in June 1982, John Carpenter's The Thing was viewed as a critical and commercial disaster. Ferociously derided by the press who were almost unanimous in calling the film “disgusting,” it was also ignored by audiences who flocked to that summer's other friendlier alien visitation movie, E.T.: The Extra-Terrestrial.

At first renowned for its groundbreaking and spectacularly lurid special effects makeup, The Thing is now recognized as not only a modern classic but an unremittingly bleak and daring study of man's behavior in the most extreme conditions imaginable. Its themes and imagery have influenced such diverse films as From Beyond, Society, Reservoir Dogs, Se7en, Toy Story, The Faculty, and Isolation, to name but a few, and its impact is visible in the manner in which special effects would be employed in motion pictures.

Drawing on new and exclusive interviews with close to 100 members of the cast and crew, author Michael Doyle is attempting to present as complete a picture of the creation of this landmark horror film as possible. Full of never-before-heard behind-the-scenes stories and hitherto undisclosed technical information, the book will also include never-before-published photographs, storyboards, and artwork.

There will also be a thorough examination of earlier unused screenplays for The Thing by Tobe Hooper and Kim Henkel, and by William F. Nolan, as well as alternate and deleted scenes and unfilmed special effects sequences. A special section of the book will include essays and comments by renowned directors, writers, FX artists, actors, and critics, who share their opinions of the film and its legacy.


Figure 12.12 Michael Doyle, author of the upcoming book on John Carpenter's The Thing.

Tentatively titled The Making of John Carpenter's The Thing, Doyle's book will present a running commentary on the best American horror film of the past four decades by those who made it.

1 Browning Automatic Rifle, a rifle developed for U.S. military use toward the end of World War I.
