What is sound design? It’s a broad term that connects different technical disciplines across a range of media. Essentially, sound design is the creation and manipulation of audio. In film and TV production, the placement of sounds usually serves some dramatic purpose, giving the audience clues about what is happening on screen and how they should react emotionally, as well as establishing the general mood or atmosphere of a scene. As such, sound design in film is a vital part of the storytelling language; without it, the production would feel flat, incomplete, and certainly less aesthetically pleasing.
The design of sounds in a project is generally considered apart from the composition of a musical score, focusing more on non-musical sound effects, though the lines where these areas intersect are often quite blurred. In terms of making music, sound design typically refers to the sculpting and mixing of sonic tones, as opposed to the melodies and rhythms being played. The primary concern is with sound rather than music, though again, it’s not always simple to draw a line between the two.
If the term itself is somewhat vague, the practical tasks that fall to the sound designer are at least more concrete. The basic elements of sound design include sourcing distinctive sounds and recording them at high fidelity (or, alternatively, synthesizing and otherwise generating sounds from scratch), editing and processing them with a variety of effects, and mixing so that every element blends into a coherent whole. Sound design, therefore, spans the full recording and production process. Certain sound design techniques, while essential behind the scenes, are intended to be imperceptible to the audience, never distracting from the action or the music in the moment. It’s only when you consciously focus on the sound that you notice what is really happening.
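To make the “synthesizing sounds from scratch” step concrete, here is a minimal sketch in Python. It’s purely illustrative (the function name and parameters are invented, not from any particular tool): a sine tone shaped by an exponential decay envelope, which is about the simplest designed sound there is.

```python
import math

def synthesize_tone(freq_hz=440.0, dur_s=0.5, sample_rate=44100, decay=8.0):
    """Generate a decaying sine tone: a sound built entirely from scratch."""
    n_samples = int(dur_s * sample_rate)
    samples = []
    for n in range(n_samples):
        t = n / sample_rate                      # time of this sample in seconds
        envelope = math.exp(-decay * t)          # exponential volume decay
        samples.append(envelope * math.sin(2 * math.pi * freq_hz * t))
    return samples

tone = synthesize_tone()                         # 0.5 s at 44.1 kHz
```

From here, the raw samples could be written to a WAV file or fed through effects; it’s the envelope that turns a sterile test tone into something resembling a plucked or struck sound.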
A cool example of this in filmmaking is the work of the Foley artist. Consider a nature documentary, with spectacular shots of whales gliding through the ocean or extreme close-ups of minuscule insects. When it is not possible to record audio for these scenes directly with microphones, Foley artists instead approximate the sounds in a studio using a full assortment of custom and everyday objects. This gives much more control over the precise shape and detail of the sound, and the technique is widely used to improve the auditory experience and overall impact of a radio or film production. The sound of rustling clothes is usually added with Foley, for instance; or think of someone clapping coconut shells together to recreate the sound of horses’ hooves. These subtle, artificial exaggerations make the scene feel more real and give it physical substance; without the Foley it might sound hollow and unnatural, as if something were missing. When done well, these added sounds feel grounded in the shot, as if they were actually coming from the source depicted on screen. The sound design here provides helpful feedback that we don’t even realize we need. This is just one part of the magic of filmmaking, and of the illusions involved in the creative arts in general.
Sound Design in Film
As technology improved over the years, Hollywood began to dominate the world of entertainment. Audiences were for the first time able to experience vast, dreamy landscapes and dazzling special effects through the screen, accompanied by soundtracks that had to match them in scale and splendor. Traditional atmospheric effects, Foley, and other sound design elements were carried over from radio to the fledgling film and TV industries before being thrust to new heights, as visuals and sound could be combined in an unprecedented, full-sensory experience.
Filmmakers were inspired by the fresh sounds and radical, boundary-pushing ideas coming out of the music studios of the ‘60s and ‘70s, and many classic landmarks of cinema are celebrated for their inventive and effective uses of sound design. In Star Wars, released in 1977, much of the texture and excitement of the outer-space worlds comes from what we hear. The film helped introduce booming sub-bass frequencies to cinema audiences, and they have since come to be expected from any action movie shown in a theater. The now instantly recognizable buzz of lightsabers was created by accident, when a sound designer for the film walked in front of an old TV set with a live microphone and some static noise was picked up. Chewbacca’s endearing growls were a mix of real recordings of walruses and other animals. Darth Vader’s notorious breathing was inspired by a scuba apparatus. Add to this all the explosions, lasers, spacecraft engines, and other futuristic noises: without the sound design elements, you’d be left with only half a movie, as half the story is told through sound.
Apocalypse Now, the 1979 Vietnam War epic, opens with a forest of palm trees exploding into flames as helicopters fly past in the foreground. The ‘wop-wop-wop’ of the helicopter blades was actually played on a synthesizer, which establishes an eerie, dreamlike atmosphere right from the start. The film pioneered modern surround sound in cinema, and the frantic effect of the helicopters zooming through three-dimensional space, across six different audio channels and speaker placements, was the perfect demonstration of what sound mixing could achieve. As the Doors’ “The End” comes to a haunting crescendo, the image of the burning forest is superimposed with a ceiling fan in the protagonist’s hotel room. The synthesized helicopter noises and the sound of the fan are likewise blended, highlighting for us the protagonist’s subjective point of view, his memories of war, perhaps his trauma. It’s a masterful example of storytelling through sound, one that bagged sound designer Walter Murch the Oscar that year for Best Sound. In fact, it was Murch who popularized the term Sound Designer as a professional job title in the industry, with a specific focus on creative, post-production sound design.
There are countless great uses of sound design in film, some more recent examples being the bone-rattling drones throughout Inception and other Christopher Nolan blockbusters, or the mechanical whirr of morphing machines in Transformers, almost like a dubstep bass line. These are iconic movie sounds, and they play a huge part in the experience and the success of these famous titles. We already have a sense of the interrelationship between sound effects and the musical score, and how the line between musical and non-musical sound is rather hazy. Further technological advances in the last few decades have drawn these disciplines closer together, as creators have ever more processing power and sonic possibilities at their fingertips in a standard home-studio set-up. Software samplers such as KONTAKT and BATTERY are great platforms for importing and manipulating recordings from the real world, as well as accessing enormous libraries of sampled sounds, from percussion and orchestras to electronics from another dimension.
The modern, digital world is constantly evolving and opening up new possibilities for sound design. These days, video games have budgets and audiences as large as Hollywood movies. Players can remain engrossed in the same virtual worlds for hundreds of hours, which poses new challenges for composers and sound designers. Because the player is free to move and interact with the environment in a non-linear fashion, the soundtrack arrangement and the triggers for sound effects must reflect this, and developing interactive audio is far more complex as a result. That said, the fundamentals are the same. Aural cues help us identify characters and give us a sense of direction, while both musical and non-musical sounds are used for tension and release at the right moments in the narrative. Our multimedia era goes even further, with sound design built into apps and smart devices and bleeding into all areas of everyday life. Think of iPhone ringtones and the various other chimes and jingles we identify with different electronics; all of this has been precision-designed to be recognizable and useful, but also pleasing to the senses.
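The non-linear triggering described above can be illustrated with a toy Python sketch. This is a hypothetical example, not any real engine’s API: the event names and file paths are invented, and cues fire whenever the player happens to cause their events rather than on a fixed timeline.

```python
# Hypothetical event-to-cue table; in a real game engine these would be
# loaded audio assets, not plain strings.
CUES = {
    "footstep_grass": "foley/grass_step.wav",
    "footstep_stone": "foley/stone_step.wav",
    "enemy_spotted":  "music/combat_layer.wav",
}

def on_event(event, playback_log):
    """Trigger the cue mapped to a gameplay event (here we just log it)."""
    cue = CUES.get(event)
    if cue is not None:          # unknown events simply play nothing
        playback_log.append(cue)
    return cue

log = []
on_event("footstep_grass", log)  # the player walks on grass...
on_event("enemy_spotted", log)   # ...then stumbles into combat
```

Real interactive-audio middleware layers far more on top of this idea, such as crossfading between music intensities and randomizing samples so repeated events don’t sound identical, but the core remains a mapping from game state to sound.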
How to Design Sound in Music Production
As mentioned above, sound design applies more to sculpting the tone and character of sounds than to the musical notes being played. Indeed, many artists get their ‘signature sound’ as much from the palette used as from the music itself, whether that’s Jimi Hendrix experimenting with delay and feedback effects to build sci-fi dreamscapes out of sound, or Aphex Twin using computer trickery to warp sounds into elaborate sonic sculptures for the digital age. Sound design is not just recording a musical performance, but homing in on the actual tone of the recording itself. Creative recording, using the studio as an instrument to add color to the music and transform it with overdubs, stereo mixing, and other techniques, only really emerged as a concept in the latter half of the 20th century.
In the 21st century, we are really spoiled for choice, with groundbreaking new tech dropping virtually every year and unprecedented access to a world of high-quality sounds. Take MASSIVE X as just one example of a powerful, multi-purpose software sound engine with a reasonable price tag. It goes far beyond the basic remit of a synthesizer, capable of diverse sound creation from analog warmth to digital intricacy, throbbing lead lines to ambient soundscapes. It also lets you re-route inputs and outputs as you might with modular hardware, giving you absolute control of the audio signal. Having this kind of scope and versatility, able to run on a standard computer set-up, at an affordable price, would have been unthinkable not too long ago.
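The kind of re-routable signal flow mentioned above can be sketched abstractly: treat each module (oscillator, filter, amplifier) as a function on a signal, and a patch as an ordered chain of modules. This is a simplified Python illustration of the modular idea, not how MASSIVE X is actually implemented; all names here are invented.

```python
# Toy 'modules': each takes a signal (a list of floats) and returns a new one.
def saw_osc(n_samples, freq=110.0, sr=44100):
    """Naive sawtooth oscillator acting as the signal source."""
    return [2.0 * ((freq * n / sr) % 1.0) - 1.0 for n in range(n_samples)]

def lowpass(sig, alpha=0.2):
    """One-pole lowpass filter: smooths out the signal's sharp edges."""
    out, prev = [], 0.0
    for s in sig:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out

def amp(sig, level=0.5):
    """Amplifier stage: scales the signal's overall level."""
    return [level * s for s in sig]

def patch(source, *modules):
    """Route a source through an ordered chain of modules, like patch cables."""
    for m in modules:
        source = m(source)
    return source

sound = patch(saw_osc(100), lowpass, amp)
```

Re-routing is then just reordering or swapping the arguments to `patch`, which is the software analogue of re-plugging cables on modular hardware.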
If half of sound design is the sourcing or synthesizing of new sounds, the other half is effects and processing. The order of plug-ins in the signal chain makes a big difference to the output sound, so try experimenting with shuffling things around to get the best results. Reverb is a widely used effect for scaling up sounds to fill a big room, giving a greater sense of physical space and depth, while distortion can add punch as well as corrupt clean audio for a lo-fi aesthetic or Aphex Twin-esque glitches. Sound design has always been about harnessing and maximizing the potential of the technology of any given period to create the most immersive experience, while constantly looking ahead to the future. In terms of technology, as well as conceptually, the potential for sound design has never been greater than it is right now.
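To see why chain order matters, here is a deliberately tiny Python sketch with invented helper names: a hard-clip “distortion” and a single-tap “echo” standing in for reverb. Running them in opposite orders over the same input gives audibly (and here, numerically) different results.

```python
def distort(sig, drive=4.0):
    """Hard-clip distortion: boost the signal, then limit it to [-1, 1]."""
    return [max(-1.0, min(1.0, s * drive)) for s in sig]

def echo(sig, delay=3, mix=0.5):
    """A toy stand-in for reverb: mix in one delayed copy of the input."""
    out = list(sig)
    for i in range(delay, len(sig)):
        out[i] += mix * sig[i - delay]
    return out

impulse = [1.0] + [0.0] * 9        # a single click as the test signal

a = echo(distort(impulse))         # distort first: the echo tap stays quiet
b = distort(echo(impulse))         # echo first: the tap gets clipped loud too
# a != b -- the same two effects, reordered, produce a different signal
```

In chain `a` the echo is a softer copy of the clipped click, while in chain `b` the echo itself gets driven into clipping, which is exactly the kind of difference you hear when shuffling real plug-ins around.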