Sonic Design: Weeks 1-4 Summary
Bachelor of Design (Hons) in Creative Media
Introduction
Over the past four weeks, I have embarked on an immersive journey into the world of sonic design as part of the Sonic Design module for my Bachelor of Design (Hons) in Creative Media. This module, led by Mr. Razif, introduced me to the fundamentals of audio environments, professional sound equipment, and digital audio workstations (DAWs) like Adobe Audition. Through weekly lectures and hands-on exercises, I explored the nature of sound, its manipulation, and its critical role in storytelling across media such as film, gaming, and advertisements. Below is a detailed reflection on the lectures and exercises from Weeks 1 to 4, highlighting my learning outcomes, challenges, and aspirations for future projects.
Week 1: Audio Fundamentals
Lecture Reflection
The first week introduced the essence of sonic design, emphasizing sound as a vibration of air molecules that progresses through production, propagation, and perception. I learned about the human ear's structure—comprising the outer, middle, and inner ear—and its role in processing sound. The lecture also delved into psychoacoustics, the study of how humans perceive sound, covering concepts like pitch, loudness, and timbre. I was particularly fascinated by the six properties of sound: pitch (determined by frequency, ranging from 20 Hz to 20 kHz for human hearing), loudness, timbre, perceived duration, envelope, and spatialization. These foundational concepts underscored sound’s ability to evoke emotion and enhance storytelling, setting the stage for practical applications.
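To make these properties concrete outside of a DAW, the short Python sketch below (my own illustration, not part of the lecture material) generates a simple sine tone: the frequency sets its pitch, the amplitude its loudness, and a quick fade-in with a slow decay shapes its envelope.

```python
import numpy as np
from scipy.io import wavfile

sr = 44100                                  # sample rate in Hz
duration = 2.0                              # seconds
freq = 440.0                                # pitch: A4, well inside the 20 Hz - 20 kHz hearing range
t = np.linspace(0, duration, int(sr * duration), endpoint=False)

tone = 0.5 * np.sin(2 * np.pi * freq * t)   # amplitude sets perceived loudness

# Envelope: fast 50 ms fade-in followed by a slow exponential decay
envelope = np.minimum(t / 0.05, 1.0) * np.exp(-2.0 * t)
signal = tone * envelope

wavfile.write("a4_tone.wav", sr, (signal * 32767).astype(np.int16))
```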
Exercise 1: Equalizer Matching
In this exercise, I was tasked with analyzing five audio files—one reference track and four with altered frequency balances. My goal was to adjust the equalizers (EQ) of the four tracks to match the reference audio’s sound profile. Initially, I struggled to identify differences without headphones, but once equipped, I could discern variations in bass and treble. Using Adobe Audition’s Parametric Equalizer, I fine-tuned each track’s frequency bands:
- Equalizer 1: Adjusted mid and high frequencies to balance the tone.
- Equalizer 2: Boosted bass slightly to match the reference’s warmth.
- Equalizer 3: Reduced treble to correct harshness.
- Equalizer 4: Enhanced mid-range for clarity.
This exercise honed my ability to detect subtle frequency differences, use EQ tools precisely, and develop a critical ear for audio production. It felt like real-world mixing, laying a solid foundation for future sound design tasks.
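Although I worked entirely inside Audition's Parametric Equalizer, it helped me to think of each band as a small peaking filter. The Python/SciPy sketch below is a rough equivalent of one such band, using the standard audio-EQ-cookbook biquad; the centre frequencies and gains in the comments are placeholders, not the exact settings I used.

```python
import numpy as np
from scipy.signal import lfilter

def peaking_eq(x, sr, f0, gain_db, q=1.0):
    """One parametric-EQ band: boost or cut gain_db around centre frequency f0."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / sr
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return lfilter(b / a[0], a / a[0], x)

# Illustrative values, not my actual exercise settings:
# y = peaking_eq(x, 44100, 6000, -3.0)   # tame harsh treble
# y = peaking_eq(y, 44100, 1000, +2.0)   # add mid-range clarity
```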
Week 2: Sound Design Tools
Lecture Reflection
Week 2 focused on the tools and techniques of sound design, introducing common DAW features like layering, time stretching, pitch shifting, reversing, and vocalization. I learned that layering combines multiple sounds to create rich, professional effects, while time stretching alters a sound’s duration without affecting pitch. Pitch shifting modifies pitch without changing duration, and reversing produces unique, unnatural effects. The “less is more” principle resonated with me, emphasizing the power of intentional, minimal sound choices to maximize impact. These techniques inspired me to explore creative sound design for multimedia projects, particularly in gaming and film.
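These DAW operations all have straightforward equivalents in code. As a sketch of the same ideas, the example below uses the librosa library to time-stretch, pitch-shift, reverse, and layer two sounds; the file names and amounts are placeholders rather than anything from the lecture or my Audition sessions.

```python
import numpy as np
import librosa

# File names are placeholders for whatever source sounds are being combined
impact, sr = librosa.load("impact.wav", sr=44100)
whoosh, _ = librosa.load("whoosh.wav", sr=44100)

# Time stretching: lengthen the whoosh without changing its pitch (rate < 1 slows it down)
whoosh_slow = librosa.effects.time_stretch(whoosh, rate=0.75)

# Pitch shifting: drop the impact four semitones without changing its duration
impact_low = librosa.effects.pitch_shift(impact, sr=sr, n_steps=-4)

# Reversing: an easy way to get an unnatural, "sucked-in" texture
whoosh_rev = whoosh_slow[::-1]

# Layering: pad to a common length, mix, then normalise to avoid clipping
n = max(len(impact_low), len(whoosh_rev))
mix = (np.pad(impact_low, (0, n - len(impact_low)))
       + 0.6 * np.pad(whoosh_rev, (0, n - len(whoosh_rev))))
mix /= np.max(np.abs(mix))
```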
Exercise 1: Telephone and Cyberpunk Voice Simulation
The exercise required using Adobe Audition’s rack effect processor to simulate a telephone voice effect. I boosted mid frequencies and reduced low and high frequencies to emulate the characteristic “voice-enhanced” telephone sound. I fine-tuned the low end to allow subtle background noise and slightly raised treble for vocal clarity. Additionally, I applied flanger and reverb reduction effects to enhance realism. The final telephone effect was convincing and aligned with the exercise’s goals.
As an extra challenge, I created a cyberpunk radio voice effect. Using similar EQ adjustments, I modified the flanger to produce a cold, robotic tone and replaced reverb reduction with indoor reverb to evoke an ethereal, futuristic ambiance. This extension exercise deepened my understanding of how effects like flanging and reverb can transform audio to fit specific narrative contexts, such as a cyberpunk setting.
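Under the hood, the telephone effect is essentially a band-pass filter around the classic 300 Hz to 3.4 kHz voice band, and a flanger mixes the signal with a copy whose delay is swept by a slow LFO. The sketch below approximates both in Python; it is a simplified stand-in for Audition's effects rack, with parameter values chosen purely for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def telephone_eq(x, sr):
    """Band-limit a voice to roughly 300 Hz - 3.4 kHz, the classic telephone band."""
    sos = butter(4, [300, 3400], btype="bandpass", fs=sr, output="sos")
    return sosfilt(sos, x)

def simple_flanger(x, sr, max_delay_ms=3.0, rate_hz=0.4, mix=0.5):
    """Mix the dry signal with a copy whose delay is slowly swept by an LFO."""
    n = np.arange(len(x))
    delay = (max_delay_ms * sr / 1000) * 0.5 * (1 + np.sin(2 * np.pi * rate_hz * n / sr))
    idx = np.clip(n - delay.astype(int), 0, len(x) - 1)
    return (1 - mix) * x + mix * x[idx]

# voice = telephone_eq(voice, 44100)     # narrow, mid-heavy "phone" tone
# voice = simple_flanger(voice, 44100)   # metallic colour for the cyberpunk variant
```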
Week 3: Sound in Space - Environment
Lecture Reflection
The Week 3 lecture explored diegetic and non-diegetic sound, drawing from the Ultimate Guide to Diegetic vs. Non-Diegetic Sound. Diegetic sounds, like dialogue and environmental effects, build a believable world, while non-diegetic sounds, such as background music, shape emotional responses. I was intrigued by trans-diegetic and acousmatic sounds, which blur narrative boundaries for innovative effects. This lecture highlighted sound design as both a technical craft and an artistic expression, requiring strategic alignment with a story’s intent. I aim to balance diegetic and non-diegetic elements in future projects to enhance immersion and emotional resonance.
Exercise 1: Space Environment Sound Design
This exercise involved designing sound for a space station setting depicted in a concept art image, featuring a cultivation chamber with plants and scattered people. Using Adobe Audition, I layered multiple tracks and applied reverb to create a vast, futuristic feel, ensuring the soundscape felt cohesive and immersive. This exercise taught me how to use EQ and reverb to craft atmospheric soundscapes that complement visual narratives.
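One way to think about that reverb step is as convolution with an impulse response. The sketch below fakes a large, reflective chamber by convolving a layered mix with an exponentially decaying noise burst; the layer names, decay time, and wet/dry balance are all assumptions for illustration, not my actual Audition settings.

```python
import numpy as np
from scipy.signal import fftconvolve

def synthetic_reverb(x, sr, decay_s=2.5, wet=0.35):
    """Rough 'large hall' reverb: convolve with an exponentially decaying noise burst."""
    t = np.arange(int(sr * decay_s)) / sr
    ir = np.random.randn(len(t)) * np.exp(-3.0 * t / decay_s)
    wet_sig = fftconvolve(x, ir)[: len(x)]
    wet_sig /= np.max(np.abs(wet_sig)) + 1e-9
    return (1 - wet) * x + wet * wet_sig

# Hypothetical layers (an air-circulation hum, footsteps, murmuring voices)
# mixed and then sent through the reverb to feel like one large chamber:
# ambience = synthetic_reverb(hum + 0.5 * footsteps + 0.3 * voices, 48000)
```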
Exercise 2: Laser Beam Environment
The second exercise focused on a sci-fi scene dominated by a dazzling laser beam. I layered two tracks:
- Track 1: Space white noise, processed with harmonization, parametric EQ, and dynamic limiting to evoke a vast space environment.
- Track 2: Laser beam sound effects, modulated with a notch filter for a distinctive texture.
The combination of white noise and laser effects, enhanced by precise EQ and filtering, amplified the scene’s futuristic appeal. This exercise underscored the importance of tailoring sound effects to visual elements, reinforcing my appreciation for sound design’s role in storytelling.
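As a rough code analogue of those two tracks, the sketch below generates white noise for the space bed and a rising tone standing in for the laser, then carves the laser with a notch filter; the frequencies, Q, and levels are placeholder values rather than my actual settings.

```python
import numpy as np
from scipy.signal import iirnotch, lfilter

sr = 48000
t = np.arange(int(sr * 4.0)) / sr            # four seconds of material

# Track 1: broadband white noise as the "vast space" bed, slowly fading
noise = 0.2 * np.random.randn(len(t)) * np.linspace(1.0, 0.6, len(t))

# Track 2: a rising tone standing in for the laser beam
laser = 0.4 * np.sin(2 * np.pi * (400 + 1200 * t) * t)

# Notch filter around 1.5 kHz to carve a distinctive hollow texture into the laser
b, a = iirnotch(w0=1500, Q=8, fs=sr)
laser = lfilter(b, a, laser)

mix = noise + laser
mix /= np.max(np.abs(mix))                   # normalise the combined layers
```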
Overall Reflection
The past four weeks have been a transformative learning experience, deepening my understanding of sound design’s technical and artistic dimensions. I gained proficiency in Adobe Audition, mastering tools like parametric equalizers, reverb, flangers, and pitch shifters. The exercises, from EQ matching to environmental sound design, honed my ability to manipulate audio to serve narrative goals, whether simulating a telephone voice or crafting a sci-fi soundscape. The challenges, such as software crashes and identifying subtle frequency differences, taught me resilience and the importance of a critical ear.
Sound design has revealed itself as a powerful storytelling tool, capable of evoking emotion and enhancing immersion across media. The “less is more” principle and the strategic use of diegetic and non-diegetic sounds have shaped my approach to audio production, emphasizing intentionality and creativity. Moving forward, I am excited to apply these skills to more complex projects, experimenting with advanced techniques like modulation and spatialization. These weeks have ignited a passion for sound design, and I am confident that the foundational skills I’ve developed will enable me to create compelling audio content in future endeavors.