INDEX
Note: page numbers in italic indicate a figure and page numbers in bold indicate a table on the corresponding page.
2D audio sources 51–52, 62, 68
2.5D audio sources 51–52
2.5D games 26
3D audio, implementation of 58–67
3D audio sources 51–52
3D levels 26–27
5.1 Dolby Digital 66
5.1 standard 62–65
7.1 surround systems 66
360-degree surround 65–67
AAC (advanced audio coding) format 281
absolute time 168–170
absorption coefficients 216
AC-3 Dolby Digital 282
access modifiers 158–159
active mix events 258–259
adaptive crowd engine prototype 143–146
adaptive mixing 251–275; considerations for 251–253; good practices 271–274; music, dialogue, and sound effects 253–254; planning and pre-production 254–259
ADPCM (adaptive differential pulse code modulation) format 282
aeoliphone 70
aesthetics 252
algorithmic reverb plugins 104
algorithms: in coding 148–149; Fourier-based 89; random emitter 183
ambiences 174–182, 188–189; creating 175–178; spatial distribution 180–181, 181; time property 181–182
ambient lights 33
ambisonic recording 11
amplifiers 77
amplitude modulation 76, 100–101, 233; creature design and 141–142
animation controllers 36
animation system 35–37
AntiPattern 156
Aphex Aural Exciter 97
area lights 33
asset delivery checklist 22–23
assets folder 26
Atari 1
attenuation shapes 47–52
attributes 150
audio: object types 34; role of, in games 7–17; see also game audio
audio assets: gathering and preparing 82–86; high quality 83–84; importing 174; management and organization of 22–23, 85–86; preparation of 173–174
audio clips 34, 44, 46, 190–192
audio data reduction 276–286; common file formats 280–282; file size calculation 277–278; good practices 282–283; options 283–286; perceptual coding 280; pulse code modulation 276–277; strategies 278–282; trade-offs 280
audio developers 5–6
audio effects 52–53
audio emitters 212–213
audio filters 52
audio group inspector 260–261
audio implementation 173–213; ambiences and loops 174–182, 188–189; animation events 201–203; asset preparation 173–174; collisions 193–197; distance crossfades 206–210, 206; fades 204–206; intermittent triggers 188–189; prefabs 210–213; random emitters 182–188, 182; raycasting 197–201; sample concatenation 189–193; smart audio sources 197–201
Audiokinetic 17
audio listeners 43–45
audio localization 53–69
audio mixers 53, 116–118, 123, 170, 259–266, 272–273; see also mixing
audio playback technology, evolution of 3–5
audio programming and implementation 5
audio reverb filters 224
audio script 160–171
audio settings 40–41
audio source parameters 46–47
audio sources 34, 45, 49, 179–180; 2D, 3D, and 2.5D 51–52; directional 50; smart 197–201; square/cube 50, 51; volumetric 51
audio-visual contract 76
augmented reality 4, 5; categories of 14; immersion and 14–17
aural exciters 97
automation 266–271
Avatar 36
Awake() function 154, 160, 207
axes 37–38
baking 216
behaviors 150
believability 137
B format 67
binaural renderings 58–61
bit crushing 92
bit depth 3
blending 106–107
blind spots 179
Blueprint 17
Booleans 154
broadband noise 244
broad frequency spectrum 62
Brown, Tregoweth 71
Burtt, Ben 71
bus compression 94
Bushnell, Nolan 1
C 149
C# 148, 150; accessing functions from another class 159–160; access modifiers 158–159; arrays 155–157, 155; audio script 160–171; data types 154; first script in 151–154; introduction to 151–171; lists 157–158; syntax 151–154; variables 154–155
Cage, John 18
camel casing 155
cameras 29
Cartesian coordinates 26–27
cartoons 71
center speakers 64
chambers 103
character controllers 28, 28, 29
characters 154
CheckDistance() function 198
CheckForDistance() function 198, 201, 208
child classes 151
Chime vs. Buzzer Principle 11–12
Chion, Michel 76
clarity 251–252
classes 150–151, 151; accessing functions from other 159–160
class names 152
clients, communication with 86
clouds 99
coalescence 99
coding 147–172; algorithms 148–149; audio script 160–171; C# 151–171; detecting keyboard events 167–168; encapsulation 150; inheritance 150–151; logic 148; object-oriented programming 149–151; perceptual 280; reasons to learn 147–151; reusable code 156–157; sample randomization 166–167; syntax 148; using triggers 164–166
coin-operated games 2
Colavita visual dominance effect 16
colliders 32, 32, 38, 164–166, 193–195, 194, 200–201
collision detection 32, 38, 39, 193–195
collisions 193–197
colons 152
colors, working with, in Unity mixer 261–262
comb filtering 89, 101–102, 102
communication 86
complex sounds, breaking into layers 73–74, 74
compressed audio formats 83
compression 92–95, 93; bus 94; dynamic range 93; inflation 95; transient control 94–95
compression formats 285–286
Computer Space 1
concatenation 189–193
condenser microphones 80–82
constant bit rates (CBR) 279
context 8
convolution 107–110, 108; creature design and 142–143; filtering/very small space emulation 110; hybrid tones 110; optimization 109; speaker and electronic circuit emulation 109–110
convolution-based reverb plugins 78
CPU resources 240
creature sounds: amplitude modulation and 141–142; animal samples 141, 143; convolution and 142–143; distortion and 140; emotional span 137–138; equalization and 140; non-human samples 143; pitch shifting and 138–140; primary vs. secondary sounds 137; prototyping 136–143; vocal recordings 138
crosstalk 61
curly braces 153
cut scenes 122–126
data 8
data reduction: good practices 282–283; options 283–286; strategies 278–282
data types 154
DAWs 116–118
deadlines 22
decay time 105
deltaTime variable 168–169
design documents 85–86
Destroy() method 211
diffuse resonant bodies 247
diffusion 220
digital audio 276–278
digital audio converters (DACs) 116
digital audio encoding 277
digital audio signals 92
digital signal processing techniques 107
directional audio sources 50
directional lights 33–34
distance: Doppler effect and 234–237; dry to wet ratio as product of 227–229; factors in 75–76; filtering as product of 224–230; low pass filtering with 55; perception of 10; simulation 229–230; spherical spreading over 48–50, 48, 49; width perception as product of 225–226
distance crossfades 206–210, 206, 233–234
distance modeling 224–230
distortion 89–92, 91; bit crushing 92; creature design and 140; overdrive 91; saturation 90–91, 90
distortion/saturation plugins 78
Dolby Atmos 58
Dolby Digital Live 282
Doppler effect 234–237
Doppler factor 235
drop files 145
dry to reflected sound ratio 55
dry to wet ratio 227–229
DSP classics 100–102
DTS:X 58
ducking 266
dynamic microphones 80–82
dynamic range 77, 120–121, 120, 252, 256–258, 257, 283
dynamic range compression 93
effects, adding to groups 262–263
effects loops 122–125, 222–223; inserts vs. 263–264; setting up for reverberation 264–266
electronic circuit emulation 109–110
emotional involvement 17
environmental modeling 4, 9–10, 21, 214–237; best practices for 219–220; definition of 214–215; density and diffusion 220; distance crossfades 233–234; distance modeling 224–230; Doppler effect 234–237; effects loops 222–223; exclusion 230, 232–233, 232; guns and explosions 130–131; high frequencies vs. low frequencies 220; late vs. early reflections 219; obstruction 230, 231–232, 232; occlusion 230, 231, 231; reflections level 219–220; reverberation 215–219, 222–223; reverberation for 106; reverb zones 221–222
equalization 77, 95–97; creature design and 140; resonance simulation 96–97
equalizers 77
equal loudness contour graph 272, 272, 273
evaporation 99
event scheduling 192–193
exclusion 10, 230, 232–233, 232
experimentation 86
fall-off curve 48
Farnell, Andy 242
Fast Fourier Transform (FFT) 107, 108
fast Fourier transform-based algorithms 89
fatigue avoidance 18–19
file formats 280–282
file size calculation 277–278
filtering 95–97, 110, 233; low pass 55, 76, 87, 224–225, 249–250; as product of distance 224–230
first-person controller 28
Fletcher-Munson curves 272, 272, 273
floating point numbers 154
Foley, Jack 113
Foley recording 113–114
forward slash 154
Fourier-based transforms 89
Fourier synthesis 247
front left and right speakers 64
full bandwidth recordings 83
full sphere, surround format 65–67
fully immersive systems 14
functions: accessing, from another class 159–160; see also specific functions
game audio: challenges in 17–23; coding for 147–172; evolution of 3–5; genesis of 1–3; role of 7–17
game engine: definition of 24–29; level elements 29–34; paradigm 24–42; subsystems 35–42
game levels 26–27; elements of 29–34
game mechanics 11–12
GameObject.Find() function 197–198, 207
game objects 20; see also objects
gameplay: adjusting levels during 265–266; increasing complexity in 4
Gardner, W.G. 227
generic audio 279
geometry 9–10
Gerzon, Michael 65
GetComponent() method 160, 161
GetKeyDown() function 167–168
GetOcclusionFreq() function 200, 201
grain duration 98–99
granular synthesis 88–89, 88, 97–100, 98; pitch shifting 99–100; sample manipulation/animation 100; terminology 98–99; time stretching 99–100
gravity gun 20
Grindstaff, Doug 71–72
groups: adding effects to 262–263; adding to audio mixer 259–260; audio group inspector 260–261
group sidechaining 125–126
guns: detonation/main body layer 129–130; environmental modeling 130–131; general considerations 127–128; gunshot design 128–129; one shot vs. loops 126–127, 127; player feedback 131–132; prototyping 126–132; sublayer 130; top end/mechanical layer 130
Half Life 2 20
harmonic generators 97
harmonic processors 78
headphones 274
head related transfer functions (HRTFs) 11, 58, 58–62, 59, 108
high cut parameter 105–106
high frequencies 220, 233, 283
high pass filtering 75–76
home gaming consoles, first 2
horizontal axes 37–38
horizontal plane, localization on 56–57
HRTFs see head related transfer functions (HRTFs)
humanoids 36
hybrid tones 110
IDE see Integrated Development Environment (IDE)
IEnumerator 184
immersion 8; characteristics that create 15; definition of 14–17; maintaining 16
implementation, challenges 17–18
impulse response 104
inflation 95
information, provided by audio 8–12, 252–253
inheritance 150–151
input 116
input system 37–38
inserts 116–117, 122–123, 263–264
Inside 16
Instantiate() method 210–211
integers 154
Integrated Development Environment (IDE) 148, 152
interactive elements 19–20
interaural intensity difference (IID) 11, 57, 57, 58
interaural time difference (ITD) 11, 57, 57, 58
intermittent emitters 189
intermittent triggers 188–189
inverse square law 54–55
isKinematic property 38
isPlaying property 190
isTrigger property 39
Kandinsky, Wassily 115
keyboard events, detecting 167–168
kinematic RigidBody colliders 194
lavalier microphones 81–82
Law of Two and a Half 76
layering/mixing 86–87, 94, 175
layers 85
level meters 117
levels: 2D 26–27, 30; 3D 26–27; adjusting during gameplay 265–266; game 26–27, 29–34; mix 273–274
LFE submix 125
Lifecycle script 153
Limbo 16
linear amplitude 170–171
linear animation 41–42
linear fall-off curve 48
linear mixes 122–126
linear model synthesis 246–250
lists 157–158
LKFS unit 273
load type 284–285
local coordinates 27
localization: audio 53–69; cues 56–58; on horizontal plane 56–57; on vertical plane 58–59
location, perception of 10–11
logarithmic amplitude 170–171
logarithmic fall-off curve 48
logic 148
loops 174–182, 176, 189; creating 175–178; implementing 178–182; inserts vs. effect 263–264; seamless 175–176; spatial distribution 180–181, 181; time property 181–182; see also effects loops
lossless data reduction 278–279
loudness K-weighted full scale (LKFS) 273
loudness unit full scale (LUFS) 273
low cut parameter 106
low frequencies 220
low frequency effects (LFE) 64
low pass filtering 55, 76, 87, 224–225, 249–250
LUFS-based loudness meters 78–79
LUFS unit 273
MacDonald, Jimmy 71
Magnavox Odyssey 2
MapToRange() function 208–209
mass, of sound 74–75
Massachusetts Institute of Technology (MIT) 1
master output 124
materials 31
Max/MSP: adaptive crowd engine prototype 143–146; sword maker example in 246–250
MaxxBass plugin 97
Mecanim 35
Menzies 247
meshes 30
.meta extension 44
microphones 80–82; dynamic vs. condensers 80–82; placement of 82
mixer parameters 270
mixers 53, 116–118, 170, 223, 259–266, 272–273
mixing 13–14, 21–22, 86–87; adaptive 251–275; considerations for 251–253; dynamic range 256–258; good practices 271–274; inserts vs. effect loops 263–264; music, dialogue, and sound effects 253–254; passive vs. active mix events 258–259; planning and pre-production 254–259; premix 273–274; routing 255–256; snapshots and 266–271; submixing 254–255; Unity audio mixer 259–266
mix levels 273–274
mix sessions 123
modal synthesis 246–250
models 30–31
monitoring 126
MonoBehaviour 152
mono signals 61–62
multi-modal integration 76
multi-player games 42
music bus 257
naming conventions 22–23, 85, 155, 180
narration 254
narrative function 252
networking 42
No Country for Old Men 72
noise 84
non-diffuse resonant bodies 247
non-immersive systems 14
non-player characters (NPCs) 28
non-static variables 159
Nutting Associates 1
Nyquist theorem 276
object-based audio 58–61, 62, 67, 68–69
object-oriented programming 149–151
objects 30; audio 34, 43–45; colliders 32, 32; lights 33–34; materials 31; meshes 30; models 30–31; particle systems 32; prefabs 34; shaders 31; skyboxes 32; sprites 30; terrain 31–32; textures 31, 31; transform component 30; triggers 33
obstruction 10, 230, 231–232, 232
occlusion 10, 197–199, 210, 230, 231, 231
Ogg Vorbis 281
OnCollisionEnter() function 194
ontological modeling 241
OnTriggerEnter() function 165
OnTriggerExit() function 165
OnTriggerStay() function 165
opacity 99
output 118
overlapping 89
overriding 34
Pac-Man 2
parameters: editing via scripting 270; exposing 270–271; see also specific parameters
parent class 151
particle systems 32
passive mix events 258–259
PCM audio 18
peak meters 272–273
pebble effect 199–201
perceptual coding 280
peripheral vision 9
phasing issues 181–182
physical analysis 242
physics engine 38–40
pitch 74
pitch shifting 87–89, 178; creature design and 138–140; fast Fourier transform-based algorithms 89; granular synthesis 88–89, 88, 99–100; playback speed modulation 87–88
Pitch Synchronous Overlap and Add (PSOLA) 88, 99–100
playback speed modulation 87–88
player feedback 131–132
PlayFirst() function 191
PlayOneShot() method 163–164
PlayScheduled() function 192–193
PlaySecond() function 191
PlaySound() function 187
plugin parameters 270
point lights 33
post-fader sends 118
precedence effect 11
pre-delay parameter 105
pre-delay time to reverb 75, 219
prefabs 34, 210–213; creating smart intermittent emitter prefab with occlusion 210; destroying objects instantiated from 211; instantiating audio emitters 212–213; instantiating from scripting 210–211
pre-fader sends 117
premix 273–274
pre-production 254–259
primary sounds 137
prioritization 252
private keyword 158
procedural assets 239
procedural audio 4–5, 238–250; approaches to 241–242; candidates for 241; definition of 239–242; introduction to 238–239; practical 242–250; pros and cons of 239–241; sword maker example 246–250; wind machine example 242–246
procedural programming languages 149–150, 149
procedural sound synthesis 5
programming see coding
programming languages 149–150
protected keyword 158
prototyping 19–20, 126–146; adaptive crowd engine 143–146; creatures 136–143; guns 126–132; vehicles 132–136
public keyword 158
pulse code modulation 276–277
RAM 239
random emitters 182–188, 182; algorithm 183; coroutines 183–188
randomization 18–19, 99, 162–163; linear amplitude and 170–171; sample 166–167
raycasting 39, 197–201; avoiding pebble effect 199–201; implementing occlusion with 197–199
real-time computation 216
rear left and right speakers 64
reflections 56; late vs. early 219; level 219–220
relativeVelocity 195
repetition 18–19
resonance simulation 96–97
resonant bodies 247
resonators 101–102
reverberation 78, 84, 102–107, 103; absorption coefficients 216; audio reverb filters 224; as blending tool 106–107; as dramatic tool 107; effects loops for 222–223; for environmental modeling 106, 215–219; indoors vs. open air 102–104; inserts vs. effects loops for 122–123; parameters 105–106, 217–219; pre-computed vs. real time computation 216; setting up effect loop for 264–266; in Unity 216–219
reverb time/decay time 105
reverb zones 217–218, 221–222, 229
RigidBody colliders 194
ring modulation 100–101
Roads, Curtis 88
routing 255–256
Russell, Steve 1
sample concatenation 189–193
sample manipulation/animation 100
sample playback 3–4
sample randomization 166–167
sample selection, velocity-based 195–197
sampling rate 276
Schaeffer, Pierre 71
scripting: editing mixer and plugin parameters via 270; recalling snapshots via 268–269; see also coding
seamless loops 175–176
secondary sounds 137
semicolons 152
semi-immersive systems 14
Send/Receive technique 264–266
separators 152
SetSourceProperties() function 187
shaders 31
shotgun microphones 80–82
side chain compressors 257
sidechaining 125–126
signal path 119–121
silence 73
size parameter 105
skyboxes 32
smart audio sources 197–201
snapshots 266–271; recalling via scripting 268–269; working with 267
sound: information provided by 8–12; mass or weight of 74–75; pitch of 74
sound design: art of 70–86; basic considerations 72–76; clipping 119–121, 120; effective 72–74; entertainment and 12–13; environmental 21; frequency chart for 96, 96; guidelines 74–76; history of 70–72; microphones for 80–82; optimizing for spatialization 68–69; practical 115–146; preparation for 82–86; prototyping and 126–146; session setup 115–118, 122–126; technical 5; tools for 76–80; working with video 118–119
sound design techniques: amplitude modulation 100–101; comb filtering 101–102; compression 92–95, 93; convolution 107–110, 108; distortion 89–92; DSP classics 100–102; equalization/filtering 95–97; Foley recording 113–114; granular synthesis 97–100, 98; harmonic generators/aural exciters 97; layering/mixing 86–87; pitch shifting 87–89; reverberation 102–107; time-based modulation FX 110–113
sound effect bus 257–258
sound effect library 84
sound effects 4, 122, 253–254; procedural audio and 5
sound FX librarian software 84
sound layers 85, 86–87; blending 106–107
sound recording, Foley 113–114
soundscapes 21
sound sources see audio sources
soundtracks: evolution of 4; music 13–14; role of, in games 7–17
Space Invaders 2
Spacewar! 1
spatial audio 5
spatial awareness 9–10
spatial distribution, of ambient loops 180–181, 181
spatial imaging 252
spatialization, optimizing sound design for 68–69
spatial width 56
speakers: center 64; emulation of 109–110; front left and right 64; rear left and right 64
spectral analysis 242, 248–249
spectral balance 140
spectrum analyzer software 79
speech 279
spherical spreading 48–50, 48, 49
spotlights 33
spread parameter 225–226
sprites 30
square/cube audio sources 50, 51
Stalling, Carl 71
StartCoroutine() statement 184
Start() function 161, 186, 197–198, 208
Star Trek 71–72
static colliders 194
static keyword 158–159
stems 122
stereo 62
stochastic techniques 18
streams 99
strings 154
subharmonic generators 97, 125
sub master 124
submixes 118, 124–125, 124, 254–255
Subotnick, Morton 13–14
subsystems 35–42; animation 35–37; audio engine 40–41, 43–69; input 37–38; linear animation 41–42; physics engine 38–40
subtractive synthesis 242–246
subwoofer 64–65
surround channel-based formats 62–65
sweeteners 145
sword maker example 246–250
teams, communication with 86
technical sound design 5
teleological modeling 241
terrain 31–32
third-party implementation tools 17
third-person controller 28, 29
time-based modulation FX 110–113; chorus 110–111, 111; flangers 111; phasers 112, 112; tremolo 112–113
timecode 119
time property 181–182
time stretching 99–100
Time.time 211
timing 168–170
transform component 30
transforms, Fourier-based 89
transient control 94–95
TransitionTo() method 268
tremolo 112–113
tremolo effect 141
Trespasser: Jurassic Park 4, 20
triggers 33, 39, 164–166, 188–189
trigger zones 33
Unity3D project structure 25–29
Unity Editor 26
Unity game engine 6, 10, 148; ambisonic recording and 11; animation system 35–37; audio engine 40–41, 43–69; audio mixer 259–266; data reduction options in 283–286; ducking in 266; input system 37–38; linear animation 41–42; physics engine 38–40; playing audio in 160–171; reverberation in 216–219; scenes vs. projects 26
Unity Hub application 25
Unity projects: creation of 25–26; level basics 26–29
Universal Audio LA-2A leveling amplifier 77
Update() function 154, 167–169, 198, 201
UpdateVolume() function 209
UREI 1176 limiting amplifier 77
user feedback 11–12
user input, detecting 167–168
utilities 80
variable bit rates (VBR) 279–280, 283
variables 154–155
variations, creating 178, 189–190
vehicles: material selection 133; processing and preparing materials 133–134; prototyping 132–136; specifications 132–133
velocity-based sample selection 195–197
version control 22
version tracking 85–86
vertical axes 37–38
vertical plane, localization on 58–59
very small space emulation 110
video: frame rates 118–119; working with 118–119
video games: first 1–3, 18; role of audio in 7–17; see also game audio
views, working with, in Unity mixer 261–262
virtual reality 4, 5, 13, 239; categories of 14; immersion and 14–17
visual field 8–9
vocal recordings, working with 138
volume faders 117
volume sliders 270–271
volumetric sound sources 51
WaitForIt() function 187
Warner Brothers 71
waveform analysis 242
weight, of sound 74–75
wet to reverberant signal ratio 75
white noise 243–244
width parameter 105
width perception 225–226
wind machine example 242–246
Wirth, Werner 15
world coordinates 27
world geometry 27–28
yield return statements 184
zip files 278–279