Composing and Arranging Background Music for Games
Game music is not just a background track. It is a system that must react to game state in real time: swelling when enemies appear, calming down during exploration, transitioning seamlessly between themes without clicks at the seams. Writing a beautiful track is half the work. The other half is arranging it properly for an interactive context.
Adaptive Music: Where the Linear Approach Breaks Down
The key difference between film and game music is the lack of a fixed duration. A player can spend 30 seconds or 20 minutes in the same zone. A linear track looped 40 times in a row becomes unbearable.
The professional solutions are horizontal re-sequencing and vertical re-mixing. In the first, the track is cut into sections (intro, loop_a, loop_b, bridge, outro) and the middleware (FMOD or Wwise) assembles them at runtime depending on game state. In the second, the track is written from the start as separate stems (bass, percussion, melody, atmosphere) whose volumes the engine controls dynamically.
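A minimal sketch of the vertical re-mixing idea in Python, assuming just two layers and an equal-power crossfade (the function and layer names are illustrative, not part of any middleware API):

```python
import math

def layer_volumes(intensity: float) -> dict:
    """Equal-power crossfade between a calm and a combat layer.

    intensity: 0.0 (fully calm) .. 1.0 (full combat).
    Returns linear gains per stem; summed acoustic power stays constant,
    which avoids the mid-fade volume dip of a linear crossfade.
    """
    intensity = max(0.0, min(1.0, intensity))
    angle = intensity * math.pi / 2
    return {
        "calm": math.cos(angle),    # fades out as intensity rises
        "combat": math.sin(angle),  # fades in
    }
```

In a real project the same curve would be drawn directly on the middleware parameter, not computed in code; the sketch only shows why equal-power curves are the usual choice.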
In FMOD Studio this is implemented via a Multi Instrument with a Playlist in Shuffle mode plus Transition Timelines. State transitions are quantized to the nearest musical beat, which prevents rhythmic glitches on switching. A combat-intensity parameter (e.g. combat_intensity from 0 to 1) drives the crossfade between the calm and combat versions via an FMOD parameter sheet.
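The quantization logic itself is simple arithmetic; a sketch of how a transition point is found (middleware does this internally, the helper below is purely illustrative):

```python
import math

def next_quantized_point(position_s: float, bpm: float,
                         beats_per_bar: int = 4, grid: str = "bar") -> float:
    """Time (in seconds) of the next beat or bar boundary at or after
    the current playback position, where a transition may start."""
    beat_len = 60.0 / bpm
    unit = beat_len * beats_per_bar if grid == "bar" else beat_len
    # Small epsilon so a position exactly on the grid counts as "now".
    return math.ceil(position_s / unit - 1e-9) * unit
```

At 120 BPM a beat is 0.5 s and a 4/4 bar is 2 s, so a transition requested at 3.1 s fires at 3.5 s (beat) or 4.0 s (bar).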
In Wwise the same result is achieved with a Music Switch Container inside the Interactive Music Hierarchy. Wwise supports Pre-Entry and Post-Exit regions for smooth transitions without pauses, which matters especially for rapid switches, when the player enters a combat zone one second and leaves it the next.
A practical example: a survival-horror project. The client wanted "music that builds dread when the player is in danger". Implementation: four stems (drone, pulse, strings, full_tension) and an RTPC threat_level from 0 to 3, with each level adding the next stem. At threat_level = 0 only the drone is heard; at 3, all four play, with dynamic surges. Threshold triggers in a C# script update the RTPC every 500 ms based on distance to active enemies. There is no abrupt track change, only mounting pressure.
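The threshold logic can be sketched engine-agnostically; a minimal Python version, assuming illustrative distance thresholds of 25/15/7 metres (the actual C# script and its numbers are project-specific):

```python
def threat_level(enemy_distances_m, thresholds=(25.0, 15.0, 7.0)) -> int:
    """Map distance to the nearest active enemy onto RTPC threat_level 0..3.

    Each threshold crossed adds one level. Called on a timer
    (e.g. every 500 ms), not every frame, to keep the RTPC stable.
    """
    if not enemy_distances_m:
        return 0
    nearest = min(enemy_distances_m)
    return sum(1 for t in thresholds if nearest < t)

def active_stems(level, stems=("drone", "pulse", "strings", "full_tension")):
    """Level 0 plays only the drone; each level unmutes the next stem."""
    return list(stems[: level + 1])
```

The hysteresis and smoothing that a shipping script would need (so the level does not flicker at a threshold boundary) are omitted for brevity.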
Arranging for the Engine and Platform
Arranging for a game has to account for several constraints that do not exist in studio work.
Loop points. The track must loop without an audible seam. That means the final bar leads harmonically back to the beginning, and the reverb tail is either cut or processed with a loop crossfade. In the DAW (Reaper, Ableton, Logic Pro X) loop points are written into the file metadata or set in FMOD/Wwise on import.
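The loop-crossfade trick can be shown on raw sample data; a simplified Python sketch (a DAW does this with equal-power curves and at audio rates, the linear fade here is purely illustrative):

```python
def loop_crossfade(samples, fade_len):
    """Blend the reverb tail into the loop start so the seam is inaudible.

    The last fade_len samples (the tail) are mixed into the first
    fade_len samples and then dropped, so when playback wraps around,
    the start of the loop already carries the decaying tail.
    """
    body, tail = samples[:-fade_len], samples[-fade_len:]
    out = list(body)
    for i in range(fade_len):
        g = i / fade_len                      # 0 -> 1 over the fade
        out[i] = tail[i] * (1.0 - g) + body[i] * g
    return out
```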
Stem export. For vertical re-mixing, each instrument layer is exported separately. All stems must be the same length, aligned to the grid, exported at the same sample rate (usually 44,100 or 48,000 Hz depending on platform), and without normalization: levels are balanced on the FMOD/Wwise master, not in the files.
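These export requirements are easy to check automatically before import; a sketch of such a pre-flight check, assuming stem metadata has already been read from the WAV headers (the dict fields are illustrative):

```python
def validate_stems(stems):
    """Return a list of problems found in an exported stem set.

    stems: dicts like {"name": ..., "frames": ..., "sample_rate": ...,
    "peak_db": ...}. An empty result means the set is safe to import.
    """
    problems = []
    if len({s["frames"] for s in stems}) > 1:
        problems.append("stems differ in length")
    if len({s["sample_rate"] for s in stems}) > 1:
        problems.append("mixed sample rates")
    for s in stems:
        if s["peak_db"] > -0.1:  # peak at ~0 dBFS suggests normalization
            problems.append(f"{s['name']} looks normalized")
    return problems
```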
Memory budget. On mobile platforms the total sound bank is usually limited to 50–80 MB. A long orchestral track in PCM is a problem. The solution: Vorbis compression at 60–70% quality (the loss is practically inaudible through device speakers) and Streaming instead of Decompress On Load for tracks longer than 10 seconds.
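The budget math is worth doing up front; a back-of-the-envelope sketch, where the ~10x Vorbis compression ratio is an assumption that varies with quality setting and material:

```python
def pcm_size_mb(duration_s, sample_rate=48_000, channels=2, bit_depth=16):
    """Uncompressed PCM size of a track, in megabytes."""
    return duration_s * sample_rate * channels * (bit_depth // 8) / 1_000_000

def fits_budget(track_durations_s, budget_mb=60.0, vorbis_ratio=0.1):
    """Rough bank-size check. vorbis_ratio is an assumed compression
    factor (on the order of 10x smaller than PCM at moderate quality)."""
    total = sum(pcm_size_mb(t) for t in track_durations_s) * vorbis_ratio
    return total <= budget_mb
```

A single minute of stereo 48 kHz / 16-bit PCM is already ~11.5 MB, which is why a full soundtrack never ships uncompressed on mobile.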
Looping music vs. cinematics. Cut-scenes need a synchronized track with a fixed duration. Here the classic approach works: the track is written against the timeline and integrated via a Unity Timeline Audio Track, with no adaptivity, only precise sync.
From Specification to Final Bank
Work starts with studying the game design document: genre, setting, target emotions, references, number of zones and states, target platforms. This determines the tooling: FMOD or Wwise, the stem count, the memory budget.
Then comes writing and arrangement in the DAW, usually over several iterations with team feedback. After approval: stem export, transition and parameter setup in FMOD/Wwise, Unity integration, and transition testing on edge states (several parameters changing simultaneously).
| Scope | Estimated Timeline |
|---|---|
| One adaptive track (2–3 states) | 1–3 weeks |
| One location soundtrack (5–7 states) | 3–5 weeks |
| Complete game soundtrack (10–20 tracks, adaptive) | 2–5 months |
| Orchestra recording + post-production + integration | by agreement |
Cost is calculated after analyzing the scope, the complexity of the adaptive system, and the integration requirements.