
How does video game music work?

Hey all! I'm starting to get really interested in scoring video games.

I have so many questions but I'd love to start by learning more about the general workflow of things from start to finish (the other questions can come along later lol!).

I'm also curious about the technical/implementation side. How do you make your music loop/transition to another track seamlessly? And what software do I use to implement the music into the game (or is that even handled by the composer)?

And finally, if you have any resources/YouTube channel recommendations for learning this type of stuff, please let me know!

Looking forward to reading all your answers! :)
 
In a nutshell:

Wwise and FMOD are the main bits of middleware (software for plugging your music into a game).
You can also do implementation directly in Unity or Unreal.

Music systems vary massively per game: some are simple loops, some are vast, semi-generative systems with thousands of components.

Look on YouTube for demonstrations of Wwise music systems from various projects.

You can download both FMOD and Wwise for free and play around with them yourself, and you can learn everything you need from that, the official documentation, YouTube, etc. There's lots of great material on dynamic and adaptive music systems out there.
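To make "adaptive" a bit more concrete, here is a minimal sketch in plain Python (not tied to Wwise or FMOD; the state names, event names, and cue filenames are all made up) of the kind of state machine that middleware lets you author visually: gameplay events drive transitions between looping cues.

```python
# Minimal sketch of an adaptive music state machine (plain Python,
# not any middleware's API). Middleware gives you this kind of logic
# as authorable graphs instead of hand-written code.

class MusicState:
    def __init__(self, name, cue, transitions):
        self.name = name
        self.cue = cue                   # audio asset this state plays (looped)
        self.transitions = transitions   # game event -> next state name

class AdaptiveMusic:
    def __init__(self, states, initial):
        self.states = {s.name: s for s in states}
        self.current = self.states[initial]

    def on_event(self, event):
        """Called by gameplay code; returns the cue to crossfade to, if any."""
        target = self.current.transitions.get(event)
        if target is None:
            return None                  # this event doesn't affect music here
        self.current = self.states[target]
        return self.current.cue

music = AdaptiveMusic(
    [
        MusicState("explore", "explore_loop.wav", {"enemy_seen": "combat"}),
        MusicState("combat", "combat_loop.wav", {"all_enemies_dead": "explore"}),
    ],
    initial="explore",
)

music.on_event("enemy_seen")   # switches the system into the combat state
```

In real projects the returned cue would be handed to the audio engine with a transition rule (crossfade, stinger, bar-aligned switch) rather than played raw.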


You don't *need* to know or use middleware, but it will get you hired over people without that ability at the lower levels, and it will make you a more effective composer and collaborator at the higher levels.
 
Thank you Richard! I've heard about Wwise and FMOD before but haven't learned much about them yet; I certainly will now!

Also, I'm wondering: with games, the development process usually takes a few years, right (depending on the complexity)? So does the composer get involved right at the beginning, or do they wait until the developers are at a certain stage to start writing?
 
Yes - some take less time, but generally a year or more. And in my experience the composer is brought in early. Not always right at the start, but early enough to talk gameplay/design and how they might inform the music system. Then demo/vertical slice (essentially a 'proof of concept' stage) to get started.
 
Thank you so much for sharing! I'll look for videos on YouTube about this so I'll have a clear idea of how to do it. Thanks again :)
 
Thanks!
 
Love this thread. Can anyone discuss more about Wwise and scoring?

There's some relevant stuff at 5:30 in this video:


Lots more on YouTube from composers who aren't me. I remember a long one about Ghost of Tsushima and Wwise.
Search for things like "Wwise dynamic music," "FMOD dynamic music," "adaptive music," or "music system." You should find lots of interesting stuff!
 
I would also suggest publishing some of your work on Unity or Unreal Engine. It may help you build a user base for your music and get some more exposure.

These are my go-to platforms for video game music exposure:
Unreal
Unity
Gamedev Market
itch.io

What's really cool with itch.io is something called a Game Jam where devs work with composers to build a functional game within a certain timeframe.

You can also head to reddit:
Unreal

These are resources you can use to better understand what's going on under the hood and to see what devs may want for their projects. Their requests aren't that dissimilar from those of showrunners/editors who want a cue that can be dropped into their projects without much hassle.

As for the technical side, the big two (Unreal and Unity) want at least 20-30 minutes of unique music cues, known as assets, put together as a cohesive package for their customers. Gamedev Market and itch.io let you sell a single song if desired, or a larger asset pack.

When constructing these packs, it's best to keep them cohesive, e.g. ambient sounds, guitar melodies/loops, drums, etc.

Finally, watermark your audio so unscrupulous individuals won't use your works without purchase or permission.

Good luck!
 
Hey all! I'm starting to get really interested in scoring video games.
Welcome! I'll try my best to answer your questions. For reference, I've been working in game audio for almost 20 years, with the first 11 of those freelancing in mostly music composition and production and the last (almost) 8 of those at Epic Games working as a technical designer.
I have so many questions but I'd love to start by learning more about the general workflow of things from start to finish (the other questions can come along later lol!).
It generally starts with a project where the need for bespoke music has been identified. Audio Leads or Directors will work with Creative stakeholders to determine a specific direction to take the music. Sometimes the nature of the project itself can drive these needs, sometimes there is room for exploration or discovery.

To be honest, contracting talent, especially key talent like a composer, is a fairly political process, and the power to do so is often wielded by only a few folks, depending on the studio hierarchy. I have been in situations where hiring a composer preceded music direction, and situations where music direction preceded hiring a composer.

In either case, a mood board or collection of example music is assembled and organized by the Audio leadership and other music stakeholders in order to communicate musical ideation to non-musically or semi-musically inclined Project leadership.

Once feedback is refined, it's safe for the composer(s) to start sketching out ideas or prototyping. Some directors like a "paint-over," where the composer writes music to a video of prototype gameplay, animatics, or storyboards/collages; other directors may just look for audio recordings they can audition. Sometimes the choice of method depends on the progress so far, as well as the linearity or non-linearity of the gameplay or cinematics.

Depending on the personalities of the various leadership involved, as well as the flexibility of the production timeline, this pre-production process may take longer or shorter. On tight deadlines, you need to just make decisions, even if it's a better-not-best type of scenario. When you're lucky, you can spend time refining the musical signature or identity, which usually brings a lot of additional value to the play experience and the game's branding.

Because games are generally non-linear experiences and because game engines support stochastic design, music design will begin communicating with the composer (if they are separate) in terms of potential interactivity and explore together how the possibility space can expand on the work done in the prepro phase to establish a musical identity for the project. Often this is coincidental with that prototype phase or slightly shifted as a follow-up.

Oftentimes, as is the case in game development, the initial idea is bigger than the budget (time and human resources), and so scope management may begin eating away at the initial design ideas until something reasonable shakes out.

On very small teams, music design and implementation are done by either the Audio Director or a designated Technical Designer.

Music designers and/or points of contact to the composer will request music to be authored to explicit specifications and formats. This is because music design dictates how the music is separated into smaller parts and this also dictates how recording will progress in cases where live players are needed.

For example, a lot of group-rate recording options out there offer separate sessions by instrument group, which is great for traditional music mixing requirements but is not necessarily how music designers on a project will need the music broken down.
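As an illustration of why that breakdown matters, here is a hedged sketch of vertical layering, one common way a music design splits a cue into stems: a game-side "intensity" value decides which delivered layers are audible. The stem names and the intensity mapping below are invented for the example, not from any real project, and real systems would smooth these gains over time rather than jump.

```python
# Sketch of vertical layering: composer-delivered, tempo-synced stems
# (hypothetical names) whose gains are driven by a gameplay intensity value.

LAYERS = ["pads", "percussion", "melody", "brass"]  # made-up stem names

def layer_gains(intensity):
    """Map intensity in [0, 1] to a gain per stem.

    Each layer fades in over its own slice of the intensity range,
    so the arrangement thickens as the action ramps up.
    """
    gains = {}
    for i, layer in enumerate(LAYERS):
        start = i / len(LAYERS)              # where this layer begins fading in
        t = (intensity - start) * len(LAYERS)
        gains[layer] = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return gains

# At full intensity every stem is audible at unity gain:
assert layer_gains(1.0) == {layer: 1.0 for layer in LAYERS}
```

Because each stem is a separate deliverable, the recording sessions have to produce clean splits along these lines, which is exactly why the music designer's spec can differ from a traditional mix-stem breakdown.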

As soon as possible, Music Designers will be putting music into the game (if they know what they're doing), sometimes even non-interactive or with very limited interactivity, just so people can start hearing what it will sound like.

After the prototyping is done and it's time to author real-boy cues, using synthetic/virtual instruments is often requested so the music designers can work on the necessary implementation pathways while, simultaneously, composers and their music prep team start making arrangements for live recording.

After that, the temp versions can be swapped out for the live mixed versions.

Then the Audio Director and the Music Designers will refine and debug the music system and the game mix. The same composer may be called upon for commercial media or promotional work, and so there may be cues that are going to marketing or whomever needs them to promote the game, etc.

I'm also curious about the technical/implementation side. How do you make your music loop/transition to another track seamlessly? And what software do I use to implement the music into the game (or is that even handled by the composer)?
Unless you're a particularly technically inclined composer, implementing in the game is often not the composer's purview. It can be, but then you would basically replace "music designer" in my previous answer with "composer," so it doesn't necessarily change the process. I have been on small gigs where I both composed for interactivity and handed it off to a programmer to implement, AND composed music that I implemented directly into the engine myself.
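On the "seamless transition" part of the question specifically, one common trick (which middleware automates once you mark up a cue's tempo and meter) is to quantize the switch to the next bar line rather than cutting immediately, so the incoming cue enters on the musical grid. A small sketch, with made-up numbers:

```python
import math

# Sketch of bar-quantized transitions: instead of switching tracks the
# instant a game event fires, schedule the switch for the next bar line.

def next_bar_time(now, bpm, beats_per_bar=4, start=0.0):
    """Return the time (in seconds) of the next bar line at or after `now`,
    for a cue that started at `start` with the given tempo and meter."""
    seconds_per_bar = beats_per_bar * 60.0 / bpm
    bars_elapsed = (now - start) / seconds_per_bar
    return start + math.ceil(bars_elapsed) * seconds_per_bar

# At 120 BPM in 4/4 a bar lasts 2.0 s, so an event at t = 3.1 s
# defers the track switch until t = 4.0 s:
next_bar_time(3.1, 120)   # -> 4.0
```

Loops work on the same principle: if the audio file's length is an exact number of bars at the authored tempo (and the reverb/release tail is baked into the start or handled as an overlap), the playhead can jump back to the loop point without a click or a rhythmic stumble.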

One thing to note: a composer is, except in rare cases, outsourced, while music designers are, except in rare cases, in-house.

On LEGO Fortnite, I was acting as music designer. I programmed the initial music management system in C++ (which controlled the music's general playback behavior), worked with the composers and the Audio Director on delivery requirements, explored the possibility space for both initial releases and future ones, and also designed the music in the engine tech (MetaSounds), which controlled the music's specific behavior.

Skills generally used by technical music designers:
  • Music theory
  • Music math (intervallic, rhythmic, as well as modal math manipulations)
  • DAW proficiency (Music editing skills, seamless loops included)
  • Middleware proficiency is often requested or useful, though a lot of companies have their own audio engines or systems that you'd be expected to learn
  • Computer Programming/Scripting (Understanding Computer Logic)
    • C++, C# (Unity), Blueprints (Unreal) (proficiency in any Object Oriented scripting language will be translatable to whatever the company uses)
  • Organization skills, time-keeping/production skills, etc.
  • Proficiency in a graph design environment is appealing:
    • Reaktor, Max/MSP, Pure Data, Kyma, etc. (MetaSounds is becoming more common)
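As a toy example of the "music math" item in the list above, here is a sketch of diatonic transposition: shifting a melody by scale degrees rather than raw semitones, which keeps procedural variations inside the project's chosen mode. The scale table and function are illustrative only, not any engine's API.

```python
# Diatonic (in-mode) transposition: shift by scale degrees, not semitones,
# so generated variations stay inside the chosen scale.

MAJOR = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets of the major scale

def diatonic_transpose(degrees, shift):
    """Shift a melody (given as 0-based scale degrees) up or down within
    the scale, returning semitone offsets from the tonic."""
    out = []
    for d in degrees:
        octave, step = divmod(d + shift, 7)   # wrap degree into one octave
        out.append(MAJOR[step] + 12 * octave)
    return out

# A tonic triad (degrees 0, 2, 4) moved up one step becomes degrees 1, 3, 5:
diatonic_transpose([0, 2, 4], 1)   # -> [2, 5, 9] (D, F, A in C major)
```

The same idea extends to modal math (swap in a different scale table) and rhythmic math (shifting onsets on a beat grid instead of degrees on a scale).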
And finally, if you have any resources/YouTube channel recommendations for learning this type of stuff, please let me know!

Looking forward to reading all your answers! :)
Technical stuff moves pretty fast; books tend to be outdated by the time they're popular, so they can be a poor resource for technical proficiency.

Check out the Game Developers Conference (GDC) in San Francisco and GameSoundCon in LA/Burbank every year. Those are good places for a lot of learning.

There are lots of YouTube channels and books out there as resources.

As a note, some people conflate procedural design with generative AI. Procedural design here just means stochastic design: a good procedural music design adheres to the composer's rule set for the project's musical identity and simply accelerates the workflow of expanding how useful cues can be in different game contexts.
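As a tiny illustration of stochastic-but-rule-bound design, here is a sketch of weighted random cue selection with a no-immediate-repeat rule; the cue names and weights are made up. The point is that the randomness runs inside constraints the composer can sign off on.

```python
import random

# Stochastic cue selection under a composer-approved rule set:
# weighted random choice that never plays the same cue twice in a row.

def pick_cue(cues, weights, last=None, rng=random):
    """Choose a cue at random by weight, excluding the previous pick.
    Assumes at least two cues are available."""
    pool = [(c, w) for c, w in zip(cues, weights) if c != last]
    names, ws = zip(*pool)
    return rng.choices(names, weights=ws, k=1)[0]

history = []
last = None
for _ in range(5):
    last = pick_cue(["stinger_a", "stinger_b", "stinger_c"], [3, 2, 1], last)
    history.append(last)
# `history` now holds 5 cues with no two identical picks in a row.
```

Real systems layer on more rules (cooldowns per cue, context filters, key/tempo matching), but they are all variations on constraining a random draw.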

Keep writing music, pursue writing great music, keep learning about today's technology and tomorrow's technologies and KEEP PLAYING GAMES.

Be a gamer, be a game developer.

- Dan
 
Hi Dan, my apologies for the late reply. 🙏🏼

Wow, thank you so much for your incredibly detailed explanation! This really helps me understand how things work in video games. Composing for games is something I'd like to know more about; I know the workflow of film scoring pretty well but haven't learned much about games. Your post helped me a ton! Thank you again for taking the time to chime in here, and have a great rest of the week :)
 
I came across this video, an interview with Steven Melin, a media composer of 15 years who specializes in video games.



He also runs the Video Game Music Alliance (VGMA) platform, where you can enroll in his courses, chat with AAA game composers, get sample library discounts, and more.

I'm not sure how legitimate his courses are; they seem very thorough but pretty expensive.

I'd be glad if anyone who has enrolled in his courses could share their experience, because they seem to be dedicated video game music courses of a kind you don't find anywhere else.

His VGMA: https://www.videogamemusicalliance.com/
His Channel: https://www.youtube.com/@StevenMelin
 
Will Roget (Helldivers 2, COD WW2, Star Wars games) has a couple of talks where he goes over his composing approach. Really useful stuff. The audio isn't great on the first one, but it's still a good talk.



That one is more high-level, but in this one he gives a more detailed breakdown of production techniques, mixing/reverb, using samples vs. live players, etc.:

 