Hi, I've always wondered what music software was used to make the music for, say, the GBA or Nintendo DS, maybe even the 3DS now. Logically it has to be some sort of synthesizer, but what I'm wondering is: is it one that is commercially available, or is it Nintendo's property and part of their SDK or something? Also, what kind of commercial synthesizer would get pretty close to the sound heard in those GBA, DS and 3DS games? And as an extra question: how do they integrate the music created with that software into the ROM (the game)? Do they convert it to some format or something? Thanks
Generally it's all made externally, then imported as binary into a C file and compiled in that way. There is no set method; they all have custom-built engines, and you could make your own MP3 engine for the GBA, for example. First-party Nintendo games will probably use Nintendo's own specific engine, which would be different from what the various third-party companies use. The GameCube had the MusyX SDK, which was optional.
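To make the "imported as binary into a C file" part concrete, here's a minimal sketch of what that usually looks like. The names (`jingle_pcm`, the sample values) are made up for illustration; in practice a conversion tool (something like `bin2c`) spits out an array like this from the converted audio file, and the sound driver streams it to the hardware.

```c
/* Hypothetical output of an audio-to-C conversion step: the converted
 * sound data lives in the ROM as a plain byte array. Names and sample
 * values here are illustrative, not from any real tool or game. */
#include <stddef.h>

const unsigned char jingle_pcm[] = {
    0x80, 0x7F, 0x81, 0x7E, 0x83, 0x7C, /* ...the rest of the samples... */
};
const size_t jingle_pcm_len = sizeof(jingle_pcm);

/* At runtime the sound driver would point the DMA / audio FIFO at
 * jingle_pcm and play jingle_pcm_len bytes at the conversion rate. */
```

The point is that once the data is a C array, it gets compiled and linked into the ROM like any other constant, so "integrating the music" is just a build step.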
Very interesting. How did the MusyX SDK work? You eventually have to use some sort of instruments or something; I don't think you can use an SDK as a programming tool to make music by itself, so there has to be something more to it. Do you know how they worked with MusyX? Is there any documentation for this? Thanks
MusyX had a tracker and a console for external MIDI instruments, if I remember correctly.

If you mean how the audio is actually created: depending on the level of detail, the song is either composed and then arranged for synth instruments through MIDI or a sampled tracker format, or just arranged from existing bits and pieces. Then, depending on the sophistication of the sound driver, it is converted into a tracker format, PCM or some other digital audio, or hardcoded into the sound code (hopefully the latter is hardly ever done).

Sampled tracker formats are by far my favorite for game audio, because you can slip things into the stream to drastically change the audio without it having to be pre-arranged in a digital stream. That may not make much sense, but what I mean is that every instrument or channel is actively "rendered", so to speak, which makes alterations simpler: you have individual control over each channel instead of everything being pressed into a digital file with no access to the individual streams. I hope all of that makes sense, but that is video game audio as I understand it.

(Oh, there are also differences depending on whether you have a software or a hardware synth. Older systems often had a hardware synth, like the Mega Drive and its YM2612. I can't say for sure, but I assume modern systems don't have such chips, and all synthesis is handled in software and then pressed into PCM for digital output.)