Ape’s Journey

Game

For our 3D Game Audio exam at Sonic College, we were tasked with creating all of the audio for a short game as a group of three. We started from scratch: we integrated Wwise into Unity, designed all of the ambience, sound effects, and music, and then implemented it using C# and Wwise. We also had to use object-based audio and mix the entire game in 7.1.4 surround sound in a certified Dolby Atmos studio. My main areas of responsibility were ambience sound design and audio implementation, so I spent most of my time in Wwise and Unity, setting up containers, creating events, and triggering them in-game. Sound effects and music by Alexander Vollmers-Hansen and Jonas Elliot Bedsted.

My ambience sound design process

  • I wanted the ambience to feel dynamic, so I split it into two parts: a wind ambience bed as the base, and more specific ambient sounds positioned around the level for detail.
  • For the ambience bed, I opted for ambisonic wind tracks to give the player a 360-degree soundscape to turn around in, rather than a static stereo bed.
  • To add more detail, I placed ambient event emitters around the level and had them play more specific ambient sounds, such as birds chirping in the trees, frogs croaking on lily pads in the swamp, and cicadas in the bushes. The sounds sit in random containers so that they vary every time they play.
  • When the level transitions to nighttime, the birds are replaced by owls and some of the swamp animals go quiet, creating a shift in ambience that simulates different animals’ hours of activity. The ambisonic bed is also replaced by a nighttime variation.
  • For the player’s home I didn’t want to fake an interior ambience, so instead of playing an interior ambience track, a separate event triggers upon entering the house. This event ducks and muffles the outdoor ambience to simulate occlusion.
  • For the bridge in the swamp, we wanted to emphasize how old and ragged it is. I created an ambient loop of wood creaking, pitched it down to make it sound larger, and widened it with a stereo imager. The loop is triggered whenever the player stands on the bridge (see the sketch after this list).
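
Below is a minimal sketch of how such an ambient trigger can be wired up in Unity with the Wwise integration. The component and event names are placeholders rather than the project’s actual ones, and the same pattern applies to the event that ducks and muffles the ambience when entering the house.

```csharp
using UnityEngine;

// Hypothetical trigger volume that starts a looping ambient event (e.g. the
// bridge creak) when the player enters it and stops the loop when they leave.
// Event names are placeholders, not the project's actual names.
public class AmbientZoneTrigger : MonoBehaviour
{
    [SerializeField] private string playEventName = "Play_Bridge_Creak";
    [SerializeField] private string stopEventName = "Stop_Bridge_Creak";

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Post the looping creak on this emitter object.
            AkSoundEngine.PostEvent(playEventName, gameObject);
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            AkSoundEngine.PostEvent(stopEventName, gameObject);
        }
    }
}
```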

My implementation process

  • For the game’s music system, we created one overarching music container that plays throughout the whole game. This container has two states, exploration and combat. When the player enters combat, we play an event that switches the music state to combat; when combat is over, it switches back to the exploration state (a sketch of such a state trigger follows this list).
  • Early on we decided to build a dynamic combat music system that blends different music layers in and out depending on the player’s health. We did this by creating a health RTPC and automating each music layer’s volume against the value of the health parameter (also sketched below). For example, the drums increase in intensity at 75% and again at 50% health, and intense staccato strings enter at 50% health.
  • Another example of RTPCs in this project is the music playing in the player’s home. We built a dynamic music system for it as well, similar to the combat music, but instead of following the player’s health, the music follows the distance between the player and the house. This value is calculated by taking the Vector3 distance between the player’s transform and the transform of an invisible cube in the center of the house (sketched below). This lets certain music layers play only inside and fade out upon exiting the house, while other layers continue playing. The remaining layers then slowly fade out towards the maximum range of 30 units, which matches the player walking through the gate into the forest.
  • For the exploration music, we created a state for each area in the game: home, forest, forest night, swamp, swamp night, and cave. At the entrance to each area we placed an invisible trigger box that switches the music state to the corresponding area, and the system smoothly crossfades from the old music track to the new one. We went with this technique because the linear level design allows for it.
  • We created a dynamic footstep system that casts a ray down from the player’s feet and reads the terrain texture the player is standing on. The result goes into a switch statement with a case for each terrain texture (grass, dirt, rocks, gravel) that sets the Wwise footstep switch container accordingly (sketched below). Object surfaces (wood, planks, stone, carpet) go through a separate switch. For the carpet specifically, we wanted it to feel a bit more dynamic and mix in the floor beneath it, so stepping on the carpet plays a 75/25 mix of the carpet and the underlying floor.
  • For interior ambience, I used Wwise’s Rooms & Portals feature to assign a unique reverb bus to each interior room, and to play ambient beds that can be heard through open doors via portals and occlusion.
  • In the lava cave, I used AkAmbient’s Large Mode to position multiple lava ambience emitters around the room. This gives it a much wider and more dynamic feel, with the ambience changing as the player moves through the room.
  • For optimization, I used AkAmbient’s MultiPosition Mode for objects such as torches and campfires, instead of playing a unique event for every torch. All the torches then share a single voice instead of using one voice each.
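
Below are a few hedged sketches of how the systems described above can be wired up with the Wwise Unity integration; all class, state, switch, and RTPC names are illustrative placeholders rather than the project’s actual ones. First, a trigger box that switches the music state, as used both for the combat transition and for the per-area exploration states:

```csharp
using UnityEngine;

// Hypothetical area trigger that switches the Wwise music state when the
// player walks through it. The state group and state names are placeholders.
public class MusicStateTrigger : MonoBehaviour
{
    [SerializeField] private string stateGroup = "MusicState";
    [SerializeField] private string targetState = "Forest"; // e.g. Combat, Swamp, Cave...

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Wwise handles the crossfade from the old music state to the new one.
            AkSoundEngine.SetState(stateGroup, targetState);
        }
    }
}
```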
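Next, a sketch of the health RTPC that drives the combat music layers; the layer volume curves themselves live in Wwise:

```csharp
// Hypothetical health hook: whenever the player's health changes, the value is
// pushed to a Wwise RTPC (here called "PlayerHealth") as a 0-100 percentage.
// Each combat music layer's volume is automated against this RTPC in Wwise.
public static class CombatMusicRtpc
{
    public static void OnHealthChanged(float currentHealth, float maxHealth)
    {
        float healthPercent = (currentHealth / maxHealth) * 100f;
        AkSoundEngine.SetRTPCValue("PlayerHealth", healthPercent);
    }
}
```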
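The house music distance parameter can be fed in a similar way, assuming an invisible cube marks the center of the house:

```csharp
using UnityEngine;

// Hypothetical distance driver for the house music: each frame it measures the
// distance between the player and an invisible cube in the center of the house
// and feeds it to a Wwise RTPC. The 30-unit max range follows the description
// above; the RTPC name itself is a placeholder.
public class HouseMusicDistance : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private Transform houseCenter; // invisible cube in the house
    [SerializeField] private float maxRange = 30f;

    private void Update()
    {
        float distance = Vector3.Distance(player.position, houseCenter.position);
        // Clamp so the RTPC stays inside the range mapped in Wwise.
        AkSoundEngine.SetRTPCValue("DistanceToHouse", Mathf.Min(distance, maxRange));
    }
}
```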
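Finally, a sketch of the footstep surface detection: the dominant terrain splatmap layer under the player is mapped to a Wwise switch, while object surfaces would be routed through a separate switch as described above.

```csharp
using UnityEngine;

// Hypothetical footstep surface detection: a ray is cast down from the player's
// feet, and if it hits the Unity terrain, the dominant splatmap texture under
// the player is mapped to a Wwise switch. Switch group, switch names, and the
// layer order are placeholders.
public class FootstepSurface : MonoBehaviour
{
    [SerializeField] private string switchGroup = "Footstep_Surface";
    // Assumed to match the order of the terrain's paint layers.
    [SerializeField] private string[] terrainSwitches = { "Grass", "Dirt", "Rocks", "Gravel" };

    // Called from the footstep animation event before the footstep event is posted.
    public void UpdateSurface()
    {
        if (!Physics.Raycast(transform.position + Vector3.up * 0.1f, Vector3.down,
                             out RaycastHit hit, 2f))
            return;

        Terrain terrain = hit.collider.GetComponent<Terrain>();
        if (terrain == null)
            return; // Object surfaces (wood, stone, carpet...) use a separate switch.

        int layer = DominantTextureIndex(terrain, hit.point);
        if (layer < terrainSwitches.Length)
            AkSoundEngine.SetSwitch(switchGroup, terrainSwitches[layer], gameObject);
    }

    // Returns the index of the strongest splatmap layer at a world position.
    private static int DominantTextureIndex(Terrain terrain, Vector3 worldPos)
    {
        TerrainData data = terrain.terrainData;
        Vector3 local = worldPos - terrain.transform.position;
        int x = Mathf.Clamp(Mathf.RoundToInt(local.x / data.size.x * data.alphamapWidth), 0, data.alphamapWidth - 1);
        int z = Mathf.Clamp(Mathf.RoundToInt(local.z / data.size.z * data.alphamapHeight), 0, data.alphamapHeight - 1);

        float[,,] weights = data.GetAlphamaps(x, z, 1, 1);
        int best = 0;
        for (int i = 1; i < weights.GetLength(2); i++)
            if (weights[0, 0, i] > weights[0, 0, best])
                best = i;
        return best;
    }
}
```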

What I did

  • Audio implementation
  • Ambience sound design
  • Additional sound effects

What I used

  • FL Studio
  • Wwise
  • Unity