This week we learnt about adaptive sound design and how to use it within games. We looked at how music is developed for games using loops that repeat continuously in the background, and at how these loops are implemented via vertical development and horizontal development, both of which are described below. We looked into examples such as *Journey*’s use of an RTPC (Real-Time Parameter Control) with vertical development to generate music that adapts as the player moves towards the correct place; using music in this way gives the player positive feedback that they are heading the right way, without the game needing to say so explicitly.
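To make the vertical idea concrete, here is a minimal Python sketch of layered stems faded in and out by a single parameter, similar in spirit to an RTPC. The class, layer names, thresholds, and the `proximity` parameter are all illustrative assumptions, not *Journey*’s actual implementation:

```python
# Minimal sketch of vertical development: several synchronised stems play
# together, and one parameter (akin to an RTPC) fades layers in and out.
# MusicLayer, the thresholds, and "proximity" are hypothetical examples.

class MusicLayer:
    """One synchronised stem of the track, e.g. pads, percussion, melody."""
    def __init__(self, name: str, threshold: float):
        self.name = name
        self.threshold = threshold  # parameter value at which this layer becomes audible
        self.volume = 0.0

def update_layers(layers: list[MusicLayer], proximity: float) -> None:
    """Fade each layer towards audible once the 0..1 parameter passes its threshold."""
    for layer in layers:
        target = 1.0 if proximity >= layer.threshold else 0.0
        # Smooth towards the target so layers fade rather than snap on/off.
        layer.volume += (target - layer.volume) * 0.1

layers = [MusicLayer("pads", 0.0), MusicLayer("percussion", 0.4), MusicLayer("melody", 0.8)]
update_layers(layers, proximity=0.5)  # pads and percussion fade in; melody stays silent
```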
A good example of a game that uses horizontal development is *Skyrim*, which selects from multiple area-specific loops as the player walks around the world (Evans, R. (2019) *Analyzing and Designing Dynamic Music Systems for Games*. PhD thesis, University of York).
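As a rough illustration, a horizontal system can be sketched as picking the next whole loop from a pool that matches the player’s current area, so the music re-sequences itself as the player explores. The area and file names below are made up for the example:

```python
import random

# Minimal sketch of horizontal development: at each loop boundary the next
# loop is chosen from the pool matching the player's current area.
# Area names and file names are hypothetical placeholders.

AREA_LOOPS = {
    "tundra": ["tundra_a.ogg", "tundra_b.ogg"],
    "forest": ["forest_a.ogg", "forest_b.ogg", "forest_c.ogg"],
}

def next_loop(current_area: str) -> str:
    """Pick the next loop for the area; called once each time a loop ends."""
    candidates = AREA_LOOPS.get(current_area, ["ambient.ogg"])
    return random.choice(candidates)

print(next_loop("forest"))  # one of the three forest loops
```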
We also looked into stingers: short pieces of music that play once whenever a specific event occurs. I will aim to include stingers for events like picking up fuel or visiting space stations, and I will definitely need them for key interactions like installing a new module or the player dying.
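As a sketch of how I might wire these up, the mapping below ties game events to one-shot stinger clips; the event names and file paths are placeholders, and `play_one_shot()` is a stand-in for whatever playback call the engine or middleware actually provides:

```python
# Minimal sketch of event-driven stingers: each gameplay event maps to a
# short one-shot clip played over the underlying music loop.

STINGERS = {
    "fuel_pickup":      "stingers/fuel.ogg",
    "station_visited":  "stingers/station.ogg",
    "module_installed": "stingers/module.ogg",
    "player_died":      "stingers/death.ogg",
}

def play_one_shot(clip: str) -> None:
    """Stand-in for the engine's one-shot playback call."""
    print(f"playing stinger: {clip}")

def on_game_event(event: str) -> None:
    clip = STINGERS.get(event)
    if clip is not None:
        play_one_shot(clip)  # plays once; does not loop

on_game_event("fuel_pickup")
```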
I aim to use horizontal development in my game, as it will let me create a constantly changing track that keeps a consistent beat whilst still varying between loops. The intention is to create a unique track that adapts closely to player behaviour, such as when they succeed in difficult scenarios or reach specific heights. A horizontal system also lets the game transition cleanly between music for space and music for the planet surface, as sketched below.
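A minimal sketch of how such a transition could stay on the beat, assuming an illustrative 120 BPM, 4/4 track: rather than cutting over the moment the rocket crosses into space, the loop switch is scheduled for the next bar boundary:

```python
# Minimal sketch of a beat-synced horizontal transition. BPM and time
# signature are illustrative assumptions.

BPM = 120
BEATS_PER_BAR = 4
SECONDS_PER_BAR = BEATS_PER_BAR * 60.0 / BPM  # 2.0 s per bar at 120 BPM

def next_bar_boundary(song_time: float) -> float:
    """Time (in seconds) of the next bar boundary after song_time."""
    bars_elapsed = int(song_time // SECONDS_PER_BAR)
    return (bars_elapsed + 1) * SECONDS_PER_BAR

# e.g. the player reaches space 9.3 s into the track:
switch_at = next_bar_boundary(9.3)  # -> 10.0; start the space loop here
```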
We learnt how to include sound effects in our designs in two forms: loops, for longer repeating sounds like alarms or breathing, and one-shots, for sounds played a single time, like footsteps or reloads. When implementing sound effects, it is important not to play the exact same sound over and over again; this is known as “machine-gunning” and quickly becomes annoying. Instead, it is better to have a selection of sound effects for the interaction, each slightly different from the others, and let the game randomly select one whenever the interaction happens. This is easy to do in Wwise, an audio middleware tool that prepares and plays audio for games, using a Random Container.
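The sketch below mimics what a Random Container does conceptually: pick a random variant while avoiding an immediate repeat, with a small pitch offset for extra variety. The file names and randomisation ranges are placeholders, not Wwise’s actual API:

```python
import random

# Conceptual sketch of random-container behaviour for a repeated sound.

class RandomContainer:
    def __init__(self, variants: list[str]):
        self.variants = variants
        self.last = None  # previously played variant, to avoid "machine-gunning"

    def pick(self) -> tuple[str, float]:
        """Return a variant plus a small random pitch offset for extra variety."""
        choices = [v for v in self.variants if v != self.last] or self.variants
        clip = random.choice(choices)
        self.last = clip
        pitch = random.uniform(0.95, 1.05)  # +/-5% pitch shift
        return clip, pitch

footsteps = RandomContainer(["step_1.wav", "step_2.wav", "step_3.wav"])
clip, pitch = footsteps.pick()  # never the same clip twice in a row
```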
This week I also began to design the asteroid patterns discussed in the original game project proposal. I initially found it difficult to work out how to visualise the patterns on paper; however, after reviewing adaptive music I knew that I wanted the asteroids to enter the screen in time with a musical beat. I then developed the visualisation diagram that can be found on the linked “Asteroid Patterns” page, and went on to design all of the basic patterns and half of the simple ones this week. I plan to finish designing most of the others next week and to begin implementing and playtesting them.
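As a sketch of the implementation I have in mind, a pattern can be stored as (beat, lane) pairs and converted to spawn times from the track’s BPM. The pattern data, BPM, and `spawn_asteroid()` are hypothetical stand-ins for the real game code:

```python
# Minimal sketch of spawning asteroids in time with the music.

BPM = 120
SECONDS_PER_BEAT = 60.0 / BPM

# One "basic" pattern: (beat index, screen lane) pairs.
PATTERN = [(0, 2), (1, 2), (2, 4), (3, 1)]

def spawn_asteroid(lane: int) -> None:
    """Stand-in for the game's actual spawn call."""
    print(f"asteroid in lane {lane}")

def update(song_time: float, spawned: set[int]) -> None:
    """Call every frame; spawns any pattern entries whose beat time has passed."""
    for i, (beat, lane) in enumerate(PATTERN):
        if i not in spawned and song_time >= beat * SECONDS_PER_BEAT:
            spawn_asteroid(lane)
            spawned.add(i)

spawned: set[int] = set()
update(0.0, spawned)  # spawns the beat-0 asteroid immediately
```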
Finally, I reduced the size of the asteroids within the game, which made them more fun to avoid. I also changed the rocket’s collider to a polygon collider, as I was getting a bug where asteroids registered collisions when they were not actually touching the triangle. You can see a video below with these changes in place, along with music to give an idea of the mood and feel of the game: