Building Soundsearcher

Starting with nothing for Ludum Dare 45

Last month I participated in Ludum Dare 45. Over its 48 arduous hours there were many challenges to overcome and insights to gain as I built a game completely from scratch.

As an accessibility advocate with a background in digital signal processing, I wanted to explore how the Web Audio API could be leveraged to build spatial audio experiences. The result was Soundsearcher, a minimalist exploration game designed for playing with your eyes closed.

Starting with nothing

Game jams like Ludum Dare are primarily exercises in time management.

It was early evening when its 45th theme—start with nothing—was announced. The concept for an audio-only exploration game was my immediate reaction. As an added challenge I decided I’d also start with nothing: no framework, tools, or base code—vanilla JavaScript only.

The first eight hours were dedicated to building its underlying engine:

  • User interface. Knowing that Soundsearcher needed touch and mouse controls, starting with the user interface felt logical. Its design was my first production use of conic-gradient(), despite the function’s limited browser support.
  • Controls. The controls module provides a unified API for multiple control schemes. It listens for mouse, touch, and keyboard events and maps them to a generic state. For example: pressing W, the up arrow, or the forward button would each toggle its moveForward state.
  • Movement. The position module is the closest thing Soundsearcher has to a main player object. It maintains the current coordinates, heading, and vector of the player. Acceleration or deceleration are applied to its vector each frame based on the state of the controls.
  • Audio system. The audio module is a small wrapper around the main AudioContext with a few utility methods. When initialized, an inner AudioBuffer is filled with several seconds of random samples; this provides a performant and reusable source for all white noise.
  • Objects. Every sound source in the game world is an object. Prototypal inheritance was leveraged to create a base object that all others would extend with a mixin. To create the spatial audio system, each object has a StereoPannerNode and GainNode that update in relation to the player position—the most trigonometry I’ve done in years.
  • Main loop. Once the game begins, its main loop is executed recursively via requestAnimationFrame. Each iteration updates the position module with the latest controls state and then each object with the new player position.
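
The white-noise trick above can be sketched as a pure function. The names here are illustrative, not the engine’s actual API; in the browser, the resulting samples would back an AudioBuffer played through a looping AudioBufferSourceNode:

```javascript
// Sketch of the audio module's reusable white-noise source.
// The buffer-filling logic is shown as a pure function so it can
// run anywhere; names and defaults are illustrative assumptions.
function createNoiseSamples(seconds, sampleRate = 44100) {
  const samples = new Float32Array(Math.floor(seconds * sampleRate));
  for (let i = 0; i < samples.length; i++) {
    // Random samples uniformly distributed in [-1, 1).
    samples[i] = Math.random() * 2 - 1;
  }
  return samples;
}

// In a browser, wiring it up might look like:
//   const buffer = ctx.createBuffer(1, samples.length, ctx.sampleRate);
//   buffer.copyToChannel(samples, 0);
//   const source = ctx.createBufferSource();
//   source.buffer = buffer;
//   source.loop = true;
```

Because every white-noise consumer shares one buffer, the random samples are generated once rather than per object per frame.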

To close the night, I created an example object with an OscillatorNode and placed it in the world. I was delighted to finally approach it, hearing it grow louder as it neared, and transition from ear to ear as I circled it.

Giving players purpose

The following morning I awoke to an alarm and sprang back to work. With input controls, a world to explore, and an audio engine to perceive it all, any game could fill this blank canvas. Soundsearcher needed meaningful and enjoyable items for players to collect and use to orient and immerse themselves in the world.

The next twelve hours were dedicated to:

  • Collectibles. A subset of objects have a truthy isCollectible property which indicates they can be picked up. When they’re within a certain pickupRadius from the player, collectibles are marked as isCollected and have their onPickup() method called to transition their behavior from the world to an inventory state.
  • Collectible spawner. The pickups module manages when and where collectibles are spawned in the world. Getting this right was integral to the overall experience, so special care was given to ensuring they always spawn in the general direction the player is moving, and to respawning any the player leaves too far behind.
  • Compass. It was important to help orient players and return keen listeners back to previously visited areas. The compass collectible leverages the white noise buffer, some trigonometry, and a BiquadFilterNode to sweep through its frequencies and be strongest when players face north.
  • Chord system. The chord module was created to maximize consonance between proximate objects and increase the variety between areas. It divides space into a grid and assigns each cell to one of four musical chords with a unique inversion. Its key is randomized each play.
  • Resonators. The first applications of the chord module were three collectibles that change pitch as the player moves through space. When heard together, they form the root, third, and fifth of the current chord. To help separate their tones, the third and fifth are panned to a cardinal direction like the compass.
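
A chord module along the lines described above might be sketched like this; the cell size, chord shapes, and hashing scheme are my assumptions, not the game’s actual code:

```javascript
// Sketch of a chord module: space is divided into a grid and each
// cell deterministically maps to one of four chords. Cell size,
// voicings, and the hash are illustrative assumptions.
const CELL_SIZE = 100;
const CHORDS = [
  [0, 4, 7], // illustrative voicings, as semitone offsets
  [0, 3, 7],
  [0, 5, 9],
  [0, 3, 8],
];

function chordForCell(x, y, key = 0) {
  const col = Math.floor(x / CELL_SIZE);
  const row = Math.floor(y / CELL_SIZE);
  // Hash the cell coordinates to pick a chord deterministically.
  const index = Math.abs(col * 31 + row * 17 + key) % CHORDS.length;
  // Convert semitone offsets to frequencies relative to A440.
  return CHORDS[index].map((semis) => 440 * Math.pow(2, semis / 12));
}

// A getRandomNote-style helper returns one consonant frequency
// for any location.
function getRandomNote(x, y) {
  const chord = chordForCell(x, y);
  return chord[Math.floor(Math.random() * chord.length)];
}
```

Because the mapping is deterministic per cell, any object that asks for a note at the same location lands in the same chord, which is what keeps proximate objects consonant.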

At the end of this sprint I lost nearly three hours—arguably my longest distraction—to an issue with the spatial audio system. Specifically, objects’ pan values weren’t wrapping as expected when scaling from (0, TAU) to (-1, 1), causing them to suddenly jump from left to right. With pen and paper I visualized and understood the issue, learning more about linear transformation in the process.
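
The discontinuity is easy to see in a sketch. One common fix, not necessarily the one used in the game, is to take the sine of the relative angle instead of scaling (0, TAU) linearly onto (-1, 1); sine is continuous across the wrap, so an object passing behind the player glides between ears instead of jumping:

```javascript
const TAU = Math.PI * 2;

// Continuous mapping: pan follows sin of the relative angle
// (object bearing minus player heading, in radians).
function angleToPan(relativeAngle) {
  return Math.sin(relativeAngle);
}

// The naive linear mapping that causes the jump: as the angle
// crosses the wrap point, pan snaps from +1 back to -1.
function naivePan(relativeAngle) {
  const wrapped = ((relativeAngle % TAU) + TAU) % TAU; // into [0, TAU)
  return (wrapped / TAU) * 2 - 1;
}
```

Plotting both makes the bug obvious: angleToPan is smooth everywhere, while naivePan has a cliff at every multiple of TAU.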

Creating an atmosphere

The next alarm marked the final sprint. Now that it had a framework for success, Soundsearcher needed ambient sounds and creatures to give it life and fill the liminal space between collectibles—and some polish.

The final eight hours were dedicated to:

  • Ambient spawner. The ambients module leverages the same grid system driving the chord module to spawn noninteractive objects into the world. As players move into unexplored cells, objects are spawned within a cone ahead of them to ensure their sounds fade in naturally.
  • Waterfalls and wind. The white noise buffer inside the audio module offered many opportunities for delight. The waterfalls were its simplest application: a static point source with a random intensity and lowpass frequency. Implementing wind took this a step further by ramping a lowpass filter to random frequencies over random durations.
  • Buggers. The insects are akin to an audio particle system. Each frame their position is summed with a randomly changing vector, moving them erratically through space. Their distinct buzz is achieved by detuning their oscillators each frame, effectively modulating their frequency with a random sample and hold source.
  • Tweeters. Bird songs were added to lighten the mood and inject music into the world. They generate and sing random sequences of notes with pauses between. To simplify sequence generation, the chord module was extended with a getRandomNote(x, y) method to retrieve a consonant frequency for any location.
  • Footsteps. To complement the theme, footsteps were implemented as the first collectible object. This was another use case for the white noise buffer. Their percussive sound is achieved with a lowpass filter; an envelope is applied to its cutoff each time the player moves one unit.
  • Shiny things. Highlighting interactive objects is a common user interface trope in video games. A distinct repeating sound was created to help players differentiate certain collectibles from other ambient objects and encourage them to investigate.
  • Documentation. Good documentation provides enough context and instruction for a smooth onboarding process. For Soundsearcher, it was imperative to convey to players its experimental nature, how to play, and its objectives.
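
The buggers’ random walk is simple enough to sketch as a pure update function. The structure and constants here are illustrative assumptions, not the game’s code:

```javascript
// Sketch of a bugger-style random walk: each frame the position is
// summed with a velocity that itself drifts randomly, producing
// erratic but continuous motion. Names and constants are assumptions.
function createBugger(x, y) {
  return { x, y, vx: 0, vy: 0 };
}

function updateBugger(bug, maxSpeed = 2, jitter = 0.5) {
  // Nudge the velocity by a random amount each frame...
  bug.vx += (Math.random() * 2 - 1) * jitter;
  bug.vy += (Math.random() * 2 - 1) * jitter;
  // ...clamp its magnitude so the insect never darts too fast...
  const speed = Math.hypot(bug.vx, bug.vy);
  if (speed > maxSpeed) {
    bug.vx *= maxSpeed / speed;
    bug.vy *= maxSpeed / speed;
  }
  // ...then integrate the velocity into the position.
  bug.x += bug.vx;
  bug.y += bug.vy;
  return bug;
}

// The buzz itself would come from nudging an OscillatorNode's
// detune AudioParam by a random amount each frame, which behaves
// like a random sample-and-hold modulator on its frequency.
```

Clamping the velocity rather than the per-frame nudge is what keeps the motion smooth: direction changes constantly, but speed never spikes.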

Pencils down. It was finally time to submit and breathe.

Realizing its potential

During the voting phase, I collected feedback from players and implemented changes within a post-compo branch of the code. Their thoughts and suggestions were revealing and helped me refine the experience further:

  • Memory optimization. As players traveled farther from the starting area, the audio engine would grind to a halt. This was because objects’ nodes stayed connected even when they were imperceptibly quiet. Culling was implemented to disable and stop updating inaudible objects. Overall this allowed an increased density of spawns.
  • Improved controls. Some players had desired the ability to strafe, so the controls and position modules were refactored to decouple movement direction from orientation as its own vector. Additionally, the Gamepad API was leveraged to provide basic controller support.
  • Sound localization. Due to time constraints, all of the objects had been implemented as point sources with zero radius. To give them a circumference, the gain and pan calculations were given a radius parameter so volume rolls off gradually beyond that distance. Wind and waterfalls were given larger radii so they could fill entire areas.
  • More ambient sounds. With only a handful of ambient sounds to choose from, no level of randomization would prevent its procedural generation from growing bland. The starting area was given a campfire to make it welcoming and distinct. To complement the Tweeters, Subwoofers bellow the root note under their melodies. Similarly, Oinkers provided another moving sound source with a rich and rhythmic call.
  • Improved collectibles. After many playthroughs it became apparent that the collectibles could be more reactive. The timing and timbre of footsteps were adjusted to indicate speed. Similarly, resonators were given a second oscillator so their harmonics could be filtered in at faster speeds. To reduce cognitive load, the compass now points to the starting area and sweeps its pan as players turn.
  • Main quest. The pickups module was adjusted so a final collectible is spawned after its queue is exhausted. Find all three resonators, use the compass to return to the campfire, and enjoy the surprise!
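
The radius rolloff and audibility culling might be sketched together like this; the rolloff curve and threshold are my assumptions, not the game’s exact math:

```javascript
// Sketch of the post-compo gain model: objects play at full volume
// inside their radius, roll off beyond it, and are culled (nodes
// disconnected, updates skipped) once effectively inaudible.
// Curve shape and threshold are illustrative assumptions.
const CULL_THRESHOLD = 0.001;

function gainForDistance(distance, radius, rolloff = 50) {
  if (distance <= radius) return 1;
  // Simple inverse falloff past the object's edge.
  return 1 / (1 + (distance - radius) / rolloff);
}

function isAudible(distance, radius) {
  return gainForDistance(distance, radius) > CULL_THRESHOLD;
}
```

Larger radii for wind and waterfalls fall out of this model naturally: the full-volume plateau simply covers a wider area before the rolloff begins.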

Merging these changes into the master branch felt great. Not only was Soundsearcher a stronger product—it was a fuller realization of its vision.

Further reflection

Every game jam is a journey, and none are complete without learning more about code and yourself:

  • I’m not an impostor. It’s easy to feel like an impostor when confronted by the prolific output of our peers or a taste of what we don’t know. Competing in a game jam and successfully releasing an MVP before its deadline was the win that finally loosened its grip on me.
  • I’m not burnt out. Exhaustion and frustration with code at my day job impact my enthusiasm, leading me to lose focus and efficiency to easy distractions. I had always attributed this to burnout before I understood that I’m no longer building the right things with my career.
  • I want to care. Games and sound design have always been passions of mine that I’ve set aside while pursuing a career in web development. It’s clear to me now that these aren’t mutually exclusive, so in the future I will be exploring how to unify them and still survive.
  • I want to share. During development I encountered many teachable moments that aren’t typically shared publicly. For this post I opted to discuss my process rather than dive deeply into the source code. But revelations about spatial audio, movement controls, and sound synthesis are perfect subjects for future posts.

Most importantly, I’ve decided that my first commercial game (as much as I believe money shouldn’t exist) will be a successor to Soundsearcher. There will be a formal announcement and development blog (here) once I finish its design documents and fully understand its scope. I look forward to sharing more with you all soon.

Until then, be sure to play Soundsearcher here.