
soundStrider v0.4.0

An expedition into the elemental palette

After nearly fifty hours of development, the latest version of soundStrider has arrived at its destination. Its features include the beginnings of a procedural quest system, a fresh palette of evolving sounds to explore, and a dynamic compass to help lead the way. And with added accessibility features it’s more inclusive and approachable than ever.

Continue reading to listen and learn as we journey into the otherworldly textures of the elemental palette.

Launching expeditions

The main component of soundStrider’s adventure mode is its procedural quest system, which generates tasks to encourage exploration with fun and persistent rewards. Expedition quests are the first test of this system: they lead players along a circuit to faraway places, where they’ll discover new areas to bookmark and explore.

Let’s take our first expedition together in the excerpt below:

Bookended by the warmth of our campfire, the journey follows our compass through new textures, with plenty to unpack along the way.

Implementing procedural quests

An event system was added to soundStrider’s engine to handle the selection and activation of sequences. With the ability to specify their priorities, whether they’re required or repeatable, and any additional criteria, it’s robust enough to articulate the plot and quest chains of any roleplaying game. Then by leveraging the observer pattern, other systems such as the autosave mechanism can subscribe to and act upon event completion.
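
As a rough sketch of that design (all names here are hypothetical rather than the engine’s actual API), an event definition and an observer-style completion hook might look something like this:

  // Hypothetical sketch of an event definition and observer-style completion hooks.
  interface GameEvent {
    key: string
    priority: number        // higher-priority events are selected first
    isRequired: boolean     // required events can't be skipped by the selector
    isRepeatable: boolean   // repeatable events may activate again later
    criteria: () => boolean // any additional conditions, e.g. flags or player state
  }

  const listeners: Array<(event: GameEvent) => void> = []

  // Other systems, such as autosave, subscribe to completion.
  function onEventComplete(listener: (event: GameEvent) => void) {
    listeners.push(listener)
  }

  function completeEvent(event: GameEvent) {
    listeners.forEach((listener) => listener(event))
  }

  // e.g. a stand-in for the autosave mechanism reacting to every completion
  onEventComplete((event) => console.info('autosaving after', event.key))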

Events themselves were implemented as finite-state machines to handle their starting, completion, and intermediate states. An additional store allows them to associate data with their current state so that it persists between play sessions. For example, the expedition quests have one or more legs that are represented as intermediate states; they store their starting and next destinations so they’re not lost when reloading the game.
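
As a hypothetical sketch (names and shapes are illustrative), an expedition reduced to a tiny state machine with a save/load pair might look like this:

  // Hypothetical sketch: an event as a finite-state machine whose current state
  // and associated data survive a reload via a persistent store.
  type ExpeditionState = 'inactive' | 'leg' | 'complete'

  class ExpeditionEvent {
    state: ExpeditionState = 'inactive'
    data: {from?: [number, number], to?: [number, number]} = {}

    beginLeg(from: [number, number], to: [number, number]) {
      this.state = 'leg'
      this.data = {from, to} // the leg's starting and next destinations
    }

    save() {
      return {state: this.state, data: this.data} // written to the save file
    }

    load(saved: ReturnType<ExpeditionEvent['save']>) {
      this.state = saved.state
      this.data = saved.data // reloading mid-leg keeps both destinations intact
    }
  }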

Quests in soundStrider simply extend that base event class. Like its props, they are defined as several archetypes that, given a seeded random number generator, produce deterministically infinite variations. This ensures that no repeat playthrough or quest is quite like the last.
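
A minimal sketch of that idea, pairing a small seeded generator (mulberry32 here, standing in for the engine’s own RNG) with a hypothetical expedition archetype:

  // Hypothetical sketch: the same seed always expands into the same expedition.
  // mulberry32 is a tiny deterministic PRNG; any seeded generator would do.
  function mulberry32(seed: number) {
    return function () {
      let t = (seed += 0x6d2b79f5)
      t = Math.imul(t ^ (t >>> 15), t | 1)
      t ^= t + Math.imul(t ^ (t >>> 7), t | 61)
      return ((t ^ (t >>> 14)) >>> 0) / 4294967296
    }
  }

  const expeditionArchetype = {
    generate(seed: number) {
      const random = mulberry32(seed)
      return {
        legs: 2 + Math.floor(random() * 3), // a circuit of 2-4 legs (illustrative)
        bearing: random() * 2 * Math.PI,    // initial heading of the circuit
      }
    },
  }

  expeditionArchetype.generate(20200412) // identical output on every playthrough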

Player agency was an important consideration when designing its quests. To keep the event system simple, it activates the next event after the current one completes. Rather than starting a quest immediately, a waypoint is spawned within a time cone ahead of the player, and entering it activates the quest’s first objective. This allows players to avoid or ignore quests until they’re ready to proceed.
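
Spawning the waypoint reduces to picking a point within a cone of directions around the player’s current heading; a hypothetical sketch, with illustrative numbers:

  // Hypothetical sketch: place a quest waypoint ahead of the player, inside a
  // cone centered on their current heading, and only start the quest on entry.
  function spawnWaypoint(player: {x: number, y: number, heading: number}, random: () => number) {
    const spread = Math.PI / 4                   // plus or minus 45 degrees around the heading
    const angle = player.heading + (random() - 0.5) * 2 * spread
    const distance = 50 + random() * 100         // illustrative range, in world units

    return {
      x: player.x + Math.cos(angle) * distance,
      y: player.y + Math.sin(angle) * distance,
      onEnter: () => { /* activate the quest's first objective */ },
    }
  }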

Beyond this first expedition, there are plenty more quests on the horizon. Some that I have planned are an initial tutorial, a port of Soundsearcher’s main quest line, and opportunities to terraform the world. A post-release update may even add a persistent base building chain.

Considering the compass

If the world of soundStrider is a virtual reality, then its compass projects a nonverbal user interface layer that augments its reality.

Something I learned from building Soundsearcher was that its compass could have been more successful. Because it was built from white noise, it was hard to distinguish from other sound sources like waterfalls or wind. And because its effect was strongest when pointing toward the destination, it was easy to feel fatigued over long distances. Ultimately these shortcomings led me to the current implementation:

Given any destination, the new compass leverages stereo positioning and other sonic qualities, like frequency and amplitude modulation, to orient players toward it. By acting more as a wrong-way indicator, it prevents listener fatigue by only communicating incorrect headings. And its notification system allows quests to alert players of state changes, like new destinations or when they’ve arrived.
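
As a sketch of the wrong-way idea (hypothetical names and mappings, not the actual implementation), everything scales with how far the player’s heading deviates from the bearing to the destination:

  // Hypothetical sketch: the compass only speaks up when the heading is wrong.
  function updateCompass(playerHeading: number, bearingToDestination: number) {
    // signed heading error in (-pi, pi]; zero means the player is facing the destination
    let error = bearingToDestination - playerHeading
    error = Math.atan2(Math.sin(error), Math.cos(error))

    const offCourse = Math.abs(error) / Math.PI // 0 when on course, 1 when facing away
    return {
      pan: Math.max(-1, Math.min(1, error)),    // stereo position nudges toward the correct side
      gain: offCourse,                          // silent when the heading is correct
      modulationDepth: offCourse,               // deeper FM/AM the more wrong the heading
    }
  }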

The result is much more robust, communicative, and accessible.

Introducing the elemental palette

What if we could deconstruct sounds into their universal essences of pure oscillations? A few, like Fourier and Moog, have tried, but our expedition together has proven that we can travel somewhere that already does.

Let’s hear another seed in the excerpt below:

With the elemental palette I aspired to generate ethereal drones similar to Eliane Radigue’s influential Trilogie de la Mort, whose discrete moments seem like unchanging textures eternally frozen in time, yet zoomed out they demonstrate a very organic and purposeful movement.

While each type of prop here emits a constant tone, a variety of attributes such as vibrato or tremolo ebb and flow to give them character. Staying still reveals the large timescales at which they operate, making each space they occupy an evolving and meditative experience, whereas passing through reveals the melody and harmony inherent between spaces.
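
As a minimal Web Audio sketch of one such prop (parameter values here are illustrative, not the game’s), a constant tone whose pitch and loudness drift under very slow LFOs:

  // Hypothetical sketch of an elemental prop: a constant tone with slow vibrato
  // and tremolo so its character ebbs and flows over large timescales.
  const context = new AudioContext()

  const tone = context.createOscillator()
  tone.frequency.value = 220 // the prop's constant pitch

  const vibrato = context.createOscillator()
  vibrato.frequency.value = 1 / 30 // one full pitch drift every 30 seconds
  const vibratoDepth = context.createGain()
  vibratoDepth.gain.value = 4 // plus or minus 4 Hz of drift
  vibrato.connect(vibratoDepth).connect(tone.frequency)

  const amplitude = context.createGain()
  amplitude.gain.value = 0.5

  const tremolo = context.createOscillator()
  tremolo.frequency.value = 1 / 45 // even slower loudness swell
  const tremoloDepth = context.createGain()
  tremoloDepth.gain.value = 0.25 // loudness ebbs between 0.25 and 0.75
  tremolo.connect(tremoloDepth).connect(amplitude.gain)

  tone.connect(amplitude).connect(context.destination)
  tone.start(); vibrato.start(); tremolo.start()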

Creating this palette is ultimately a litmus test for soundStrider’s engine. It demonstrates how the engine could be applied to other four-dimensional, generative musical experiences, especially ones that aren’t necessarily games, like sound installations or Music for Airports. This encourages me to keep its codebase clean and reusable for future projects.

Accessibility wins

If you haven’t heard by now, accessibility is one of my top priorities as a developer. Like websites or public spaces, games are for all people, or, for an audio-only game (something I’ve reconsidered quite a lot), for as many folks as possible. Designing for inclusivity is therefore a moral imperative because it addresses the clear and obvious crises of exclusion.

Recently I’ve subscribed to and learned a lot from Mark Brown’s channel, Game Maker’s Toolkit. His video series on Designing for Accessibility was an especially enlightening and enjoyable watch. Importantly, it inspired me to make adjustments to soundStrider’s control scheme and finally start debugging its tougher screen reader issues.

Designing for motor disabilities

Something I discovered from Brown’s videos, specifically Making Games Better for Players with Motor Disabilities, was that I had failed to catch all the ways a walking simulator like soundStrider could be exclusionary.

The worst issue I identified was its requirement for players to hold their fingers in specific positions for prolonged periods of time in order to explore its world. This was problematic because it expected all players to have equal stamina. An automove toggle was implemented to solve this issue; when activated, it moves the player forward in their current direction.

Another oversight was the inability for controller users to play single-handed or without analog sticks. Similarly, this was problematic because it expected all players to have equal control. Digital inputs for lateral movement and turning were implemented to allow players to ignore their analog sticks. Forward and backward movement was then added to the secondary analog stick so the sticks aren’t required to be used in tandem.
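
A hypothetical sketch of that redundancy, polling a standard-layout gamepad (the button and axis indices here are illustrative, not soundStrider’s actual bindings):

  // Hypothetical sketch: every movement has both an analog and a digital binding,
  // so neither stick is required and one-handed play stays possible.
  function pollMovement(pad: Gamepad) {
    const deadzone = 0.15
    const axis = (value: number) => (Math.abs(value) >= deadzone ? value : 0)
    const digital = (positive: number, negative: number) =>
      (pad.buttons[positive]?.pressed ? 1 : 0) - (pad.buttons[negative]?.pressed ? 1 : 0)

    return {
      forward: -axis(pad.axes[1]) || -axis(pad.axes[3]), // left stick, mirrored on the right stick
      strafe: axis(pad.axes[0]) || digital(15, 14),      // left stick, or the d-pad as a digital fallback
      turn: axis(pad.axes[2]) || digital(5, 4),          // right stick, or the shoulder buttons
    }
  }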

A lingering issue is that the current setup only supports standard controllers such as my Xbox One controller. For users of nonstandard controllers, a future update could bring the ability to remap controls. While I’m aware this would be the most inclusive solution, it’s also the most costly.

Designing for visual impairments

As a web developer I have worked on a variety of projects with specific and testable accessibility requirements; however, many had none at all due to a lack of awareness of disability or the law.

My first experience using a screen reader was about four years ago when I developed an application for a public university. It was revelatory. The shock and regret that I experienced when I realized that the previous six years of my career had produced so many inaccessible sites and components was so profound that I never want to repeat those mistakes again.

To claim that soundStrider is a game designed to be played closed-eyed, I also needed to be serious about its screen reader accessibility. So I finally installed NVDA on my home workstation to assess its full experience, from first launching the game to playing and finally closing it.

It wasn’t terrible: mostly usable with a few rough edges. There were a few oversights to fix, like giving the credits screen a negative tabindex and proper focus so its entire contents would be read aloud, or giving the game screen an aria-label so it’s announced when transitioning to it.
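
Those fixes amount to a couple of lines each; a hypothetical sketch, with placeholder selectors standing in for the game’s actual elements:

  // Hypothetical sketch of the focus and labeling fixes described above.
  const credits = document.querySelector<HTMLElement>('.credits')
  if (credits) {
    credits.tabIndex = -1 // focusable from script without joining the tab order
    credits.focus()       // moving focus here prompts the reader to announce the contents
  }

  const game = document.querySelector<HTMLElement>('.game')
  game?.setAttribute('aria-label', 'Game') // announced when transitioning to this screen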

The worst part was the controls. Something I hadn’t encountered in my experience with VoiceOver was that NVDA was eating Escape keypresses which, crucially, were responsible for pausing the game or backing out from screens. This inspired two helpful changes:

  • Alternative controls. The Backspace key was added as an alternate shortcut for any action that requires the Escape key. This circumvents the issue entirely and strengthens its external consistency by behaving more like a browser.
  • Application role. By adding the application role to its root element, soundStrider now indicates to screen readers that it should be treated as an application. This affords it more control over input and focus state compared to the <html> element’s default document role. This change was what ultimately glued together the entire screen reader experience. Both changes are sketched below.
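
A hypothetical sketch of both changes together (the handler is a placeholder for whatever the current screen actually does):

  // Hypothetical sketch: treat the page as an application and accept Backspace
  // wherever Escape is accepted.
  document.documentElement.setAttribute('role', 'application')

  // placeholder handler; the real game routes this to the current screen
  function pauseOrGoBack() { /* pause the game or back out of a menu */ }

  document.addEventListener('keydown', (event) => {
    if (event.key === 'Escape' || event.key === 'Backspace') {
      event.preventDefault() // don't let Backspace trigger any default behavior
      pauseOrGoBack()
    }
  })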

With these changes I can reasonably claim that soundStrider is completely playable closed-eyed, but it won’t be long before I revisit its screen reader accessibility. One truth I know is that everything can always be better.

Roadmap for future posts

In its next point release, soundStrider will double its playable areas with the addition of its desert and beach palettes. These palettes comprise a variety of textures and behavioral patterns that will be a delightful challenge to implement efficiently and effectively.

One thing I’m always reminded of when updating this devlog is the difficulty of consistently producing quality content. It takes time and, given its June deadline for gold release, my time is unfortunately very limited.

This means that my next few posts will have a lower priority as I diligently work through this content creation phase. Rather than ceasing altogether, they will take a simpler form. For each point release I will upload new audio excerpts to the shiftBacktick SoundCloud account. Here I will announce new batches of excerpts with an embedded player, and offer further insight into my design philosophy and approach whenever time permits.

I look forward to getting back to work so I can share more soon. Meanwhile, please follow the SoundCloud for the latest sneak peeks into my progress.