
Woodbury Shortridge

Sonified Pong

Accessible gaming with p5.js
The pong game user interface

Multi-sensory gaming provides an accessible “pong” experience. Unlike traditional pong, sonified pong invites players to track the ball with auditory cues. Play with your eyes closed for an extra challenge!

This project served as a proof of concept for a Tufts University senior capstone project to develop an accessible auditory aid for independent swimming for the visually impaired.

Premise

While working with Matthew Shifrin (my fantastic intern at the Institute for Human Centered Design), who has been blind from birth, I learned about the primarily text-based accessible games available to blind individuals. Sonified pong pushes beyond the limits of text-only and visual-only communication with the player, making for more interactive play.

Building

With p5.js, the JavaScript version of Processing, and the Web Audio API, I created two audio streams, hard-panned to the left and right. These streams display the ball and paddle movement. On the left stream, I instantiated a triangle wave mapped to the X-axis movement of the paddle: as the X position of the paddle increases, the frequency of the triangle wave increases.
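For illustration, a minimal p5.js sketch of that left-panned paddle stream might look like this. It assumes the p5.sound addon, and paddleX is a stand-in for the real game state rather than my original code:

```javascript
// Sketch of the left-panned paddle stream (requires the p5.sound addon).
let paddleOsc;

function setup() {
  createCanvas(400, 400);
  paddleOsc = new p5.Oscillator('triangle');
  paddleOsc.pan(-1);   // hard pan: the paddle stream lives in the left ear
  paddleOsc.amp(0.3);
  paddleOsc.start();
}

function draw() {
  const paddleX = mouseX;  // illustrative stand-in for the real paddle position
  // Paddle farther right -> higher pitch, mapped to a comfortable audible range.
  paddleOsc.freq(map(paddleX, 0, width, 220, 880));
}

function mousePressed() {
  userStartAudio();  // browsers require a user gesture before audio can start
}
```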

Then, on the right-panned auditory stream, I created a square wave to represent the movement of the ball in 2D space. Like the paddle's, the ball's X position is mapped to the square wave's frequency. For the ball's Y position, I created a sine wave modulator, so that as the ball gets closer to the bottom, the modulation gets more intense.
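A sketch of that right-panned ball stream, again assuming p5.sound, could use amplitude modulation like the following; ballX and ballY here stand in for the real ball state:

```javascript
// Sketch of the right-panned ball stream: a square wave for X position,
// amplitude-modulated by a sine whose rate grows as the ball falls.
let ballOsc, modOsc;

function setup() {
  createCanvas(400, 400);

  ballOsc = new p5.Oscillator('square');
  ballOsc.pan(1);        // hard pan: the ball stream lives in the right ear
  ballOsc.start();

  modOsc = new p5.Oscillator('sine');
  modOsc.disconnect();   // the modulator shapes amplitude, never the speakers
  modOsc.start();
  // Amplitude modulation: the sine's output, rescaled to 0..0.4, drives the
  // square wave's volume, producing a tremolo.
  ballOsc.amp(modOsc.scale(-1, 1, 0, 0.4));
}

function draw() {
  const ballX = mouseX, ballY = mouseY;  // stand-ins for the real ball position
  ballOsc.freq(map(ballX, 0, width, 220, 880));
  // Nearer the bottom -> faster modulation, so the warble grows more intense.
  modOsc.freq(map(ballY, 0, height, 0.5, 16));
}
```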

To display the ball contacting the boundaries or paddle, I created an auditory cue with a ping-pong delayed tone. Again, the frequency of this tone is mapped to the Y position of the collision to provide context: a bounce off the paddle is low-pitched, a bounce off a wall is higher, and a bounce off the ceiling is highest.
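Something like the following p5.sound sketch captures that bounce cue; playBounce() and the specific timing values are illustrative, not the original code:

```javascript
// Sketch of the collision cue: a short enveloped tone fed through a
// ping-pong delay, pitched by where the bounce happened.
let bounceOsc, bounceEnv, pingPong;

function setup() {
  createCanvas(400, 400);

  bounceOsc = new p5.Oscillator('sine');
  bounceOsc.amp(0);                 // silent until the envelope fires
  bounceOsc.start();

  bounceEnv = new p5.Envelope();
  bounceEnv.setADSR(0.005, 0.15, 0, 0.1);  // fast attack, short decay: a blip
  bounceEnv.setRange(0.5, 0);              // peak level, release level

  pingPong = new p5.Delay();
  pingPong.setType('pingPong');            // echoes alternate left and right
  pingPong.process(bounceOsc, 0.12, 0.5, 2300);  // delay time, feedback, filter
}

// Call from the game's collision handling with the Y of the impact.
function playBounce(collisionY) {
  // Low on screen (paddle) -> low pitch; high (ceiling) -> high pitch.
  bounceOsc.freq(map(collisionY, 0, height, 900, 200));
  bounceEnv.play(bounceOsc);
}
```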

Here is a video of Matthew Shifrin demoing the gameplay.

Design

I chose the auditory textures to provide signifiers and contextual awareness of the game state. Matthew is able to locate the ball within the frame by listening to the frequency changes, modulation, and bounce cues, while simultaneously tracking his paddle's movements through frequency to block the ball at the right time. Along with the hard pans, the distinct timbres of the two waveforms help differentiate the audio streams.

To make the interactions accessible to keyboard-dependent users, I added options in the UI for different JavaScript input listeners, such as arrow keys or even touch.
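In p5.js these alternatives are simple event handlers; a minimal sketch, where paddleX stands in for the game's paddle state:

```javascript
// Keyboard and touch alternatives for paddle control in p5.js.
function keyPressed() {
  if (keyCode === LEFT_ARROW)  paddleX = constrain(paddleX - 15, 0, width);
  if (keyCode === RIGHT_ARROW) paddleX = constrain(paddleX + 15, 0, width);
  return false;  // stop arrow keys from scrolling the page
}

function touchMoved() {
  paddleX = constrain(touches[0].x, 0, width);  // drag slides the paddle
  return false;  // suppress default browser touch gestures during play
}
```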

Gif animation of the visual interface

Assistive tech such as Matthew's JAWS screen reader cannot be used concurrently with gameplay because it overrides key functions and often speaks over the auditory display. Thus, I used text-to-speech in JavaScript to make a video-game-style scorekeeper that makes announcements.
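A minimal version of such a scorekeeper can be built on the browser's built-in Web Speech API; announce() and the score variables here are illustrative helpers, not the original code:

```javascript
// Sketch of an auditory scorekeeper using the browser's Web Speech API.
function announce(message) {
  const utterance = new SpeechSynthesisUtterance(message);
  utterance.rate = 1.1;             // slightly brisk, announcer-style pacing
  window.speechSynthesis.cancel();  // don't let announcements stack mid-rally
  window.speechSynthesis.speak(utterance);
}

// Example: call after a point is scored (score variables are hypothetical).
// announce(`Point! ${playerScore} to ${cpuScore}.`);
```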

Extensions: Tufts Capstone Project

I was amazed at the playability for Matthew, other blind gamers, and anyone seeking a game experience not dominated by visuals. I wonder whether this proof of concept could be applied to wayfinding within a constrained environment. That is, the "ball" could be an actual person in an environment, with an audio stream displaying their position to heighten spatial perception. Matthew described a perfect problem space for this application: he enjoys swimming, but cannot swim independently, as he must bring a friend to literally whack him on the head before he crashes into the walls.

This year I am sponsoring and leading a group of Tufts University senior capstone students to tackle this problem. We are currently prototyping and user-testing an underwater ultrasonic sensor system with various auditory and haptic feedback streams to facilitate independent swimming for the visually impaired community.
