SonicSwipe is a music-motion game designed for the Leap Motion. It was designed and coded by Alissa Anh Chu and me as a final project for the class CPSC 112: Introduction to Programming with Prof. Richard Yang. The game features real-time hand tracking mapped onto the game interface.
We took inspiration from Dance Dance Revolution and wanted to create a version that uses hand movement instead. We implemented an arrow class that we query to see whether the user has "swiped" it at the right time, and we studied the Leap Motion API to access the device's tracking data. Although the game could have been built with arrow keys, we wanted to explore a gestural version of the game (and it's also super cool!). Gesture control and body tracking have become a hot field lately, largely because they allow a more natural and intuitive interaction with a device or interface than conventional inputs like the mouse, keyboard, or controller.
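For the curious, here is a rough Java sketch of the two pieces just described: an arrow that knows whether it was "swiped" inside its timing window, and a poll of the Leap Motion for the current palm position. The class and method names (Arrow, isHit, the timing window) are purely illustrative and are not our actual project code; the Leap calls (Controller, Frame, Hand, palmPosition) come from the Leap Motion Java SDK.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Hand;
import com.leapmotion.leap.Vector;

// Sketch only: an arrow with a timing window, plus a loop that polls
// the Leap Motion for the first hand's palm position.
public class SwipeCheck {

    /** Illustrative arrow: a direction plus the time it crosses the target zone. */
    static class Arrow {
        enum Direction { LEFT, RIGHT, UP, DOWN }

        final Direction direction;
        final long targetTimeMs;            // when the arrow reaches the target
        static final long WINDOW_MS = 150;  // +/- tolerance for a "hit"

        Arrow(Direction direction, long targetTimeMs) {
            this.direction = direction;
            this.targetTimeMs = targetTimeMs;
        }

        /** True if the user's swipe direction and timing both match. */
        boolean isHit(Direction swiped, long swipeTimeMs) {
            return swiped == direction
                && Math.abs(swipeTimeMs - targetTimeMs) <= WINDOW_MS;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Controller controller = new Controller();

        // Poll the device a few times and print where the first hand is.
        for (int i = 0; i < 10; i++) {
            Frame frame = controller.frame();
            if (!frame.hands().isEmpty()) {
                Hand hand = frame.hands().get(0);
                Vector palm = hand.palmPosition(); // millimetres, device-centred
                System.out.printf("palm at x=%.1f y=%.1f z=%.1f%n",
                        palm.getX(), palm.getY(), palm.getZ());
            }
            Thread.sleep(100);
        }
    }
}
```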
In the process of designing the game, we learned a lot about user interface design. The initial design featured neither a live score nor a "flash" to indicate a correctly swiped arrow. As the creators of the game, we took this feedback for granted, and only after discussion with our TAs (Newman Wu and Ben Eskildsen) did we realize that user feedback, whether visual, auditory, or even tactile, is extremely important when designing software or hardware for an end user. Moreover, we realized that putting a matrix in the middle of the screen to control the swipe actions forced the user to pay attention to two things at once: where their finger was located relative to the matrix, and the top of the screen, where they had to time when an arrow crossed the target direction. As we discovered during demo day, it is extremely difficult to focus on both parts of the interface at once.
The reason we did not employ the built-in "swipe" gestures is that we found the Leap Motion unreliable at detecting them. Moreover, because the Leap Motion is queried many times per second, a single left swipe naturally "persisted" over several frames (no one swipes in 1/60 of a second!). Although there are ways to code around this, we found that tracking hand location was much more effective for this game, given the project's time constraints.
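To make that "persistence" problem concrete, here is one way it could be coded around (a sketch of the workaround, not what we actually shipped): the Leap SDK keeps the same id() on a gesture for its whole lifetime, so you can ignore a swipe you have already handled instead of reacting to it on every frame.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Gesture;
import com.leapmotion.leap.SwipeGesture;
import java.util.HashSet;
import java.util.Set;

// Sketch: stop a single physical swipe from firing on every frame by
// remembering which gesture ids have already been handled.
public class SwipeDebouncer {

    private final Set<Integer> seenSwipes = new HashSet<>(); // a real game would prune old ids

    public void pollOnce(Controller controller) {
        Frame frame = controller.frame();
        for (Gesture gesture : frame.gestures()) {
            if (gesture.type() != Gesture.Type.TYPE_SWIPE) continue;
            if (!seenSwipes.add(gesture.id())) continue; // already handled this swipe

            SwipeGesture swipe = new SwipeGesture(gesture);
            float dx = swipe.direction().getX();
            float dy = swipe.direction().getY();
            String dir = Math.abs(dx) > Math.abs(dy)
                    ? (dx > 0 ? "RIGHT" : "LEFT")
                    : (dy > 0 ? "UP" : "DOWN");
            System.out.println("Swipe detected: " + dir);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Controller controller = new Controller();
        controller.enableGesture(Gesture.Type.TYPE_SWIPE); // gestures are off by default
        SwipeDebouncer debouncer = new SwipeDebouncer();
        for (int i = 0; i < 100; i++) {
            debouncer.pollOnce(controller);
            Thread.sleep(16); // roughly once per frame at ~60 Hz
        }
    }
}
```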
A future design will integrate the arrow and matrix interfaces so that the user does not have to look at both. For example, the arrows could travel from all four sides of the screen toward the center of the matrix, giving the game a single focal point. Alternatively, we could keep the DDR-style interface and implement our own swipe recognition (see the sketch below), working around the Leap Motion's built-in Swipe gesture detection. The project also made it clear that taking a game designed for one interface and "slapping" motion tracking onto it is not necessarily the best approach to interface design. The game must be designed from the ground up to take advantage of the Leap Motion's strengths.
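As a rough illustration of what that custom swipe recognition could look like (untested in the game, and the thresholds are guesses that would need tuning): instead of the built-in Swipe gesture, watch the palm velocity the Leap reports and fire a single swipe when it spikes, then wait for the hand to slow down before arming again.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Hand;
import com.leapmotion.leap.Vector;

// Sketch of a hand-rolled swipe detector that bypasses the built-in
// Swipe gesture: trigger once when the palm speed crosses a threshold,
// then wait for the hand to slow down before detecting the next swipe.
public class CustomSwipeDetector {

    private static final float TRIGGER_SPEED = 800f; // mm/s, tune by experiment
    private static final float RESET_SPEED = 300f;   // must slow down before next swipe
    private boolean armed = true;

    /** Returns "LEFT", "RIGHT", "UP", "DOWN", or null if no new swipe this frame. */
    public String detect(Frame frame) {
        if (frame.hands().isEmpty()) return null;
        Hand hand = frame.hands().get(0);
        Vector v = hand.palmVelocity(); // millimetres per second

        float speed = v.magnitude();
        if (!armed) {
            if (speed < RESET_SPEED) armed = true; // hand slowed down, re-arm
            return null;
        }
        if (speed < TRIGGER_SPEED) return null;

        armed = false; // fire once per swipe
        return Math.abs(v.getX()) > Math.abs(v.getY())
                ? (v.getX() > 0 ? "RIGHT" : "LEFT")
                : (v.getY() > 0 ? "UP" : "DOWN");
    }

    public static void main(String[] args) throws InterruptedException {
        Controller controller = new Controller();
        CustomSwipeDetector detector = new CustomSwipeDetector();
        while (true) {
            String swipe = detector.detect(controller.frame());
            if (swipe != null) System.out.println("Swipe: " + swipe);
            Thread.sleep(16);
        }
    }
}
```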
Nonetheless, considering this was a final project for a single-semester introduction to Java programming, I think we did a very good job! In fact, I joined the class because I wanted to be able to use and implement the APIs of sensors like the Leap Motion. Thus, designing and coding this game was a dream come true! I'm very excited to write more programs for the device!
Thanks to the TAs and Prof. Yang for such a wonderful semester!
Good times!