Revenge of Left Shark

Revenge of Left Shark is an interactive rhythm and dance game. In other words, DDR with your hands. The game uses computer vision to place the player on the beach, supply visual cues and provide real-time scoring information.

Here is the game being demoed at ITP’s 2018 Spring Show by one of my heroes, Dan Shiffman:

700+ lines of code later, what was once a tiny idea had become a complete game.

This project was made in collaboration with Gabriel Brasil. GitHub repo here.

CONTEXT & INSPIRATION:

In 2015, during the Katy Perry Halftime show of Super Bowl XLIX, the shark dancer on the left side went way off-script:

“Left Shark” became a thing: an expression of ‘doing your own thing’, or just plain messing up. Just look at all the GIFs.

I started to think that maybe Left Shark could benefit from some choreography lessons… learning through a game like DDR (Dance Dance Revolution) – this game:

Here at ITP – 3 years later – I have the physical computing skills to make such a game happen. It also helped that I already owned a Left Shark costume.

With the combination of my fascination with chromakeying (aka green-screening) and some of my new knowledge from my pixels class, I was able to write a Java application using Processing 3.0.

The game was a big hit for players of all sorts – hundreds of people played!

As you can see, people had a lot of fun with it.

IDEATION & TECH:

The initial idea was to use the Xbox Kinect to track the hands and moves of a person. The Kinect is a 3-D camera that looks like this:

perhaps you have seen one looming on a tv stand…

The Kinect has really good skeleton tracking, which would be useful for persistently tracking one’s hands. It tracks the whole body!

Unfortunately, the Kinect only works well with Windows computers. Sorry $MSFT, but I live in a world of $AAPL hardware now. With no Windows computer, I opted to find another means of hand-tracking.

Another option was pixel tracking. This involves using a regular camera (or webcam), analyzing each pixel to check whether it is similar to some reference (such as a color, depth, or brightness), and changing it based on that value.

For example, pixel tracking and replacement was used to make the beachball chromakey processing sketch seen here.
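As a rough illustration of the idea (in plain Java rather than the actual Processing sketch, with an illustrative threshold and simple RGB distance), pixel replacement boils down to comparing each camera pixel against a key color and swapping in the background pixel when it’s close enough:

```java
// Minimal chroma-key sketch: replace "green enough" pixels with background pixels.
// Pixels are packed-int RGB, as in Processing's pixels[] array; the threshold
// and distance measure here are simplifying assumptions.
public class ChromaKey {

  // Squared distance between two packed RGB colors (skipping the sqrt is cheaper).
  static int colorDistSq(int c1, int c2) {
    int dr = ((c1 >> 16) & 0xFF) - ((c2 >> 16) & 0xFF);
    int dg = ((c1 >> 8) & 0xFF)  - ((c2 >> 8) & 0xFF);
    int db = (c1 & 0xFF)         - (c2 & 0xFF);
    return dr * dr + dg * dg + db * db;
  }

  // For each camera pixel close enough to keyColor, copy the background pixel instead.
  static int[] applyKey(int[] camera, int[] background, int keyColor, int thresholdSq) {
    int[] out = new int[camera.length];
    for (int i = 0; i < camera.length; i++) {
      out[i] = colorDistSq(camera[i], keyColor) < thresholdSq ? background[i] : camera[i];
    }
    return out;
  }
}
```

In the real sketch this runs once per frame over the webcam’s `pixels[]` array, which is what sweeps the player onto the beach.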

The original idea was to use these colors to track the user’s hand position on the screen. Here is the back-of-the-napkin version:

thought of this at Ippudo

…and a slightly larger back-of-napkin:

Yet the pixel tracking used for the beach balls wouldn’t work for reliably tracking a group of pixels as a unit. In other words, I wanted the hands to be tracked, not some random single red pixel. Also, how would the computer know when a particular pixel was doing the “right” thing?

just a friendly neighborhood blob

Enter Blobs. Stemming from the idea of pixel tracking, we created a class of objects called “blobs”. Blobs are just code objects that track a specific color (or depth).

These blobs are created and morphed by an algorithm that dynamically decides their size and position every frame. The squares seen in the video below are the blobs tracking the color.
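To give a feel for how a blob grows, here is a toy version of the idea (my own simplified stand-in, not the actual tracking code from the project): a bounding box that expands as nearby matching pixels are added, with a distance check to decide whether a pixel belongs to an existing blob.

```java
// Toy blob: a bounding box that grows as nearby matching pixels are added.
// A simplified stand-in for the blob class described above, not Shiffman's code.
public class Blob {
  float minX, minY, maxX, maxY;

  Blob(float x, float y) {
    minX = maxX = x;
    minY = maxY = y;
  }

  // Expand the bounding box to include a newly matched pixel.
  void add(float x, float y) {
    minX = Math.min(minX, x);
    maxX = Math.max(maxX, x);
    minY = Math.min(minY, y);
    maxY = Math.max(maxY, y);
  }

  // Is this pixel close enough to the blob's center to be part of it?
  boolean isNear(float x, float y, float distThreshold) {
    float cx = (minX + maxX) / 2;
    float cy = (minY + maxY) / 2;
    float dx = x - cx, dy = y - cy;
    return dx * dx + dy * dy < distThreshold * distThreshold;
  }

  // Rough blob size: the longer side of the bounding box.
  float size() {
    return Math.max(maxX - minX, maxY - minY);
  }
}
```

Per frame, the sketch scans the image: a matching pixel near an existing blob joins it, otherwise it seeds a new blob.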

The blob detection used in the project is a slight modification of Daniel Shiffman’s BlobTracking_improved Processing sketch. For more information on blobs, check out his videos on computer vision in Processing, and specifically this one:

DEVELOPING A MINIMUM VIABLE PRODUCT (MVP):

where my consultants at? …core functionality

The game was developed with a continuous-improvement approach. First, I made a working version – no matter how ugly or boring. Then I improved the design and “built out from the core functionality” – a term I can’t help but put in quotes.

To do this, we made a system to match the blobs to some kind of target – aka collision detection. The collision detection system was built by creating a new object class (called “Target”) and checking whether a blob and a target overlapped on any given frame. Collision detection was done with the distSq() function from Shiffman’s blob code. Specifically:

for (Blob b : blobs) {
  if (distSq(b.getCenter().x, b.getCenter().y,
             target.xCenter, target.yCenter) < b.size()) {
    // ... do something when the collision happens ...
  }
}

In a sketch, I set the Target object to move around a few places and put some text animation in the “do something” part of the snippet above. The detection was working!

After the collision detection was working, I developed the choreography for the targets to follow. The basic moves below were the inspiration for the four routines developed.

These moves were hard-coded into x and y coordinates, mostly through mental-math and guesswork…

an over-drawn coordinate map (based on a 1200 x 800 pixel screen)

…and then transformed into Processing-friendly coordinates using a good old-fashioned custom Excel formula. This was, without a doubt, the least fun part of the project.

back in excel after all these months…


I also decided on ‘California Gurls’ by Katy Perry to keep with the Left Shark theme. With the song decided, I created a modular section-timing system built around the millis() function. The timers work based on the following hierarchy:

Each song section has a duration (timeSection) and corresponding routines (e.g. leftRoutines) defined in lists. Each routine has a set of steps to go through. Each step has a set of x/y coordinates indicating where the target should be. Each step’s duration is derived from the song’s tempo (in this case, 1920ms).

The sectionTimer variable and other step counters are used to loop through the coordinates in each routine. This is why you will see detailed x/y coordinates listed in the giant leftRoutine and rightRoutine variables.
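The hierarchy above can be sketched roughly like this. The variable names echo the post (sectionTimer, leftRoutines, timeSection), but the exact data layout is my assumption; in Processing, `now` would come from millis():

```java
// Sketch of the section -> routine -> step timing hierarchy described above.
// Durations and coordinates here are illustrative, not the game's real values.
public class Choreography {
  // Duration of each song section, in ms.
  int[] timeSection = {8000, 16000, 8000};
  // One routine per section: each routine is a list of {x, y} target steps.
  int[][][] leftRoutines = {
    {{100, 400}, {300, 200}},
    {{500, 400}, {700, 200}, {900, 400}},
    {{600, 600}}
  };
  int stepDuration = 1920; // ms per step, derived from the song's tempo

  // Which section are we in at time `now` (ms since the song started)?
  int sectionAt(int now) {
    int sectionTimer = now;
    for (int s = 0; s < timeSection.length; s++) {
      if (sectionTimer < timeSection[s]) return s;
      sectionTimer -= timeSection[s]; // roll time over into the next section
    }
    return timeSection.length - 1; // song over: hold the last section
  }

  // Target coordinates for the left hand at time `now`.
  int[] leftTargetAt(int now) {
    int s = sectionAt(now);
    int elapsed = now;
    for (int i = 0; i < s; i++) elapsed -= timeSection[i];
    int[][] routine = leftRoutines[s];
    int step = (elapsed / stepDuration) % routine.length; // loop through the steps
    return routine[step];
  }
}
```

Calling leftTargetAt() every frame gives the draw loop a target position, so the choreography stays locked to the song no matter the frame rate.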

To speed up testing, I used an entirely separate sketch to work out the kinks of the timing and choreography. This part definitely drove me a little crazy. I’ve listened to ‘California Gurls’ more than anyone ever should. Here is what a test choreography sketch looked like:

FURTHER DEVELOPMENT & TESTING:

Next, the green screen features were added, sweeping us away to a tropical environment for further work. I set up a green screen and added more features, formatting, and storyboarding for hours.

a rare photo of me both happy and programming, green screen beach behind me

I improved the animation and timing, added a scoring system, and developed a storyboard and game stages (tracked by the screenSection variable in the code) to surround the actual interactive part.
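A stage variable like screenSection usually amounts to a tiny state machine. This is a minimal sketch of that pattern; the stage names here are illustrative assumptions, not the game’s actual screens:

```java
// Minimal sketch of game stages tracked by a screenSection variable,
// as described above. Stage names are illustrative, not the real ones.
public class GameStages {
  static final String[] STAGES = {"TITLE", "INSTRUCTIONS", "COUNTDOWN", "DANCE", "SCORE"};
  int screenSection = 0;

  // Name of the current stage, for deciding what to draw this frame.
  String current() {
    return STAGES[screenSection];
  }

  // Advance to the next stage, wrapping back to the title screen at the end.
  void advance() {
    screenSection = (screenSection + 1) % STAGES.length;
  }
}
```

In the draw loop, a switch on screenSection decides whether to show the title, the dance, or the score screen each frame.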

To test, I first made red and blue cards to hold in my hands, so I wouldn’t have to wear the whole suit or a glove every time I needed to test something.

Then I tested…

…and tested…

…and tested…

…until it worked well enough to try the real deal.

With all of the components in place for a working version, it was time to test if they would work under “showtime” conditions. I suited up and tested it out:

After this run, I continued to tweak the program in preparation for the performance you saw at the top of this post:

It was a fantastic experience to make this game, first as a performance, and then as a game. I hope you enjoyed reading. If you have any questions about the project or the code, please don’t hesitate to reach out!

Copyright to Katy Perry and Capitol Records for ‘California Gurls’
