Blog

Ball Game Update: Final! Winter Show!

This is a post about the final updates to our ball game and the 2017 ITP Winter Show.  For more about the project’s conception and prototyping, look here; for our planning process, look here; and for our fabrication, look here and again here.

Final User Testing

Our in-class discussion and feedback for our final project presentation focussed on opportunities for increasing subtlety and complexity in how we presented information.  April and I aspired to create a game which didn’t require any instructions — one in which the game’s interface afforded exactly what the users could and should do.  In practice, the game required a single instruction:

  • shoot for your own goals (rather than your opponent’s goals)

This could have perhaps been solved with a painted playing area (one half being red and one half blue), or some other related design choice, but in the time we had left, it was solved with an on-screen instruction.

Additionally, we would have liked the game’s reset and (planned) menu options (for choosing between single- and two-player modes, difficulty levels, high scores, etc.) to be accessible entirely through the game’s goal interface (i.e. menu options could be chosen by shooting for specific goals), but we again did not have time to fully implement this.

Winter Show

After presenting our final in class, we had one last opportunity to user test at this year’s ITP Winter Show.  We updated our project based on our in-class feedback, focussing on making our serial interface between the Arduino and p5.js, and the physical game itself, robust enough to hold up to the several hundred visitors at the show.   While debugging, we simplified the serial protocol so that the “hello” portion of the handshake reset the game on the p5.js side.  This allowed the Arduino to be reset directly (using a big red arcade pushbutton attached between its reset pin and ground), rather than through a p5.js interface.  In practice, this meant our code was simpler on both the Arduino side:

and on the p5.js side, which recognized the “hello” message as a cue to reset the game:
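Since the code screenshots aren’t reproduced here, here is a minimal sketch of that idea on the p5.js side (assuming the p5.serialport library, with serialEvent registered via serial.on('data', serialEvent); resetGame() and handleScoreData() are hypothetical helpers, not our exact function names):

    // Runs whenever new serial data arrives (registered with serial.on('data', serialEvent)).
    function serialEvent() {
      let inData = serial.readLine().trim();
      if (inData.length === 0) return;

      if (inData === 'hello') {
        // The Arduino sends "hello" whenever it (re)starts, so pressing the physical
        // reset button also resets the game on the p5.js side.
        resetGame();
        serial.write('x');   // acknowledge, so the Arduino moves on to sending scores
      } else {
        handleScoreData(inData);   // otherwise, treat the line as score data
      }
    }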

Of course, using this method rather than the much more complex protocol we had previously used (in which our p5.js sketch had control over all of the Arduino’s goal states, which control the LEDs as well as the scoring) was a decision driven by need rather than desire.  Our previous serial communication protocol was buggy, and this workaround seemed best suited to fix it.  If I were to continue with this project, however, I would re-implement p5.js control over the Arduino, so that all of the game logic was contained in one place and could be more easily connected, changed, and iterated upon.

Additionally, if I were planning the project’s next phase, I would work on:

  • increasing interactivity with respect to goal-state changes and scoring: I could imagine a number of modes in which the goals respond actively to scoring to benefit either the first- or second-place player.  For instance, goals switching colors after every score would benefit the second-place player, while having scores turn off an opponent’s goal would benefit the player in first place.
  • managing chaos: while the bouncing ping pong balls were fun in class and at the Winter Show, that particular aspect of the game didn’t lend itself to any sense of containment or order.  Without a dedicated game space, or a net setup like the one I put up at the Winter Show, chaos rules.  Perhaps this game is not meant to be portable, and should be used with a net, but I would be interested in cheap (in terms of less complexity while building, not just cost) solutions to this problem.  I can foresee using less bouncy balls and/or creating angles on the faceplate and goals which redirect incoming balls downward.
  • dialing in the interactions: ideally, this game wouldn’t require a screen, or at least wouldn’t require anything resembling a sentence on screen.  I can imagine this working if we added more visual and aural cues for the player (blinking goals after a score, more sound cues, spoken countdown and game-over cues, etc.), all of which would increase the overall smoothness of the game experience.

All of that said, the reaction at the Winter Show was very positive!  Our setup was packed the first day with children (who often played three or four games in a row) and — after some brief repairs — was packed again on the second day!  It held together remarkably well through several hours of sustained use, and was reliable in its score-keeping, serial communication and p5.js sketch.  For whatever reason, we sometimes needed to reset the Arduino several times before the p5.js sketch reset, but it worked reliably enough that I never quite tracked down this bug’s source…  Anyway, here are some photos from the show:

Update: The Depths of the Pacific

Elizabeth and I spent some time editing our After Effects assignment to clean up the animation and editing.  In After Effects, we added eases to all of our character and camera moves, which made them feel much more natural.  We also avoided animating to the music, which was too slow; instead we made the animation work first, then cut it to the music (and cut the music) in Premiere.  We also increased the ‘underwater’ effect, which helped to tie together assets from different sources, lighting conditions, and resolutions.


Animation Final: Unity

For this project, I used the Unity game engine and assets from Mixamo and Thingiverse to create a small game in which a player navigates an “Oh the Places You’ll Go”-esque landscape as a samba-dancing older man who collects golden cassette tapes to play his favorite tunes.

Terrain

After making the terrain in Unity’s terrain builder, I added trees from the asset store and the terrain builder.  Frankly, I’m unsure what the difference is between trees made in the terrain builder and trees dragged and dropped into the scene’s main folder (so that they show up as individual assets).  I suspect you’re meant to populate a map with trees through the terrain builder, but am unsure whether that is computationally less expensive, or just less labor-intensive…

Wanting a ground like Dr. Seuss’s “Oh the Places You’ll Go,” I made up a texture with stripes of the same color:

 

This was then applied to the terrain as a texture, but produced a ground pattern which failed to tile seamlessly.  To make the texture seamless, I read online that I could flip it horizontally and vertically and offset the results to create a repeating pattern:
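As an aside, the same mirror-and-offset idea can be written down in code.  Here is a small p5.js sketch of it, purely as an illustration (the file name is a placeholder; the actual texture was prepared in an image editor):

    let tile;

    function preload() {
      tile = loadImage('stripes.png');   // placeholder name for the striped texture
    }

    function setup() {
      createCanvas(tile.width * 2, tile.height * 2);
      noLoop();
    }

    function draw() {
      // Draw the original in the top-left...
      image(tile, 0, 0);
      // ...mirror it horizontally into the top-right...
      push(); translate(width, 0); scale(-1, 1); image(tile, 0, 0); pop();
      // ...mirror it vertically into the bottom-left...
      push(); translate(0, height); scale(1, -1); image(tile, 0, 0); pop();
      // ...and mirror both ways into the bottom-right, so every edge matches its neighbor.
      push(); translate(width, height); scale(-1, -1); image(tile, 0, 0); pop();
      // saveCanvas('stripes_seamless', 'png');   // export the seamless tile
    }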

 

Below you can see the difference between the original texture and the seamless one:

Cassette Tapes:

In order to add the cassette tapes to the game, I followed along with this “Roll a Ball” tutorial.  I found a cassette tape asset on Thingiverse here in STL format and used this online tool to convert it to an OBJ file accepted by Unity.  I wanted each cassette tape on the map to play a different song when the character walked through it.   Scripting-wise, the most difficult part for a neophyte such as myself was triggering an audio source to play at a single point in 3D space (the location of the tape), rather than globally in the game.  This was accomplished (with quite some help in person and here) using the PlayClipAtPoint method of AudioSource.

This done, I ended up with a game which looked like this:

Carnival Ball!

This is a post about the last stage of our arcade / carnival game construction and coding.  For a bit more about the project’s conception and prototyping, look here; for our planning process, look here; and for some on our fabrication, look here.

After last week’s user testing session in class, April and I focussed our last week of work on improving two aspects of our construction and design:

  • Physically, it was important that the ping pong balls both didn’t bounce out of the goals and reliably returned to the players.  Because ping pong balls are designed to bounce (and do this quite well), this proved to be more of a challenge than expected.
  • In terms of the code, it was necessary that the goal sensing / score detection worked without fail.

Physical Construction:

At first, I made several attempts to design goals which worked reliably (didn’t allow bounce-out) from 1/4″ luan.  My process involved mocking up the panel shapes in cardboard, cutting them out of luan, and gluing them together.  The angles were designed to redirect the forward motion of the ball downward.  Despite my cardboard mockups working quite well, however, luan ended up being much too springy a material to reliably keep the balls from bouncing out.  So, I cut our final goals out of cardboard, which has just enough sponginess to keep the balls in.

 

Goal sensing:

Piezo sensors proved too noisy to reliably sense a single goal.  So for our final version, we returned to light-sensitive resistors as a way of sensing goals.  We had moved away from this method for cost reasons, as the light-sensitive resistors embedded in color sensors were quite expensive, but having decided not to sense ball color, the light-sensitive resistors themselves were both reliable and cheap.  For each goal, we positioned a white LED aimed at an LSR hooked into one of our Arduino’s analog inputs.   The ball would fall between the LED and the LSR and create a drop in the voltage seen at the analog input pin.  In code, we used a peak detection method described here to register this drop as a score for either the red or blue team.  Because the mounting for each goal was slightly different, we had to individually set the threshold values for the six goals.

Serial Communication between P5.js and Arduino:

Our final challenge this week was developing a protocol for communication between the p5.js sketch and our Arduino.  Our midterm project used a reliable two-way serial protocol with a handshake method (as described here) to send scores from the Arduino to the p5.js sketch and to reset the score.  For our final, however, we had the added element of setting each of the goal states as either “red,” “blue,” or “off.”  This would indicate that a goal could be scored upon by the red team, the blue team, or neither, respectively.  At first, we hoped all of the logic behind changing goal states could be handled on the p5.js side (where the rest of the game-related logic lived), and goal-state information would be sent to the Arduino as a string.  On the Arduino, the goal state would be used to determine which team’s score was added to when a goal was made, and which LEDs were turned on.

This protocol worked quite well in isolation.   From the Arduino Serial Monitor, we could send strings like “gRRBBRB” to set the goal states to red, red, blue, blue, red, blue, and the Arduino would switch its internal scoring logic as well as its lighting accordingly.  However, once this was incorporated with the rest of the p5.js serial protocol, we began to notice strange behavior: goal states would be sent from p5.js to the Arduino but not change the goal states, or other failures would occur.  I believe this is due to the p5.js sketch timing — and the fact that asking the Arduino for the score (by sending an “x” serially) could happen in the middle of sending a goal-state update.  I am reworking this protocol to avoid this.
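For reference, the p5.js side of that goal-state message can be sketched roughly like this (the function name is a placeholder and the trailing newline is an assumption; the real protocol may have differed slightly):

    // goalStates is an array of six single-character states, e.g. ['R','R','B','B','R','B'].
    function sendGoalStates(goalStates) {
      // The leading 'g' tells the Arduino this line is a goal-state update,
      // as opposed to a score request ('x').
      serial.write('g' + goalStates.join('') + '\n');
    }

    // Example: first three goals red, last three blue.
    sendGoalStates(['R', 'R', 'R', 'B', 'B', 'B']);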

 

Take the Money and Run: ICM Final!

 

 

 

Simon Jensen (link) and I made a game in which you generate and control a personalized bot which travels the world (in your stead).  The project can be found here and the code can be found here.  Below is a run-down of the various elements that went into the project in terms of both technical and design choices:

Server:

Our node.js server performs several functions:

  • serving the game pages to the clients using Express.js,
  • communicating with those clients, relaying player login and saved game information between the client and the database using Socket.io,
  • pushing player and saved-game information to the Firebase database and querying it for saved information (a rough sketch of this structure follows below).
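As a very rough sketch (event names, database paths, and the credential setup below are stand-ins rather than our exact code), the server’s shape is something like:

    // server.js (rough shape only)
    const express = require('express');
    const http = require('http');
    const socketIO = require('socket.io');
    const firebase = require('firebase-admin');

    firebase.initializeApp({ /* service-account credentials and databaseURL go here */ });

    const app = express();
    const server = http.createServer(app);
    const io = socketIO(server);

    app.use(express.static('public'));   // serve the p5.js game pages

    io.on('connection', (socket) => {
      // Relay a login to the database, then send back any saved game for that player.
      socket.on('login', (playerId) => {
        firebase.database().ref('players/' + playerId).once('value')
          .then((snapshot) => socket.emit('savedGame', snapshot.val()));
      });

      // Push updated player / game state from the client into the database.
      socket.on('saveGame', (data) => {
        firebase.database().ref('players/' + data.playerId).set(data);
      });
    });

    server.listen(8080);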

Database:

We are using Google’s “Firebase” database as a service to store game information.

Saving Images:

I had some trouble figuring out exactly how to save images to the database using the p5.js library.  At first, I tried passing the p5 image object to the database directly.  This didn’t seem to work, which I think was because the object isn’t plain, JSON-serializable data of the kind Firebase expects.

This stored several

Because the built-in p5 save() function automatically downloads the result to the user’s computer as a file, it wasn’t much help for getting image data into the database either.
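One way to sidestep the problem (sketched here as an illustration, not our exact code; db and playerId are assumed to exist already, with db an initialized Firebase database reference) is to serialize the canvas into a base64 data-URL string, which Firebase stores happily:

    // Illustrative only.
    function saveSnapshot(db, playerId) {
      // drawingContext is p5's underlying 2D context; its .canvas property is the
      // actual <canvas> element, which can serialize itself to a base64 data URL.
      const dataUrl = drawingContext.canvas.toDataURL('image/png');
      db.ref('players/' + playerId + '/snapshot').set(dataUrl);
    }

    // Reading it back: the stored string can be handed straight to loadImage().
    function restoreSnapshot(db, playerId, onLoaded) {
      db.ref('players/' + playerId + '/snapshot').once('value')
        .then((snap) => loadImage(snap.val(), onLoaded));
    }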

Questions

This project brought up a number of questions about how JavaScript executes in time and how p5.js interacts with DOM elements:

  • Because our game would dynamically query the Flickr API for photos from the cities the player-avatar was visiting, we were making loadJSON calls throughout the game.  As such, we were unable to make these loadJSON calls in a single up-front place, which led to some confusion about function execution order: loadJSON is asynchronous, so any code that depends on its response has to wait for (or live inside) its callback (see the sketch below).
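The pattern that resolves this (shown with a placeholder API key and a generic handler, not our exact calls) is to hand loadJSON a callback and only use the response inside it:

    // Hypothetical example: query the Flickr API for photos of the bot's current city
    // and only use the results once they have actually arrived.
    function fetchCityPhotos(city) {
      const url = 'https://api.flickr.com/services/rest/?method=flickr.photos.search' +
                  '&api_key=YOUR_KEY&format=json&nojsoncallback=1&text=' + city;
      loadJSON(url, gotPhotos);   // loadJSON returns immediately; gotPhotos runs later
    }

    function gotPhotos(data) {
      // Anything that depends on the response has to live here (or be called from here),
      // because draw() keeps running while the request is still in flight.
      console.log(data);
    }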

Arcade Game Update!

Yifan and I have been in fabrication mode as we prepare for user testing of our arcade-style ball-tossing game.  Below is some of what went into the construction work over the past few weeks.  For a bit more about the project’s conception and prototyping, look here, and for our planning process, look here.

Game Case:

The case of the game is made from 1/4″ MDF.  April modelled the front panel in Rhino CAD software, which we then ran through MasterCAM to generate the G-code to run the Techno 40 CNC.  MasterCAM has an incredibly broad feature set, and we no doubt could have spent weeks learning all of it, but as we were only cutting holes (rather than true 3D features), we managed to make it through the process without too much trouble.  Our goals were to be made from 6″ schedule 40 PVC pipe, and we used its outside diameter as the diameter of our goals.

Goal Sensing:

We decided to switch to piezo disc sensors to detect the number of goals made.  This decision was based on a few factors: they are much cheaper (all six of our piezo sensors and several spares equalled the cost of a single color sensor), they simplified the build (in order to make the color sensors work, light would have to be controlled — i.e. the balls would have to pass through a light-controlled tunnel), and the benefit of color sensing was unclear in our first round of user tests.

Goals:

Our goals were cut from 6″ schedule 40 PVC pipe.  This process introduced us to the complexities of cutting specific angles on a round pipe too tall for a single cut on the miter saw.   As we needed to cut from both sides to achieve the required depth of cut, we marked/cut as follows:

  • mark two lines down the pipe at opposite sides,
  • measure the top-of-goal length (2″) down one line,
  • measure the bottom-of-goal length (4-1/2″) down the other line,
  • using a flexible cord (really a length of RJ-11 telephone cable) held taut between the two points, mark the cut line on both sides,
  • cut from one side, aligning the saw with the marked line,
  • rotate the pipe 180° and reverse the miter angle,
  • cut from the other side to finish the cut.

Ball Guides:

Behind each of the goals is a series of walls which guide the ball onto a board (behind which the piezo sensor is mounted) and back down into play once again.  I made a pattern from cardboard using a bevel guide, shaping it to fit behind the goal, then proceeded to make the pieces in luan.  Because I built these in a relatively straight-ahead way — without a detailed CAD or paper drawing — it was and continues to be a process of refining the fit such that the balls neither bounce out of the goals nor get stuck anywhere in the works.  To circumvent the first issue, we decided that the back of the goals should be cloth to cut down on bounce.  Surprisingly, cloth seems to work only so well in this regard, and needs to be quite loose to prevent bounce.  Another possible solution to this problem would be to create interior goal walls with angles that bounce incoming ping-pong balls internally, but not out of the goal.

 

Goal Lights:

Because each of the goals must be able to display its mode at any given time (off, red, or blue), depending on whether it counts any scored balls and whether it directs those points to the red or blue team, we decided to mount RGB LEDs inside the white PVC pipe sections of the goals.  We hope this will be a clearly visible affordance for the red- and blue-team players.  That said, we have little to indicate which player is which at any given time, so we will need to find a solution to that problem.  Perhaps sets of red and blue game-gloves?

Electronics I/O:

Inputs: the Arduino is using its six analog inputs to read the incoming sensor data from the piezo sensors.  We will use a threshold detection scheme to register a hit as a point, and attribute it to the goal’s current team allegiance (red or blue).

Outputs: because each of the goals is using RGB LED strips, which run on 12V, we require 18 (six goals with three colors each) total TIP-120 transistors to be able to fully control these strips.  To switch these transistors will require the addition of an MCP23017 input/output port expander which communicates with the Arduino using I2C.  In our case, it isn’t strictly necessary, as we currently only have red and blue modes for the goals, but it would be nice to have further capabilities down the line, should user testing inspire any additional modes.

 

Final Project Planning

On Planning

How do we allow creative flexibility into a project timeline while recognizing the logistical needs of getting it done?  This question comes up at some point in any project’s maturation from idea to mockup to fully realized creation.  How do we maintain a flexible and evolving understanding of the project’s goals throughout a planning process which by design tries to ‘nail things down’?  Personally, I find that planning — specific and tedious planning — is not a limiting activity, but one that raises questions which ultimately lead to a greater understanding of the project and how to execute on it.  The act of pre-visualization demands a more rigorous mental model of the project than I would otherwise develop and forces decisions to be made.   As long as planning is done early and often, and as long as you are willing to discard previous plans, you will have a greater conceptual and logistical understanding of your project.

Final Idea

For our final Physical Computing project, April and I are planning to continue our midterm project of an interactive ball-tossing game.  After combining feedback we received from our class following last week’s proposal with feature ideas we have been developing, we are left with an outline for a game with a rather broad set of features and challenges.  Narrowing this feature set down into a manageable project will be our goal for the coming week.  There are aspects of the interaction we know we need to improve upon: improving ball return, increasing goal size, and balancing difficulty for players of different skill levels.  There are features which we would like to see implemented: increasing the number of goals from two to six, and dynamic goals (which turn on and off).  Finally, we have several questions we would like to answer through user testing: how do players interact with the projection-based scoring system we currently have in place, and could it be improved upon?  Could we / should we incorporate a storyline element in the style of pinball storylines (i.e. the storyline does not affect gameplay, but is visually interesting)?  Would it be possible to develop a single-player mode?

One of the comments we received in last week’s feedback was that the game is sufficiently complex with only two goals.  We believe, however, that having each goal have two or more “modes” taps into an exciting vein of potential interactivity between player and player, as well as between player and game.   Physically, each goal would have a ring of lights around it which indicates its current mode: active (lights on) or inactive (lights off), as well as which team it belongs to: blue (blue lights) or red (red lights).  When a goal is “active-red,” any balls scored through it increase the red team’s score, and vice versa.  When a goal is inactive, no scores are counted.   This allows the possibility of a single-player mode in which the game responds to the player — i.e. the active goal changes every time the player scores.  It also allows for the possibility of increasingly complex and interesting scoring mechanisms: a blinking goal doubles the score, etc.  In short, we feel that this added complexity in hardware design would be well worth the potential increases in interactivity.

Nothing to it but to do it…

Of course that isn’t quite true, and because we are adding a level of complexity to the goals and are still unsure of the scoring system, I think it behooves us to build soon and use a working model for playtesting.   While I would like to playtest with a cardboard model for longer, build another model, and properly nail down the interactivity before the final project is built, our shortened timeframe doesn’t necessarily allow for that and I believe our current model is a step too far removed from what we’d like to see to use as a proper play-testing model.  Because interaction depends so much on specifics (of timing, physical spacing, and subtle allowances in the goal-lighting), it is difficult to bring our idea forth without fully building it.  That said, much of the coding work is less vital to playtesting, and can happen relatively later in the process.

Below is a preliminary bill of materials and a very preliminary timeline.

Bill of Materials:

Timeline:

 

Stop Motion

A quick stop motion animation:

We made this animation using the stop motion software Dragonframe connected to a DSLR shooting directly down at a table.  The actual field of view is quite small — no larger than 10 or 12 inches across.  Two thoughts:

  • the materials which compose your animation are as important to the final look as is the movement — non-standard choices here would make a much more interesting video.
  • even with the onionskinning view, it is difficult to understand exactly how certain movements will play out.

Pixel Manipulation

Using the p5.js image object’s pixel array, I attempted a number of different image manipulations:

reduceColors():  in lieu of a proper “posterize” effect — one which finds a set of color averages from the value and frequency of each color in the image — this function simply maps each pixel’s color to a reduced set of possible colors.
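A minimal version of that idea (not necessarily the exact code I wrote) looks something like this:

    // Map every channel of every pixel onto a small number of evenly spaced levels.
    function reduceColors(img, levels) {
      const step = 255 / (levels - 1);
      img.loadPixels();
      for (let i = 0; i < img.pixels.length; i += 4) {
        for (let c = 0; c < 3; c++) {            // red, green, blue (leave alpha alone)
          const v = img.pixels[i + c];
          img.pixels[i + c] = Math.round(v / step) * step;
        }
      }
      img.updatePixels();
    }

Calling reduceColors(img, 4) before drawing the image posterizes it down to four levels per channel.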

Original Image:

  

thresholdChange(): this function takes three thresholds (one for each color channel) and, for each pixel, bumps any channel value above its threshold up to 255:
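Again as a rough sketch, not the verbatim function:

    // Push any channel value that exceeds its threshold up to full brightness.
    function thresholdChange(img, rThresh, gThresh, bThresh) {
      const thresholds = [rThresh, gThresh, bThresh];
      img.loadPixels();
      for (let i = 0; i < img.pixels.length; i += 4) {
        for (let c = 0; c < 3; c++) {
          if (img.pixels[i + c] > thresholds[c]) {
            img.pixels[i + c] = 255;
          }
        }
      }
      img.updatePixels();
    }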

Finally, I made a slightly different version of the pointillism effect from class, using a different method to iterate through the pixels.  As in all of my functions this week, speed became an issue and I began to understand why certain techniques (such as using modulo to spread the load out across multiple frames) allow an animation to run much faster despite the appearance being nearly identical.
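The modulo trick, as I understand it, looks roughly like this (the file name is a placeholder, and the dot size and stride are arbitrary):

    let img;
    const stride = 16;   // how many frames it takes to touch every pixel once

    function preload() {
      img = loadImage('source.jpg');
    }

    function setup() {
      createCanvas(img.width, img.height);
      img.loadPixels();
      noStroke();
    }

    function draw() {
      // Only visit every `stride`-th pixel, rotating the starting offset each frame,
      // so the whole image is covered over `stride` frames instead of in a single one.
      for (let i = (frameCount % stride) * 4; i < img.pixels.length; i += stride * 4) {
        const x = (i / 4) % img.width;
        const y = Math.floor(i / 4 / img.width);
        fill(img.pixels[i], img.pixels[i + 1], img.pixels[i + 2]);
        ellipse(x, y, 4, 4);
      }
    }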

Setting up a GitHub workflow:

Finally, I spent some time this week following along with this set of videos, figuring out how to work locally using the command-line tool http-server and how to store code on GitHub.  While it doesn’t appear much different to me now than Dropbox / Google Drive / etc., I imagine that will change once I begin working on more collaborative coding projects.