PORTFOLIO
ALEXANDER BALSON

Welcome

This is my 2015 portfolio: a collection of the work I've done over the last three months in Physical Computing, including the creation of DeeJump, a music skills learning game. DeeJump was created by Joel Tan, Hayden Stapleton, and myself.

Read about my project, DeeJump, to the left. Then read about my design process to the right.

This site is based on the classic idea of a dungeon crawler, so instead of directing you to a link, you will be directed to follow a direction. Enjoy exploring the map by clicking the side panels, using your arrow keys, or WASD keys. If you want to skip somewhere in a hurry, find the map in the lower right corner.

DeeJump

It can be difficult to keep children engaged when teaching music,
so we created DeeJump.

DeeJump.
2015 UQ Physical Computing Exhibition. (2015). DeeJump [Picture]. Brisbane, Qld: Photographer: Epifani, A.

DeeJump brings the benefits of music games into formal music education, extending products like Guitar Hero and Dance Dance Revolution in line with research that shows where those benefits exist and where new ones could be added. DeeJump is an engaging interactive experience, intended to be used in a classroom setting as a cyclical learning activity.

The game is intended for students in grades 3-8, and is designed to fit the Australian music curriculum for these ages. Teachers are secondary users of the game, due to the autonomy the game gives to students. Music and learning in early years formed our design space for the project, as part of the greater physical computing theme of playful and open-ended interactions in everyday life.

The game has five people playing at any one time: four performers and one DJ. Performers jump on large floor buttons to activate the sound of their instrument, which can be synths, drums, or even vocals. Coming towards them down the screen are extended dots that show each player when they should jump on their button. The game aims to reward experimentation and doesn't punish players for trying something outside of the dots; their sound will play regardless, giving them the chance to experiment with rhythm and texture as they add their own part to the performance.

Performers start at a button on one side, and as the song continues, they move down the buttons towards the other side, experiencing different aspects of the song as they go. They then become the DJ, a position where they can control the tempo and different aspects of the sound. They have four aspects to change at first, but as they play with them, more become available.

The game takes inspiration from Guitar Hero in its use of lanes, but instead of a lane for each note on a common instrument, each lane is a different instrument. The inspiration from DDR comes from the use of floor pads. The following works informed our design process, giving us a basis for design decisions:

'Playful Sounds From the Classroom: What Can Designers of Digital Music Games Learn From Formal Educators?' (Duysburgh, Slegers, Mouws, Nouwen, 2015)

'Interactive Music Video Games and Children's Musical Development' (Gower, McDowall, 2012)

'Music-Games: A Case Study of Their Impact' (Cassidy, Paisley, 2013)

DeeJump was designed by myself, Joel Tan, and Hayden Stapleton:

Alexander Balson
alexander.balson@gmail.com
My Blog.

Joel Tan
joeltanthuanewe@hotmail.com
Joel's Blog.

Hayden Stapleton
hstap1@gmail.com
Hayden's Blog.

How It Works

For the technical demo of DeeJump shown at the 2015 Physical Computing Exhibition, the setup used two computers, two MakeyMakeys, two Arduinos, four floor buttons, and over 100 metres of wire.

Game diagram.

MIDI formed the heart of DeeJump. Without MIDI it would have been almost impossible to create our own rhythm game with any kind of reliable timing. MIDI was designed for connecting musical devices, and it gave us the possibilities we needed. The other core life-giver of the project was Max, an incredibly powerful prototyping and development program that can interface almost anything with almost anything. In DeeJump, Max is how we were able to connect all of the technology together with very little latency. To see more about how Max was used in the project, please go right.

The computer running Max and Ableton Live used internal MIDI connections (the Mac IAC Driver) to interface between the two programs. The DJ controls had the internal PCBs of a USB MIDI keyboard installed, which received MIDI output from Ableton. These signals were then sent through a USB MIDI adaptor to the Unity computer.

The game used Unity for its visuals as it was the only system that provided the functionality we needed in combination with a reliable MIDI bridge plugin. To see more about how Unity was used in the game, please go up.

As I have been working with Ableton Live for many years, it was only appropriate that I used it for the project. It can be controlled in many ways by MIDI, which made it perfect for use with Max. To see more about the way Ableton was used, please go left.

After my initial attempts to use the USB MIDI keyboard PCBs for the player button signals proved inconsistent, we had reached a point in our knowledge of Unity and Max where we could attempt an earlier plan of ours: using two computers instead of one running all the software. Each player button was attached to both MakeyMakeys, with the keypresses received by the program in focus on each computer. This way both Max and Unity knew exactly when a button was pressed, which let us process the game logic and output signals to Ableton without much latency.

The tempo changing was controlled by momentary switches in the DJ booth, which were also connected to one of the MakeyMakeys. An Arduino was used in the booth to control the various LED lights installed. The DJ controls used the MIDI knobs and sliders embedded in one of the USB MIDI keyboard PCBs.

Max

The full Max file.

I only started with Max in mid-April, but I very quickly saw its potential to form the basis of the entire project.

The MakeyMakey inputs from the buttons were parsed into the different 'bang' messages that Max uses to trigger its objects; from these bangs, the instruments and drums were activated.

The Max file player parts.
The player parts on the left of the picture show the inputs from player keypresses. The coloured bands are sets of instruments for each song.
The Max file instruments.

The player instrument sub-patches send out MIDI controller messages to Ableton Live via IAC MIDI driver buses, with envelopes applied to slow their attack and release. These messages control the volume of individual tracks in Ableton. The drum sub-patches use Max's 'makenote' object to create a MIDI note message, then send it to the drum tracks in Ableton that are armed for recording (listening for input).
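In code terms (Max itself is a visual patching environment, so this is only a rough C++ sketch using the RtMidi library, with placeholder CC numbers, channels, and envelope timings rather than the project's actual values), the instrument and drum sub-patches behave roughly like this:

    // Rough equivalent of the instrument and drum sub-patches (placeholder values).
    #include <chrono>
    #include <thread>
    #include <vector>
    #include "RtMidi.h"

    // Ramp a MIDI CC mapped to an Ableton track volume up or down,
    // imitating the attack/release envelope applied in the Max patch.
    void rampTrackVolume(RtMidiOut& out, unsigned char cc, bool up, int steps = 16) {
        for (int i = 0; i <= steps; ++i) {
            unsigned char value = static_cast<unsigned char>(up ? 127 * i / steps
                                                                : 127 * (steps - i) / steps);
            std::vector<unsigned char> msg = {0xB0, cc, value};  // control change, channel 1
            out.sendMessage(&msg);
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
        }
    }

    // One drum hit, like Max's 'makenote' object: a note-on followed by a note-off.
    void drumHit(RtMidiOut& out, unsigned char note) {
        std::vector<unsigned char> noteOn  = {0x90, note, 100};
        std::vector<unsigned char> noteOff = {0x80, note, 0};
        out.sendMessage(&noteOn);
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
        out.sendMessage(&noteOff);
    }

    int main() {
        RtMidiOut out;
        if (out.getPortCount() == 0) return 1;
        out.openPort(0);                 // e.g. an IAC bus routed into Ableton
        rampTrackVolume(out, 20, true);  // button pressed: instrument track fades in
        drumHit(out, 36);                // kick drum trigger
        rampTrackVolume(out, 20, false); // button released: instrument track fades out
        return 0;
    }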

To support the exploration side of the experience, the Max instrument and drum sub-patches let players activate sounds whenever they wanted to. To give players feedback about their performance (another important part of the experience), the Max patch needed to be able to 'score' the players. For this, Max listened to the MIDI signals Ableton was sending to the Unity computer and compared them to the signals coming in from the player buttons.

The Max file scoring.

To generate the correct outputs for the RGB LEDs in the player buttons, I found the delta (rate of change) of the score count each time it was updated. This mostly resulted in values between -1 and 1, though sometimes higher. A delta of -1 meant the player was not pressing the button in time with any suggested rhythm, or not at all; a delta of 0 meant the player was playing well to what the visuals suggested; and a delta of 1 or higher meant the player was experimenting and trying something out (a good thing, and what we want!).

These values, like the other values that triggered LEDs in the game, were used to send specific number values to the Arduinos.

The Max file arduino interface.
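Outside of Max, the scoring and feedback logic boils down to something like the following C++ sketch; the update window, thresholds, and the number codes sent to the Arduino are illustrative guesses rather than the patch's exact values.

    // Illustrative sketch of the scoring logic (thresholds and serial codes are guesses).

    // Each update window, the score changes by (player presses - suggested notes).
    int scoreDelta(int playerPresses, int suggestedNotes) {
        return playerPresses - suggestedNotes;
    }

    // Map the delta onto the LED colour scheme: red = missing the suggested rhythm,
    // green = playing in time with it, turquoise = experimenting beyond it.
    enum class LedColour { Red, Green, Turquoise };

    LedColour colourForDelta(int delta) {
        if (delta <= -1) return LedColour::Red;
        if (delta >= 1)  return LedColour::Turquoise;
        return LedColour::Green;
    }

    // Reduce the colour to a small number code for the serial link to the Arduino.
    int serialCodeFor(LedColour colour) {
        switch (colour) {
            case LedColour::Red:       return 1;
            case LedColour::Green:     return 2;
            case LedColour::Turquoise: return 3;
        }
        return 0;
    }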

For the tempo control, keypresses from the MakeyMakey were sorted and used to trigger control messages that were sent to Ableton. This section of the patch also controlled starting and stopping.

The Max file tempo control.

To change the output from Ableton as the game went on, the patch would count the beats and trigger different sets of instruments at the right time.

The Max file song changing.
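Conceptually, that part of the patch is just a beat counter with section boundaries, along these lines (the beat counts and instrument sets here are invented for illustration):

    #include <array>

    // A song is a list of sections; each section starts at a beat count and
    // enables a particular set of instrument sub-patches.
    struct Section { int startBeat; int instrumentSet; };

    constexpr std::array<Section, 3> kSections = {{
        {0,   0},   // intro: first group of instruments
        {64,  1},   // middle section
        {128, 2},   // full arrangement
    }};

    // Given the running beat count, return which instrument set should be active.
    int instrumentSetForBeat(int beatCount) {
        int current = kSections[0].instrumentSet;
        for (const auto& section : kSections) {
            if (beatCount >= section.startBeat) current = section.instrumentSet;
        }
        return current;
    }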

Unity

The Unity part of the game used a command line MIDI bridge program that allowed us to spawn objects and change variables in the program based on the key value of notes received.

The MIDI signals received by Unity were sent roughly 2.5 beats ahead of time, to make sure the beat lines reached the players' feet in time with the player parts.
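The lead time is just a tempo conversion: 2.5 beats at 120 BPM, for example, is 1.25 seconds. A minimal sketch (the note-to-lane mapping shown is hypothetical, since the actual key values aren't documented here):

    // How far ahead Ableton sends each note so it reaches the players' feet on time.
    double leadTimeSeconds(double bpm, double beatsAhead = 2.5) {
        return beatsAhead * 60.0 / bpm;  // e.g. 120 BPM -> 1.25 seconds of lead time
    }

    // Hypothetical mapping from a received MIDI note number to one of the four lanes.
    int laneForNote(int midiNote) {
        return midiNote % 4;
    }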

After a bit of help, a beat detection script was written for the game to sync the beat lines coming towards the players, but after experiencing timing issues at some tempos, we chose to pass the tempo in as explicit float values instead.

Ableton

From my experience with Ableton Live I knew that we could control everything we needed to with MIDI signals (which another program could provide). It gave us many options for the sounds that came out from a player's press of a button. Using sequencers would have required players to perform perfectly for an in-sync, fitting sound to come out; in the end, players were controlling audio track volume envelopes via their keypresses into Max. Ableton also gave us the opportunity to implement the DJ controls aspect of the project.

Although in the end we had to limit the number of songs used, there were initially seven songs implemented in Ableton, split into parts and ready to be controlled by players. Overall this was around 25-45 minutes of music depending on the tempo, a session length suggested by the teachers interviewed during the design process. Each song was split into parts, with at least the four parts for players to control separated from the other tracks. Some tracks were included as cues, continuing to play regardless of player input, to demonstrate the tempo.

Play/stop, tempo, track volumes, and effect parameters were all controlled by Max 'ctlout' objects. For more information on the use of Max, please go right twice.

Exhibition

The DeeJump creators.
2015 UQ Physical Computing Exhibition. (2015). DeeJump Group Members [Picture]. Brisbane, Qld: Photographer: Epifani, A.

At the 2015 UQ Physical Computing Exhibition, DeeJump was played by members of the public, school children, other UQ students and members of the UQ community.

The only functions not included in the exhibit demonstration were the DJ effect changes and the game's automated instruction for players to move to another button. Even without these features, users could see the more integral elements of the interaction we designed, and often, after hearing a verbal explanation, it became clear to them what the missing functions were and how they would have worked.



We learnt so much from our experience at the exhibition. For my personal reflection of the experience, please see Report.

Kickstarter

To demonstrate the project, here's our Kickstarter video:



Tan, J. (2015). DeeJump - a Design Computing Studio Project [Video]. Available from https://www.youtube.com/watch?v=-TKLfXMO2Co.

Design Process

Throughout the semester, I was thoroughly buckled into the rollercoaster ride that is design. From here continuing to the right, you will see the process I took in the course, from inspiration through to final product.

As you will see, the process has been split into major areas:

Inspirations

2/3 to 15/3

The first task of the course was to explore sci-fi inspired playful and open ended interactions in everyday life, a perfect first task for a course I thought would be very creatively based. I started by watching my favourite sci-fi classics, and finding the interactions that jumped out to me as a starting point for an idea.

We were given the course's theme at this stage: 'playful and open-ended interactions in everyday life.' That wasn't all, however; there were extra areas: healthy ageing, learning in early years, street computing, and social presence and awareness. Because these themes were introduced together, I inadvertently combined sci-fi, playful and open-ended interaction, and learning in early years into every idea I had. This had an interesting run-on effect, which I discuss in Prototype.

In the end I generated a wide list of ideas, ranging from real-life Force skills (Star Wars) to the use of 3D grids for music creation. The concept I introduced to the class was 'Imiwa?', a project to teach users about Japanese kanji. To see more about Imiwa?, please go up.

I loved the idea of what I'd created, and I aim to build it one day. At the time, this idea had stuck in my head as one that could be used for a final project, and to this day, I think Imiwa? could have been a better concept for the course than DeeJump, purely because of the more interesting physical computing interaction. It was the area of music that won me over in the end however.

Imiwa?

An image of the concept

'Imiwa?' is a 4D learning experience for children, where they discover the meaning of Japanese characters through play and experimentation. Imiwa loosely translates to 'what does it mean?' Basically, through experimenting with characters they don't know, they affect and influence a large interactive nature scene, learning what each character means in the process. Sometimes it will rain on them, sometimes it will be cold or hot, and sometimes the lighting will change for example.

The 4D aspect makes it much more memorable and exciting. The concept is designed for the level of Japanese taught in grades 1-4, but everyone could learn and have fun.

In Japanese, the characters borrowed from Chinese, called kanji, are notoriously difficult to learn, especially for kids. Through a 4D experience with no English explanation for each character, Imiwa helps to build a photographic memory link between a kanji and its meaning. There's also no easy way to know how a kanji is pronounced, so Imiwa reads each kanji aloud as it's used.

Some kanji are pictographic and easy to imagine, but most are quite abstract. Imiwa uses a range of kanji, introducing children to the patterns that are common and helpful when learning kanji.

The first important part of the concept is the kanji blanks that children trace over to 'activate' the character. These blanks have only faint lines of the character to start with, so children must trace over them with a special pen. The blanks then light up to show they are activated and ready to be placed into the centre of the tsukubai.

A tsukubai is usually a stone sculpture fountain with a large opening in the centre, but here there is no water, and the centre is a dark space where the kanji blanks go in. After a blank goes in, its lights go out and it disappears.

At this point, whatever is represented by the kanji appears in the nature scene, and in some cases the 4D element kicks in: making it rain, blowing hot, cold, or room-temperature air, or changing the lighting, for example. In this case, yama 山 (mountain) would create another mountain in the distance, and the tsukubai would say 'yama' to reinforce the pronunciation.

Some of the kanji used by Imiwa combine to form other Japanese words, so accidentally or on purpose, kids can enter two kanji at the same time to see what the combination does.

Imiwa is designed to encourage many children to play and discover at the same time, collaborating to create a thriving natural space, or otherwise (using the kanji for fire over and over again, for example, could destroy everything).

To stop it from getting too crazy, and to keep it fresh as new kids play with the scene, elements added to the scene disappear after a few minutes.

Imiwa could be used in a library, school, or at a Japanese event: anywhere kids are learning Japanese, or where event organisers want an engaging Japanese experience for children (and no doubt adults too).

My inspirations were Ubik, for its main plot device (Ubik, a product that changes form but always stays the same, like the kanji and scene representations of the same thing); The Fifth Element, for its interesting user displays that only appear for 5 seconds in the background; and Independence Day, where the characters had to experiment with alien technology that was completely new to them.

Forming Groups

16/3 to 29/3

Weeks 3 and 4 saw the class take part in World Cafe activities, Arduino workshops, group formation, and final concept ideation.

During the world cafes the class as a whole travelled around to develop various ideas, and in some cases good ideas came out of it. My experience at various tables where not much happened drove me to decide on the area I wanted to work in. I spent a lot of time at the music table, working with the inspirations there, getting very excited at the idea of making a game that used a rhythm mechanic.

The technology workshop was brilliant; it very quickly demonstrated the potential and function of Arduino. I saw an RGB LED in the kit and instantly felt the need to get a turquoise colour out of it. This simple demonstration gave me enough knowledge of Arduino to make incorporating it into DeeJump much easier later on.

Arduino LED testing.

Week 4 saw the beginning of the long journey to the exhibition with group formation. After forming the group based on a common desire to work with music, we began developing our concept for the (very) close proposal presentation due date.

The process of creating the concept of DeeJump came from the group looking at existing inspirations in front of us (music concepts from week 2), and brainstorming like crazy. The elements from these inspirations that really grabbed us were collaboration and rhythm, two things that remained in DeeJump until the end. My idea of a mistake-detecting rhythm skipping rope turned into laser lines on the ground, which became lines from a projector. To detect the correctness of jump timing, the skipping rope itself became buttons.

Due to my misconceptions about the course, we started with the concept and the idea, and then afterwards found a context and purpose. Looking back, of course this isn't what was wanted by the teaching staff, but the course felt… Different. So it seemed like anything could have suited at the time. I thought learning was the only area that could fit a game like the one we ideated, so we set out to bring an educational side to it.

Proposal

30/3 to 12/4

The proposal week started with the first presentation of DeeJump. To see the slides from the presentation, please go down. While generating our concept, we struggled with the requirement to have a more specific purpose for the design. We brainstormed very effectively, churning out ideas fuelled by the inspiration of the early weeks, but after finding that there were more specific problem spaces and research questions, we had trouble making our ideas fit. It meant that our proposed concept had a lot of issues the tutors didn't like, but we didn't find out about that until much later. Personally, I hadn't yet made the connection between the course's themes and strict academia, because my own artistic influence and misreading of the course clouded my vision.

Completing the proposal presentation meant we could start setting our minds to building the game, an intimidating prospect. We were fairly confident that we could demonstrate most of the game's features at the final exhibit, but to be safe we worked on the most important features of the user interaction first.

Our initial plan for implementation used Flash at its core, as it had MIDI libraries to interface with other MIDI devices. My duties in the build were handling the songs, setup of Ableton, hardware, and the Flash/MIDI logic.

The way we figured the game would work early on.

These other MIDI devices included a USB MIDI keyboard of mine, a perfect basis for the DJ booth controls, and a music program called Ableton Live to output music. Flash would need to interface with both of these, so my first move was to get the MIDI libraries working in a Flash file. The benefit of Flash was its options for visuals, and in the end Flash was fine; the MIDI libraries were not. The most recent had been created in 2010, and it seemed like there had been some kind of MIDI/Flash falling out around then that signalled the death of any updated interface between the two systems. I tried a lot of strange things to get the MIDI bridges to work, but nothing worked for longer than a second (caused by freezing Java programs).

During this time trees decided to invade pipes at my house, resulting in an incident, which required a lot of my time unfortunately.

Presentation

Here you can find the slides and screenshots used in the Proposal Presentation:

Google slides

Starting the Build

13/4 to 26/4

Continuing the focus on the most basic user interaction first…

After the necessary abandonment of Flash, the limited music program knowledge of the teaching staff (not your fault!) caused a brief 'is this the end?' moment. Without anything for the game's most basic functionality, we wouldn't have been able to demonstrate even the smallest part of the user interaction that we designed, so I set out on a mission to find something, anything.

While experimenting with Ableton, I noticed a section in settings that I had ignored for the previous 6 years of using Ableton Live.

The mysterious section in the Ableton preferences.

In an act of desperation I followed the link to the Max website, and discovered what seemed like exactly what we needed (and it was, 100%). It really seemed too good to be true, a program designed around MIDI connectivity and interfacing (almost) anything to (almost) anything. I toned down my initial excitement and prepared for disappointment, to prevent any further crushing of dreams. As it turned out of course, Max was far beyond perfect. Not just for the project, but for everything I enjoy. I downloaded Max and began working through a few tutorials.

It absolutely blew my mind.

It was programming as I had never experienced it before. It was the same actions and logic paths I would write in code, but now appearing on screen much faster, and connected with cables. It was like the magnetic poles switching around the Earth: the way I'd learned to think about programming had suddenly been forced to flip upside down. My first few experiences using the program were very tough. It was such an alien environment to develop in that using cables instead of methods with arguments took a while to feel comfortable. After the basic Max tutorials I was hooked, hooked and convinced that it would carry us to the end.

As my group hadn't worked too much with music or MIDI hardware before, they couldn't share my excitement at the same level, but they did like how the game had a chance to survive now. As I went through the tutorials with MIDI, it occurred to me how Max could allow us to have the visuals and sound coming from the one computer, greatly simplifying the setup we initially devised.

Of course, at the same time, dropping Flash left a hole in our build plan. After about half the class and the tutors suggested Unity, we were able to find a fairly recent MIDI bridge program and tried it out. I gave Unity a go, but learning how to use both Max and Unity would have been too big a task for me, so Hayden eventually took over Unity duties.

I worked on getting familiar with Max and another vital part of the experience: player buttons. The buttons we used were from 2014, and they happened to be loud when jumped on. To this day I defend the distinctive clack of the buttons. The games DeeJump extended on (Guitar Hero and DDR) both creak, clack, click, and thud as they are played, and without the clack sound, DeeJump would have felt less satisfying to play for everyone involved.

At home I had an old USB MIDI keyboard with consistency issues on some keys, so I decided to take out the PCBs inside and repurpose them for the game. The internals would let us interface with a computer, and send MIDI in or out through a USB MIDI converter to another computer. The keyboard's key contacts would let us trigger a specific note with a player button by hacking into the connections across the PCB, or so I hoped.

The infamous USB MIDI keyboard.

As some tutors know all too well, my battle with this particular piece of hardware was long lived. The MIDI signals we could send were convenient, but activating that signal was not.

The contacts of the keyboard with wires soldered on.

To activate a note, one circuit had to be completed, then the next, and then in reverse order to de-activate the note (a necessity). I had a bit of experience working with electronics before this, but I hadn't ever dealt with a situation like this. It was suggested that I use an Arduino to create a circuit with timer chips to close the circuits in the right order. Then began the long journey of timing chips, transistors, resistance, and futility. With a lot of help from a tutor, I tried at least three different circuits to time the circuits closing correctly using timer chips and/or transistors. Eventually, by hacking into the ribbon cable that connected to the main PCB, and with a physical delayed switch using a spring, we were able to get the button presses sending out MIDI signals from the keyboard.

Various stages of battle with the keyboard.



The double switch button.

We were given a powerful Mac Mini to use at this point, which inspired us to deviate from the initial two-computer implementation plan and put everything on this one Mac. This is how we thought the project would continue, right up until Back Tracking, when a watershed moment made a lot of my early work with the USB MIDI keyboard pointless. Oh well, it was all for the experience.

Prototype

27/4 to 3/5

The marks for the proposal presentations came out during this time, and we were shocked (which then shocked some of the teaching staff). As discussed earlier, the confusion about what the course wanted came back at us with a vengeance. It was a pretty tough time for me; I felt personally responsible, as I'd had such a hand in the creation of the idea, and I'd suddenly found out that I'd made some huge mistakes.

While I was experiencing a fairly huge amount of shame, the core experience was put through the wringer. Thankfully, Pete helped us get back on track by giving us the details of where we went wrong and how to fix it before the prototype presentation.

Our approach to background material had to change completely, to focus on papers and academic research. I set out to find academic material relevant to our problem space and found three papers that were directly related. This of course required an overhaul of the project and interaction to reflect what the papers suggested.

The three papers that helped us were:
'Playful Sounds From the Classroom: What Can Designers of Digital Music Games Learn From Formal Educators?' (Duysburgh, Slegers, Mouws, Nouwen, 2015)
'Interactive Music Video Games and Children's Musical Development' (Gower, McDowall, 2012)
'Music-Games: A Case Study of Their Impact' (Cassidy, Paisley, 2013)

They were able to give us direction and guidance within the problem space we were designing for. During this time the focus of the project changed drastically. The base interaction remained the same (jumping on buttons to make sound come out), but the other parts were adjusted or removed to fit. The DJ controls changed to be released in groups, and the game would push for more exploration through a change in game mechanics, with a strong sense of autonomy given to players. An aspect that only had to change slightly was the feedback at the players' feet, going from a simple red=bad, green=good scheme to include a third colour denoting exploration.

These big changes occurred only days before the Prototype Presentation, so it was a tough few days. For someone who had never had to look into the world of papers and academia, it was a trial by fire. For the presentation we would have a hardware and a software prototype, and the hardware prototype was my responsibility. I was able to demonstrate a player button sending a MIDI signal to the USB MIDI keyboard, which sent the message to Max, which parsed the MIDI and triggered a synth, kick drum, and snare drum in Ableton. This demonstrated the core technology of the experience working as it should (albeit with the old, inconsistent double switch button).

I felt a lot better after the Prototype Presentation, with our project in line with what the teaching staff wanted (finally).

To see more about the Prototype Presentation, please go down.

Presentation

Here you can find the slides and screenshots used in the Prototype Presentation:

Google slides

Build Complexity

4/5 to 10/5

After the drama of a presentation week that featured heavy iteration and change, it was a relief to have the project's interaction clearer and more correct. For the build process, it clarified what we required and what we would need to test.

Having the requirements of the build become clear caused the complexity of the project to dawn on us. In this picture you can see how we figured it would all work at the time:

A later iteration on the way the game would work.

As the only one with knowledge of Max, I started the process of build-prototype-test-iterate-repeat for the different sub-patches of the system. Beginning with the most core element of the interaction, player button presses causing sound to come out, my first task was to set this up in Max. In the following video you can see how testing went: controlling the mouse and keyboard with one hand, while using the player button with one of my feet. I demonstrate how the music continues over time, with the button presses activating the sound at any time. This functionality was crucial to the important exploration aspect of the project.



During the build process, as we ran into inevitable issues, our implementation plan changed (about once a week). Each time we reached another level of experience with the new technology we were using (Hayden with Unity, I with Max), we saw more and more work ahead of us, and the incredible complexity of it all. My work in Max formed the core of the experience, and as such I refined the sub-patches I built often, to prevent them from breaking during use and to keep them in line with the experience we designed. Click here to see more on how I used Max, and the enormous final patch.

At this time, thanks to the tutors, we were able to interview some education students (future teachers) to get their feedback and input on the project. Joel and I introduced the teachers to DeeJump, then asked for their opinions on specific points of the experience. Unfortunately none of the students we interviewed were music teachers, so they couldn't give us feedback on music specifics, but they did give us detailed and actionable feedback about other elements of the experience.

Interviewing teachers.
Education Student Interviews. (2015). Introducing DeeJump [Picture]. Brisbane, Qld: Photographer: Tan, J.

They gave us great information regarding the use of something like DeeJump in a classroom, in terms of the length of a session and how they would use it during a semester. We used this information to adjust the experience so it fit more closely with what they suggested, as everyone we interviewed gave very similar responses. The result was an intended cycle of 25-45 minutes of music (depending on tempo). The teachers also suggested that the game's experience cycle (from player to DJ) move children from one side of the buttons to the other, exposing them to all the different elements of the sound before reaching the DJ position.

An issue that any teacher deals with is naughty children, and the responses we got from the interviews justified our decision to leave control of the children to the teachers. Even after discussing options for the game managing naughty children in different ways, the teachers felt that to keep their sense of authority, they needed direct control.

From user testing with the buttons during this time and before, we realised that jumping up to land on the buttons took far too much energy, so we devised button frames to encapsulate the moving parts and give players somewhere to put their feet when they weren't pressing the button. These button frames also gave us a way to display another vital element of the user experience: instant feedback. The instant feedback of the game in the original concept was delivered by screens under the players' feet, but through iteration became RGB LEDs at either side of each player's button. The frames gave us a place to mount these LEDs, and to route the vast amount of cable required for the buttons and lights.

Back Tracking

11/5 to 17/5

So, after many weeks of battling the USB MIDI keyboard's inconsistencies, late in the project we had to backtrack. We had discussed using MakeyMakeys at the start of the project, but their limitation as keyboard devices seemed too inconvenient at the time: only the program in focus could receive the keypresses.

Only at this point of the project did Hayden have enough knowledge of Unity, and I enough knowledge of Max, to realise that our original plan involving two computers and MakeyMakeys (one MakeyMakey for each computer) was going to be the best plan. I tested two MakeyMakeys hooked up to a single switch and it worked fine, a lucky convenience. This of course made all of my early weeks battling the keyboard pointless. I was very regretful about the time wasted at the start of the project, but it was all going to be worth it to get a consistent experience during the exhibit.

One computer would run Unity, with Unity in focus to receive keypresses. The other computer would run Max and Ableton, with Max in focus to receive keypresses. The player buttons were hooked up to both MakeyMakeys at the same time. The DJ booth's USB MIDI keyboard PCBs and USB MIDI adaptor would be the bridge between the two computers, a very low latency, reliable solution. At this point, our implementation plan looked like this:

An even later diagram of how the game would work.

During this time we completed the construction of the button frames after (super safe) testing of side platform distances to determine how much of the button should be exposed. The amount of exposed button was influenced by user safety and by giving both adults and the target audience a comfortable stance when not pressing the button. Although students were set to come to the exhibition, we knew that most of the users would be grown adults, so the button frames were made to accommodate them the most.

Testing side platforms on the buttons.

Working on the player buttons was important, but so was the tempo changing experience for the DJ. The Unity file needed to adapt to changing tempos, so thanks to Jason, it was ready to estimate the tempo of the MIDI signals it received. The tempo changing would need to be controlled by Max, so I set about controlling the tempo in Ableton with MIDI signals. Ableton gave me two main options for controlling the tempo from Max; the first was sending a signal to Ableton every beat, relying on Ableton's tap tempo function to detect the BPM (beats per minute).
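In rough terms, that tap tempo approach amounts to sending a MIDI message mapped to Ableton's Tap Tempo button once per beat at the desired speed. The sketch below uses the RtMidi library, and the CC number is only a placeholder for whatever the actual MIDI mapping was:

    #include <chrono>
    #include <thread>
    #include <vector>
    #include "RtMidi.h"

    // Send a 'tap' once per beat; Ableton's tap tempo then settles on the BPM.
    void tapOutTempo(RtMidiOut& out, double bpm, int taps) {
        const std::chrono::duration<double> beat(60.0 / bpm);
        for (int i = 0; i < taps; ++i) {
            std::vector<unsigned char> tap = {0xB0, 64, 127};  // placeholder CC mapped to Tap Tempo
            out.sendMessage(&tap);
            std::this_thread::sleep_for(beat);
        }
    }

Below is a video that demonstrates the results of my process: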



When I wasn't in the workshop, I was often using TeamViewer to access the Mac Mini, experimenting with different ways in Max for the players to activate music with their button presses. This resulted in many, many crashes of both Ableton and Max, as I often accidentally created MIDI feedback loops. Part of these experiments included using Max for Live objects, Max patches that Ableton can use natively. From all of these experiments, only two ways of activating sound were consistent and intuitive: controlling the Ableton track volume level for instruments and vocals (ramping up and down with a button on/off), and triggering drum hits a single time for each button press.

It was during this time that I also sketched out and developed a more refined user experience plan, to make the most of the cyclic interaction the teachers had suggested to us the week before. After finishing the sketch I had a much better idea of the way users could move through the interaction to their benefit, and how to ensure that all users got equal time playing the game (something the teachers pointed out as very important).

A diagram of the cyclic interaction.

Two Weeks Left

18/5 to 24/5

As DeeJump is a music game, it needed some music! Early in the semester it was clear that using songs that were not my own would be very difficult. To get a song ready for use in the game, I needed access to all the individual parts of the song so that players could interact with them. Over the semester at home I had built a collection of music to choose from for the game, drawn from my own music and two outside works. These songs were remixed to make them a suitable length, remove any boring parts, and split out the individual parts.

With two weeks left, I came into the workshop to sit down at the Mac for a long time, adding all the songs into the one file. The following video shows an hour's work in this process (don't worry, it doesn't go for an hour). I swap between Max and Ableton often, testing the effects of player keypresses as I set up the MIDI controls in Ableton.



After a long time adding music to the game, Hayden and I worked in the workshop tweaking the timing between Unity and Ableton. With the game's elements at a stage that made it playable (to a degree), we started to set up the game often. These first few times setting up the game really brought home to me the amount of cable and space required to play the game.

An early set up of the game.

As I was in charge of the hardware, I was in charge of organising the cables, and I essentially had to set up a small concert's worth of cables and wire each time we set up. Whenever we had the game set up, we would work together to fix timing issues and discuss what needed to happen next. We were successful in our aim of completing the most important elements of the experience first, with the game able to demonstrate its basic aspects.

My next move was to implement the player feedback system, which entailed creating a scoring system in Max. A player feedback/scoring system was vital, as it was a big part of our intended experience and an important part of an effective learning tool according to our research. It would have been possible to implement this in Unity, but then we would have needed to interface an Arduino with Unity, a task much less simple than interfacing Max with an Arduino. I was able to quickly build, prototype, and test a scoring system, comparing inputs from player keypresses and suggested notes to generate a score. Using a delta value of this score I was able to trigger different colours in the RGB LEDs, in line with the red=bad, green=on time, turquoise=experimenting scheme we decided on. To see more about this system, please see the Max page.

Almost Ready

25/5 to 31/5

The game!

With the scoring system working, I set about the next hurdle, interfacing Max with Arduino to display information to users. Luckily this is something the program is made for, and very quickly I was able to test scoring with RGB LEDs on a breadboard. This is also where I tested running 8 LEDs in parallel using one resistor per colour. Thankfully it all worked perfectly:

Parallel LED testing

The Arduino code used in the project was incredibly simple: pins turning high or low depending on the number received through the serial connection.
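A minimal sketch of that idea is below; the pin numbers and number codes are placeholders, not the project's actual values:

    // Read a number code from the serial connection and set the RGB LED pins
    // high or low accordingly. Pin numbers and codes are placeholders.
    const int RED_PIN = 2;
    const int GREEN_PIN = 3;
    const int BLUE_PIN = 4;

    void setup() {
      Serial.begin(9600);
      pinMode(RED_PIN, OUTPUT);
      pinMode(GREEN_PIN, OUTPUT);
      pinMode(BLUE_PIN, OUTPUT);
    }

    void loop() {
      if (Serial.available() > 0) {
        int code = Serial.read();
        // 1 = red (missing the rhythm), 2 = green (in time), 3 = turquoise (experimenting).
        digitalWrite(RED_PIN,   code == 1 ? HIGH : LOW);
        digitalWrite(GREEN_PIN, (code == 2 || code == 3) ? HIGH : LOW);
        digitalWrite(BLUE_PIN,  code == 3 ? HIGH : LOW);
      }
    }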

Max also let me test the elements of the DJ booth user interface on a breadboard. These elements were the tempo selection lights, tempo flash light, tempo change available light, and DJ controls available lights. These lights were vital elements of the interaction, as users and DJs needed to know about the current state of the system and how they were performing. All of these LEDs were turned on and off by logic within Max. To see more of how I interfaced Max with Arduino, please see the Max page.

In between setting up and testing the game, Hayden had been hard at work constructing the frame and panel for the DJ booth. With the panel ready for LEDs, I made small retroreflectors from cardboard and aluminium foil so that one LED in each could provide sufficient light. The amount of wire necessary was intense, and I ran out twice in a matter of days. To finish up the internals of the booth, I installed buttons for tempo changes, which worked with the MakeyMakey attached to the Max computer. The buttons used felt amazing; they are momentary switches that feel like they were designed by some kind of deity.

There was a Friday night during this time when Jason helped us enormously. From testing the game fully set up, we noticed that the lines coming down towards the player would oscillate in and out of time with certain tempo choices. This was because Max worked with float value tempos, while Unity was detecting the beat only in integers. Jason was able to help us convince Unity to play nice, using float values for its tempo without needing to detect the tempo anymore.

I had the scoring/player feedback system working on a breadboard, but to make it a reality in the button frames I needed to prepare the LEDs for wiring up. This meant a few hours soldering small sections of cable onto the legs of the LEDs, absolute hell. But it was all worth it to see a pile of 32 RGB LEDs ready to be used. I installed one player's worth of lights into a button frame and tested them to make sure they gave correct feedback.

Outside of the work mentioned above, I spent a lot of time during this period tweaking the timing of the lines coming towards the player. Timing adjustment was done by altering the timeline position of MIDI notes in Ableton, which changed when they were spawned by Unity. As timing is to a rhythm game what air is to people, this was the work we prioritised over everything else when an issue occurred. I had a very high standard for the timing; it was such a fundamental aspect of the experience we designed that any noticeable problems were completely unacceptable. In this video you can see what the timing adjustment/testing process looked like:



This standard caused a lot of heartache, because every time we set up the game it seemed like I had to tweak the timing. The upside is that I was well practised in adjusting the timing for the exhibition, but at the time it was incredibly frustrating.

Exhibition Week

1/6+

The exhibition week started with another course's presentation, not so well timed, but it had a funny result when we had people from the class test our game in business wear while tired and relieved:



I installed and tested the rest of the scoring lights after this point, which took some time as I had to organise the sheer amount of wiring involved. After more and more testing, we had to take our projects to the edge and sit tight until the next morning.

The setup on exhibition morning was intense: there were a few oversights I hadn't had time to catch beforehand that needed to be solved, I was the only one who understood the wiring and could set it up, the timing needed to be adjusted, and certain things in the Unity code weren't working the way they should have.

We were ready to go just a little while after the official opening time of 12pm, ready for some school kids… or so I thought. No matter how physically unbreakable the game's construction was, the students who tested our game first completely broke a large portion of the game's songs. This happened because of limitations in the way we built the game: because we allowed Ableton parameters to be changed en masse, there was no easy way to reset them to a normal, working configuration. With gentler usage it might have been alright, but these students were not gentle. To recover, I had to disable the non-tempo-changing parts of the DJ booth and cut back on the number of songs used to make sure they stayed consistent. As I guessed, we never had enough time to fix the game properly in between visits to the project. You can see the controls in question at the top of the following picture:

The DJ booth.

I feel personally responsible for the failure of the DJ controls, both in hardware/software and in design. They were a painful remnant of the initial misapprehension about the course's aims, and such an important interaction within this fundamentally flawed design. Perhaps with different songs (ones written specifically for the game), and with much less choice of effects, the DJ aspect would work, but for our purposes it failed. The school kids responsible were much older than our target audience (grade 11 instead of grades 3-8), but I feel it may have ended up the same. Something of note from our experience with the school kids was their complete lack of any music knowledge whatsoever.

Regardless, the core interactions of the project were fine, thanks to the group's and my own focus on the most important parts first. The timing was working really well, the right sounds were coming out when a player pressed a button, and the DJ booth was able to change the tempo. I still feel regretful about the DJ controls, but seeing the rest of the project work and be enjoyed was very satisfying. Click here to see more about the exhibition.



I taught my teammates how to start, stop, and restart the game so I could see the rest of the exhibit for myself, but they didn't give me a chance to escape often, so I didn't get to see all of the exhibits. The exhibits I did see, however, made me feel a little disappointed in my project. With all of the things DeeJump offered, it still boiled down to buttons and knobs, when there were legitimately futuristic devices in the exhibition hall. DeeJump was wires and solder, while some exhibits were almost entirely wireless. Electromagnetic radio waves are just as physical as a steel wire, but exponentially more interesting. In trying to fit some of the themes of the course, my group and I moved further away from a more innovative, and possibly more rewarding, match to the studio's theme. This is why I feel that 'Imiwa?' (my concept from the week 2 inspirations) fit the themes of the course better than DeeJump did.

While my regrets about the project will last for a while, I will never regret the journey I took in designing, prototyping, and building a project as part of a group of only three in effectively 8 weeks. I know a lot more about myself and the way I work; my organisational skills were honed a bit more; but most importantly, I learned so much that will help me further down the track: clarifying as much detail as possible before launching into something, preparing for things to break in all manner of ways, having many backup plans, and accepting that the simplest solution can sometimes be the more effective one.

The biggest outcome of the course for me is the discovery of Max. With this program I can do amazing things in ways I'd never dreamed of before, allowing me to explore my own projects, hopefully become skilled with it, and use it in industry. As an example, on the morning before a due date, I needed a productivity tool that read out the time, so I made one in Max in about 5 minutes.

A time reader made in 5 minutes.

About Me

I'm Alexander Balson, an aspiring designer, currently in my 3rd year at UQ. Over my time I've picked up skills in many areas:

My goals are to work in the fields of industrial design, interaction design and prototyping.

My specialty programs/languages are Ableton Live (music), Max (prototyping), Photoshop, and HTML/CSS/JavaScript. I'm also well versed in Illustrator, Flash, and Pro Tools.

My style and taste are a little different (as you can see from this website). It would be my goal to work in a design industry where I could bring my personal style to the table. In the past I've created other websites that explore the different directions I feel web navigation can go. 2013 saw the creation of my 'Bird' portfolio page: Release the birds. Bird and Dungeon Crawler websites © Alexander Balson, 2015.

Images used in the website backgrounds:
Abu Black Marble [Picture]. (2011). Retrieved from http://hgstones.com/shop/abu-black-marble/

Grigio Carnico Marble [Picture]. (2013). Retrieved from http://www.italianmarbles.net/italian-marble-types-price-flooring/grigio-carnico/

Statuarietto [Picture]. (2015). Retrieved from http://hgstones.com/shop/statuarietto/

Contact

Please feel free to contact me at any time,

I can be contacted at:

Alexander.balson@gmail.com

Report

Click here to view my 2015 DECO3850 Report

Site Map

The site map.

  • Home
  • About Me
  • Contact
  • Report
  • DeeJump
  • Exhibition
  • How it Works
  • Kickstarter
  • Design Process
  • Inspirations
  • Forming Groups
  • Proposal
  • Starting the Build
  • Prototype
  • Build Complexity
  • Back Tracking
  • Two Weeks Left
  • Almost Ready
  • Exhibition