
Royal Conservatory of Music -
"Catch the Egg" Game
Project Type:
Industry collaboration
Project Length:
5 weeks
How can the learning experience of the "Catch the Egg" game be improved to bridge the gap between the app's two main audiences: kids and their adults?
Focus Question
TEAM:
Ezgi Cokuysal
Pu Vivian Huang
Saba Samizadehyazd
Nadia Khvan
MY ROLE:
User Experience Designer
Facilitating design sprints
Making final design decisions
Writing user testing discussion guide
Writing sprint plans and sprint packages
METHODS & ACTIVITIES
Stakeholder presentation
Running two design sprints
User journey mapping
Identifying usability problems
Making prioritization grid
Writing HMW statements
Performing lightning demos
Boot up note taking
Ideation sketching
Crazy 8's sketching
Solution sketching
Silent reviewing
Heat map voting
Storyboarding
Prototyping
User testing
BACKGROUND:
Royal Conservatory of Music (RCM) is one of the world's largest music education institutions. With 30,000 teachers and about half a million students, the institution aims to develop human potential through leadership in music and the arts. RCM's web-based digital learning platform was first introduced in 2015. Since then, it has offered music theory and history courses for students aged 5 to 15 through its extensive interactive digital resources.
As part of the Humber College User Experience Design team, we collaborated with RCM to help improve the usability of one of the games within their Music Theory Preparatory app, "Catch the Egg". The app's primary audience is pre-readers aged 3 to 6 and their adults. By playing this game, young learners are expected to improve their note-naming skills by catching a falling egg (note) with a nest placed at the right spot on a music staff. During the stakeholders' presentation, we were told there was a divide between how children felt about using the app and how the adults who purchase it for them felt. User reviews indicated that children generally enjoyed interacting with the lessons and games; their adults, however, thought the app still needed a lot of improvement and weren't convinced it was worth the price they paid. My team focused primarily on improving the app's communication with its users as they interact with it. By analyzing the insights from user testing and interviews across two design sprints, we identified usability pain points and made recommendations for improvement.
PROBLEM

Preview of the Existing Game
APPROACH

TIMELINE

FIRST SPRINT
During the first five-day design sprint, the team identified several opportunities for improvement and validated the findings by building a prototype and performing usability testing. The main areas explored in the prototype and tested on users during the first design sprint were:
introducing a practice mode where users could learn and play at the same time, tracking users' performance and providing feedback on their learning progress, and introducing reward incentives to highlight users' accomplishments and keep them engaged.
DAY1:
Understand

User Journey Map of the Existing Game

HMW Statements Prioritization Grid
Requirements | Goals | Journey Map | HMW Statements
We started the session by taking time to interact with the game, both individually and collectively, to refresh our minds and develop a more detailed understanding of its features and overall experience. We then mapped out the project's fundamentals (focus, goals and target audience) based on our notes from the stakeholders' presentation and set long-term project goals and short-term sprint goals to keep us on track. We created a user journey map and individually wrote down our impressions of each stage of the existing journey (actions, thoughts, feelings and pain points). Based on the journey map and the project's goals, we individually came up with many HMW statements and, as a group, narrowed them down to two using a prioritization grid and voting. We then documented our process and findings from Day 1 and wrapped up the session with a quick debriefing.
How Might We...
...give feedback to users regarding their learning progress?
How Might We...
...make sure users learn where the egg should have gone after they fail to catch it?
DAY2:
Sketch

Ideation Sketches

Solution Sketches
Lightning Demos | Ideation Sketch | Crazy 8's Sketch | Solution Sketch
The session started with a quick recap of Day 1. We each presented one example of a gaming app that we had found the night before based on the two HMW statements and explained what we liked about its approach to similar problems. The facilitator wrote notes on post-its and drew quick sketches as we presented. We mapped out the lightning demos on the wall and used them as inspiration to sketch our own individual approaches. Each of us produced three sets of sketches. The first set was created by silently going through the lightning demos map and coming up with potential solutions. The second set was produced through Crazy 8's sketching, which helped us quickly generate different iterations of our main ideas. The third set, which took the longest, was a more elaborate version of the iteration we each thought was the most effective from our Crazy 8's sketches. We then documented our process and findings from Day 2, and the facilitator wrapped up the session with a quick debriefing.
DAY3:
Decide

Heat Map Voting

Story Board and User Flow
Silent Review | Heat Map Voting | Story Board | User Flow
We put all our process work on the wall so we could constantly refer back to it and stay on track. The facilitator of the day ran a quick recap of Day 2. We started by placing our solution sketches on the wall and presenting them to the team. Then we silently reviewed the sketches and voted for the features we liked the most. Using the heat map created from our votes, the facilitator compiled a list of features to be used in the prototyping phase. After finalizing the list collectively, we made a storyboard of our proposed changes using our sketches, adding drawings where appropriate. The storyboard represented the user flow from beginning to end and depicted all the screens we were going to prototype. We then documented our process and findings from Day 3, discussed participant recruitment for the testing phase and wrapped up the session with a quick debriefing.
DAY4:
Prototype

Mapping out Features and Functions

Dividing tasks
Assign Tasks | Test Script | Schedule Testing Sessions
As usual, the facilitator ran a quick recap of Day 3. We discussed prototyping options as a group and decided on one. The facilitator laid out the day's agenda, created a to-do list for the prototyping phase and assigned us tasks. The rest of the day was spent as a working period, with each person producing their deliverables for the final prototype. Testing sessions were also scheduled and confirmed with the participants. The session was wrapped up with a quick debriefing.
PROTOTYPE VIDEO

DAY5:
Validate


In total, we tested the prototype with four children and interviewed their parents.
The session started with a recap of Day 4. We spent the first half of the day completing the final prototype and editing the interview script for the user testing sessions. The first round of user testing took place in the second half of the day. The remaining sessions were completed over the weekend, and the findings were documented in a Google Sheet to be discussed on the first day of the second sprint.
SECOND SPRINT
During the second five-day design sprint, the team focused on analyzing the observations from the user testing sessions and implementing the findings in the second iteration of our proposal for the Catch the Egg game. The results from user testing validated the features we had introduced during the first sprint, so the majority of our time in the second sprint was spent improving the features we had already designed rather than introducing new ones. Unfortunately, we did not get to test our second prototype due to time restrictions and recruitment barriers. The main areas improved in the second prototype were:
highlighting players' accomplishments instead of keeping track of their failed attempts, creating a more relatable and engaging reward system for children, and providing a more accurate infographic on the feedback page we had designed during the first sprint, so adults could track their children's learning progress.
DAY1:
Understand

Affinity Mapping Process

Final Affinity Diagram
Debrief Test Results | Affinity Map | Goals
We started the session by debriefing the findings from the user testing conducted in the first sprint. We mapped out all the findings on the wall and created an affinity diagram by clustering key points, which included user comments and moderators' observations. After a short silent review, we reclustered some of the categories and labeled them with appropriate titles. We also noted and documented emergent themes and patterns. This activity helped us rewrite more concise HMW statements and establish a focus for the second sprint.
DAY2:
Sketch

Lightning Demo Outcome

Journey Map
Lightning Demos | Boot-Up Note Taking | HMW Statements | Sketch
The session started with a quick recap of Day 1. The facilitator of the day gave us a few minutes to revisit the affinity diagram we had created on Day 1 and vote for the outcomes we thought were most relevant to our original HMW statements and could be addressed in the second sprint. We then each took time to individually look at the three game apps our participants had mentioned during the testing sessions, trying to dissect the elements that made our participants enjoy playing them. We took turns presenting our findings to the group, and the facilitator noted the important points and placed them on the wall. Next, we were given a brief period to write new HMW statements based on our original statements from the first sprint, incorporating the main findings from the affinity diagramming and lightning demos. We printed the screens from our first sprint prototype and placed them on the wall to form a user journey map. We then identified the places on the map that needed, or had the potential for, change based on our new HMW statements, sketched our ideas and placed them under the corresponding screens. Finally, we voted for the best ideas and sketched our final solution. We documented our process and findings from Day 2, and the facilitator wrapped up the session with a quick debriefing.
How Might We...
...give feedback that engages users?
How Might We...
...motivate users to continue using the app?
DAY3:
Decide

Heat Map Voting

Storyboard and User Flow
Silent Review | Heat Map Voting | Storyboard | User Flow
We put all our process work and findings up to that point on the wall as a reference to help keep us on track. The facilitator ran a quick recap of Day 2. We placed our solution sketches on the wall and presented them individually, then silently reviewed each sketch and voted for the features we liked the most. We followed the same path as in our first sprint, using heat map voting. We then created a new storyboard consisting of the unchanged screens and our proposed changes, all to be used in the new prototype, adding extra notes and sketches where appropriate to make the user flow clear for the prototyping phase. We documented our process and findings from Day 3 and wrapped up the session with a quick debriefing.
DAY4:
Prototype

Mapping out Features and Functions

Dividing Tasks
To-Do-List | Assign Task | Create Prototype
As usual, the facilitator ran a quick recap of Day 3. We discussed prototyping options as a group and decided on one. The facilitator laid out the day's agenda, created a to-do list for the prototyping phase and assigned us tasks. The rest of the day was spent as a working period, with each person producing their deliverables for the final prototype. The session was wrapped up with a quick debriefing.