Evaluating 'Math Wizard'

Investigating the Potential for Boosting Math Scores and Marketing Success

Osmo-owning families were engaged to play 'Math Wizard' for 15 minutes daily over a two-week span.

I took the helm of this research project, reaching out to families, crafting tailored surveys for both parents and children, and spearheading the qualitative evaluation. Our efforts culminated in a publication at the ACM Interaction Design and Children Conference.

Given the logistical and analytical scope of this project, this case study narrows its lens to focus primarily on the survey work.

How Magical Can Games Be? Osmo's 'Math Wizard' Under the Microscope
  • 99% of Parents report that their child finds Math Wizard Magical Workshop fun.

  • 97% of Parents think that Math Wizard Magical Workshop is educational.

  • 95% report that playing Math Wizard Magical Workshop improves their child's math knowledge and skills.

Problem

As Osmo transitioned its focus from purely entertainment to educational value, there was an urgent need to substantiate its learning impact with hard numbers.

Additionally, to stand out in a competitive market where rivals showcased learning gains on their packaging, Osmo needed to bolster its claims.

Solution

To address this, we initiated a comprehensive learning evaluation pilot divided into three key phases: Pre-, Mid-, and Post-Experience. Our evaluation metrics for the children encompassed learning gains, interest in math, and enjoyment of the program. Concurrently, we gathered insights on parents’ perceptions, the child’s learning progress, and their engagement with homework.

The pilot yielded promising results. On average, there was a statistically significant increase in learning.

Timeline

September 2021
2 months

Industry

Education
Gaming

Core Team

Sherry Yi (Lead UX)
Yuqi Yao (Quant. UX)
Heidy Maldonado (UX Manager)

My Role

Program concept inception
Communication with participating families
Data synthesis
Qualitative data analysis
Survey design for desktop and tablets
Implementation of surveys

Limitations

All studies come with constraints, and it's crucial to recognize them from the get-go.

Product Bias

As an edTech company, we are naturally inclined to showcase a boost in scores. To counteract this bias and offer a comprehensive view, we've made our study design transparent.

We took measures to account for varying levels of math mastery and tried to capture data on external factors, such as the amount of time spent on math homework. We also supplemented children's survey responses with parents' observations, offering a more holistic understanding beyond the test scores alone.

Recruitment Bias

Our participants, being Beta testers who already owned Osmo devices and games, may inherently harbor a favorable view of our products. This predisposition can potentially introduce a positive bias, influencing outcomes.

Pre-existing Interest

Parents who invest in educational games like ours are often deeply involved in their child's academic development. They may already support their children's interests, provide homework assistance, or even engage in tutoring services. This can mean that many participants may have had a pronounced inclination towards math before our intervention.

Multi-children Homes

From the backend, we could not distinguish between individual players in households where multiple children shared one device.

My Design Process

When my manager mentioned the upcoming 'learning pilot' project for our in-the-works game, Magical Workshop, I was skeptical.

While the term 'learning pilot' might resonate with parents, it would likely fall flat with our target audience: children.

Recognizing this potential gap, I suggested we reimagine the experience as 'Potions Academy.' In no time, our research communications were brimming with magical puns (brewing data and wand-ering about spellbinding outcomes) and captivating themes (think Harry Potter, but math), crafting a holistic experience that appealed to all.

This creative initiative, though exciting, also unexpectedly handed me the mantle of being the primary liaison with our participating families.

1. Goal and Method

Goal

To evaluate the learning potential of Math Wizard concerning the child's:

  • learning gains,

  • interest in math,

  • and enjoyment of the program...

... in conjunction with the parent perception of the child's learning progress and their efforts on math homework.

Method

Participants: We invited Osmo-owning parents of children aged 6-8 to participate, and secured Non-Disclosure Agreements (NDAs) and informed consent.

Task: Families were asked to engage in 15-minute play sessions over a span of two weeks.

Knowledge Assessment: We deployed surveys to both parents and children at three time points: before, during, and after the study (the Pre-, Mid-, and Post-Experience assessments).

Gameplay Data: We also retrieved gameplay metrics from our database, such as playtime and number of sessions.

Overview of surveys administered throughout the learning pilot.

Aligning with Common Core

During development of this game, I worked with Claire to ensure that the game was age appropriate and aligned with Common Core State Standards, and that its lessons mapped to concrete, measurable skills (e.g., addition and subtraction up to 20).

2. Child's Learning Gains

Quantitative Findings

Yuqi, the quantitative researcher on our team, took the lead for this section.

The majority of children scored 92% or above on the Pre-Experience Standard Assessment, signaling a ceiling effect (when too large a share of participants achieve the highest possible score on a test). In the end, two approaches were taken to analyze children's learning gains: subtracted learning gain (M = 8%; standard error: 2%) and normalized learning change (M = 55%; standard error: 7%).

On average there was a statistically significant increase in learning, as evidenced by the comparison between scores from the Pre-Experience assessment and the Post-Experience assessment (p = .0013).
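
For readers curious about the mechanics, here is a minimal Python sketch of the two gain measures and the pre/post comparison. It assumes percentage scores (0-100) and uses a Hake-style formula as a stand-in for the normalized learning change; the scores in it are hypothetical, not our study data.

# Minimal sketch of the two learning-gain analyses; all scores are hypothetical.
import numpy as np
from scipy import stats

pre = np.array([92, 88, 95, 100, 90, 85], dtype=float)    # Pre-Experience scores (%)
post = np.array([98, 95, 100, 100, 97, 94], dtype=float)  # Post-Experience scores (%)

# 1) Subtracted learning gain: simple post - pre difference.
subtracted_gain = post - pre

# 2) Normalized learning change (Hake-style stand-in): gain as a fraction of
#    the room left to improve. Children already at 100% on the pre-test are
#    excluded to avoid dividing by zero (this is where the ceiling effect bites).
room_to_grow = 100 - pre
mask = room_to_grow > 0
normalized_change = (post[mask] - pre[mask]) / room_to_grow[mask]

# 3) Paired comparison of Pre- vs Post-Experience scores.
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"Subtracted gain: M = {subtracted_gain.mean():.1f}% (SE = {stats.sem(subtracted_gain):.1f}%)")
print(f"Normalized change: M = {normalized_change.mean():.0%} (SE = {stats.sem(normalized_change):.0%})")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")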

3. Parent Perspective

My logo design for Potions Academy.

Understanding Parental Perception for Informed Marketing

I led the analysis for the Parent Perspective and the following sections.

Issue: To gauge parents' perceptions and reception of the game for marketing insights.

Solution: We utilized Google Sheets with custom formulas for coding open-ended survey responses, tracking unique responses and tallying percentages.
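
To illustrate the tallying step, here is a small Python equivalent of what those Sheets formulas did; the theme labels and coded responses below are hypothetical placeholders rather than actual survey data.

# Tally hand-coded themes across open-ended responses (hypothetical data).
from collections import Counter

# Each open-ended response has been hand-coded with one or more theme labels.
coded_responses = [
    ["fun", "educational"],
    ["fun"],
    ["interactive", "fun"],
    ["educational"],
]

theme_counts = Counter(theme for themes in coded_responses for theme in themes)
total_respondents = len(coded_responses)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} mentions ({count / total_respondents:.0%} of respondents)")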


Net Promoter Score

NPS: 37.68 (69 responses)
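
For context, NPS is the percentage of promoters (ratings of 9-10 on the "how likely are you to recommend" question) minus the percentage of detractors (ratings of 0-6). A minimal sketch with made-up ratings:

# Compute a Net Promoter Score from 0-10 recommendation ratings (hypothetical data).
ratings = [10, 9, 9, 8, 7, 10, 6, 9, 4, 10]

promoters = sum(1 for r in ratings if r >= 9)   # ratings of 9-10
detractors = sum(1 for r in ratings if r <= 6)  # ratings of 0-6
nps = 100 * (promoters - detractors) / len(ratings)

print(f"NPS: {nps:.2f} ({len(ratings)} responses)")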

Insights:

  • 74% of respondents would recommend the game for being:

    • Fun/engaging (61%)

    • Educational (33%)

    • Interactive/tactile (18%)

  • Reasons for not recommending the experience:

    • Not challenging enough (8%)

    • Not engaging enough (6%)

    • Lacked variety (6%)


Age Fit Analysis

We speculated that families with multiple children might have had mixed reactions to the learning experience, even though participants were instructed to have only the child in the target age group (6-8) play the game.

Taking a closer look...

  • Detractor Age Breakdown: Most detractor households had younger children (four years old and younger), with the oldest children in those households being six or seven years old.

  • Age Fit: Respondents found the Math Wizard experience best suited for six- and seven-year-olds, with an equal percentage (64%) also favoring five- and eight-year-olds. This suggests potential appeal to a wider age range around the target group.


Parent Reception


A Positive Experience

Respondents perceive Math Wizard as fun, feel that Potions Academy as a whole makes learning fun, and describe the experience as an engaging, interactive way to learn math.


Practical Skills

The top three learning gains perceived by parents: Improving math skills, Multiple ways of viewing one problem, and Improving mental math.


Lasting Impressions

Mid-experience: Respondents' top recollections of the experience were: Improves/reinforces math skills, Fun, and Interactive.

Post-experience: Loved it, Makes learning fun, and Improves child's math skills were the concluding thoughts.

4. Child Perspective

Parents Reporting for Children

Given the challenges posed by young children's developing motor and speech skills, we opted to have parents report on behalf of their children in the open-ended responses. This approach yields more reliable answers, as parents are better equipped to interpret and relay their children's reactions.


Children's Reaction to the Game

The top three child reactions most commonly described by parents: Eager to play, Finds making potions rewarding, and Fun.


Features Most Liked by Children

The top three features most liked by children, as reported by parents: the Spell book, Making new potions, and Making learning fun.

Time Spent on Math by Families

  • Before this experience, Respondents' children spent an average of six hours a week on Math activities.

  • At Post-Experience, Respondents reported their child spending an average of nine hours on Math activities over the past two weeks.

    On average, the time children spent on math activities increased by approximately four hours per week during this experience.

    This suggests a possible increase in the child's interest in math, or perhaps more support from the family to foster that interest.

5. Reflection

Areas for Improvement

Reflecting on this project, I've identified 3 areas for improvement.


1: More Time to Plan

Given the opportunity for another learning pilot, I would allocate more time for planning the study. The limited lead time and a lack of comprehensive contextual understanding within the team led to rushed decision-making to meet tight deadlines. While industry timelines can be demanding, in an ideal scenario, we would prioritize adequate time for study planning and aligning on research methods.


2: Lack of Challenge and Variety

A subset of respondents did not recommend the Math Wizard experience (8% cited "not challenging enough," 6% "not engaging enough," and another 6% "lacked variety"). This traces back to the brevity of the experience's content: some participants completed the game within the two-week pilot, which led to repetitive gameplay during the short 15-minute sessions.

For future iterations, it is recommended to incorporate additional levels and track how far participants progress. This extended gameplay can serve as a valuable measure for both mastery and sustained interest in mathematics, addressing these concerns and enhancing overall engagement.


3: Detractor NPS Scores and Age Fit

Detractor NPS scores were primarily influenced by the age fit of their children, who were either within or below Math Wizard's target age range. Notably, children aged six and seven found the additional mini-games too easy, resulting in lower NPS scores.

To address this issue in future iterations, it is advisable to implement a mechanism for distinguishing between child profiles. For instance, an in-game pop-up could prompt parents to select the player's profile before beginning gameplay, based on profiles previously created. This would help tailor the game experience to the specific age and skill level of the child, potentially improving overall satisfaction and NPS scores.

Deliverable


I designed this poster for IDC.

Thank you for reading my case study!

Questions or want to work together? Feel free to contact me!