Face-2-Face was a client project for Katherine Isbister that used mobile AR and facial tracking to create a transformational game designed to bring people closer together.
Project Information
My Role: Producer and Experience/Game Designer
Platform: iPhone X, ARKit
Project Duration: 4 months, 2019
Development Website: link
Team: Ashley Liang (Gameplay Programmer), Yoli Shen (Designer and Concept Artist), Chang Liu (Gameplay Programmer), and Freya Li (Designer and 3D Artist).
What is Face-2-Face?
Face-2-Face was a client project for Katherine Isbister and the Social Emotional Technology Lab at UC Santa Cruz, developed at the Entertainment Technology Center at Carnegie Mellon University. Our goal was to create a multiplayer transformational game that used mobile AR and facial tracking to make people feel more connected to one another after playing.
Since we were developing on iOS, we built our final game (tentatively titled AR BeatBoxing!) with ARKit, basing our gameplay on Social Emotional Learning (SEL) research and on previous games created by the Social Emotional Technology Lab.
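For readers curious about the underlying tech: ARKit's face tracking reports per-expression "blend shape" coefficients each frame, which is what makes expression-driven gameplay like ours possible. The sketch below shows a minimal face-tracking session setup; the class name and logging are illustrative assumptions, not our actual code.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit face-tracking setup (illustrative, not the
// project's actual code). ARFaceTrackingConfiguration requires a TrueDepth
// front camera, i.e. iPhone X or later.
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit delivers an updated ARFaceAnchor every frame, carrying
    // blend-shape coefficients (0...1) for expressions like jawOpen.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen:", jawOpen) // feed this into the game logic
        }
    }
}
```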
An overview of the core mechanics in AR BeatBoxing!
In AR BeatBoxing!, two players sit side by side, and the camera feed splits the screen so that each half shows half of one player's face. The game then stitches these feeds together into a single Frankenstein-esque composite "face."
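One plausible way to build that composite (a sketch under assumed inputs, not necessarily how our programmers implemented it) is to crop half of each player's camera frame with Core Image and composite the halves back into one image:

```swift
import CoreImage

// Illustrative: take the left half of player A's frame and the right half
// of player B's, assuming both frames are CIImages with the same extent.
func compositeFace(frameA: CIImage, frameB: CIImage) -> CIImage {
    let extent = frameA.extent
    let halfWidth = extent.width / 2

    // Left half of the composite comes from player A...
    let left = frameA.cropped(to: CGRect(x: extent.minX, y: extent.minY,
                                         width: halfWidth, height: extent.height))
    // ...and the right half from player B.
    let right = frameB.cropped(to: CGRect(x: extent.minX + halfWidth, y: extent.minY,
                                          width: halfWidth, height: extent.height))

    // The cropped images keep their original positions, so compositing
    // them yields one full-frame, Frankenstein-esque "face."
    return left.composited(over: right)
}
```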
Players must then move parts of their faces at the same time, or at staggered times, to hit various beat-boxing "notes," and the game grows progressively more difficult as time goes on.
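Conceptually, each "note" pairs a target beat time with an expression and a threshold, and a hit registers when the tracked blend-shape value crosses that threshold inside a timing window around the beat. The struct, field names, and tuning values below are hypothetical, sketched only to show the idea:

```swift
import Foundation

// Hypothetical note model: which expression to trigger, when, and how far
// the blend-shape coefficient must move to count as a "hit."
struct Note {
    let targetTime: TimeInterval   // when the note should land on the beat
    let expression: String         // e.g. "jawOpen", "browInnerUp"
    let threshold: Float           // coefficient value that counts as moved
}

// A note is hit if the expression crosses its threshold within a small
// timing window around the target beat (0.15 s is an assumed default).
func isHit(note: Note, currentTime: TimeInterval,
           blendValue: Float, window: TimeInterval = 0.15) -> Bool {
    return abs(currentTime - note.targetTime) <= window
        && blendValue >= note.threshold
}
```

Simultaneous notes would simply check both players' blend values against the same window, while staggered notes would give each player their own target time.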
From playtesting with over 50 people, we found that though the concept seemed strange at first, most players walked away feeling as though they had grown a little closer to their partner, regardless of whether they were strangers.