Project Overview: In an effort to understand how immersive design tools may impact the future of design and creative work, I led a project in USC’s Mobile & Environmental Media Lab in partnership with the Steelcase Workspace Futures group. The research involved fieldwork with VR designers and industrial designers. The insights we gathered helped inspire a series…
June 10, 2017
During my time at Intel Labs I researched immersive narrative opportunities for Scott Westerfeld’s Leviathan trilogy. As part of this process, I led immersive concept ideation sessions for Alex McDowell’s World Building Media Lab (WBML). I facilitated the WBML team in exploring a range of character perspectives that an immersive story experience could take in VR. One of the key…
April 29, 2016
As part of an exploration of the AudienceBot microphone concept, I wrote a dystopian science fiction screenplay for a short film called American Agora. I created a 3D model for this project and tested it in this short sequence.
LoveLog is a Sloan-awarded short film about love and augmented reality. The story explores how new opportunities to record our memories collide with the messy reality of romantic relationships. Young lovers Felix and Sadie never take off their augmented reality (AR) glasses and constantly record video of their lives as they tag and catalog their memories using…
December 17, 2015
Interning at Intel Labs, I researched and designed new playful ways of engaging with data in everyday life. As part of this project I conducted fieldwork with teens, prototyped “data-sandbox” toolkits, designed and led playtests with target demographics, interviewed luminaries in the field of play, and synthesized findings with emerging opportunities in IoT and the quantified self. Outside participants included…
December 16, 2015
The AudienceBot Microphone is an interactive platform that enables audiences to convey feedback to performers in real time. Using a mobile application, audience members collectively determine the “mood” of the microphone. The microphone uses six servo motors and a motion-tracking device to express a range of affective states, from inattentive to attentive and from playful…
November 3, 2014
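As a rough illustration of the AudienceBot mechanism described above, here is a minimal Python sketch of how individual audience votes might be aggregated into a collective “mood” and mapped onto servo positions. This is not the project’s actual implementation; the MoodVote structure, the vote scales, and the six-angle mapping are hypothetical stand-ins.

```python
# Hypothetical sketch: aggregate audience mood votes and map them to servo angles.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class MoodVote:
    attention: float    # -1.0 (inattentive) .. 1.0 (attentive)
    playfulness: float  # -1.0 (serious) .. 1.0 (playful)

def aggregate(votes: List[MoodVote]) -> MoodVote:
    """Collapse the audience's individual votes into one collective mood."""
    if not votes:
        return MoodVote(0.0, 0.0)
    return MoodVote(mean(v.attention for v in votes),
                    mean(v.playfulness for v in votes))

def mood_to_servo_angles(mood: MoodVote) -> List[int]:
    """Map the collective mood onto six servo angles (0-180 degrees).
    Attention sets how far the mic leans toward the performer; playfulness adds sway."""
    lean = int((mood.attention + 1.0) / 2.0 * 180)
    sway = int((mood.playfulness + 1.0) / 2.0 * 60)
    # Alternate lean and sway across the six servos for a simple posture.
    return [lean, sway, lean, sway, lean, sway]

# Example: a mostly attentive, slightly playful audience
angles = mood_to_servo_angles(aggregate([MoodVote(0.8, 0.2), MoodVote(0.6, -0.1)]))
print(angles)
```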
As a collaborator on Jen Stein’s dissertation project PUCK, I worked with Jeff Watson on data visualizations of the SCA building’s hundreds of sensor feeds. [Project lead: Jen Stein; Dissertation Chair: Prof. Scott Fisher; MEML team: Jacob Boyle, Joshua McVeigh-Schultz, Hyung Oh, Amanda Tasse, Jeff Watson; Storyboard illustrations: Bryant Paul Johnson]
February 8, 2012
Designed as a critique of status monitoring in online contexts, this project presents a prototype of a prosthetic device that conversation partners wear in their mouths to provide visual and auditory feedback about the speaker’s level of online popularity (measured in retweets). The speaker with more current retweets experiences voice amplification (and their mouth glows…
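The core comparison logic can be sketched in a few lines of Python. This is a hedged illustration of the behavior described above, not the prototype’s actual hardware interface; the Speaker class, the gain values, and the glow flag are all hypothetical.

```python
# Hypothetical sketch: whichever speaker has more retweets gets amplified and glows.
from dataclasses import dataclass

@dataclass
class Speaker:
    name: str
    retweets: int
    gain: float = 1.0     # 1.0 = unamplified voice
    glowing: bool = False

def update_feedback(a: Speaker, b: Speaker, boost: float = 1.5) -> None:
    """Amplify and illuminate the currently more 'popular' conversation partner;
    reset the other to baseline."""
    leader, follower = (a, b) if a.retweets >= b.retweets else (b, a)
    leader.gain, leader.glowing = boost, True
    follower.gain, follower.glowing = 1.0, False

# Example: Alice's retweet count pulls ahead, so her device amplifies and glows.
alice, bob = Speaker("Alice", 42), Speaker("Bob", 17)
update_feedback(alice, bob)
print(alice, bob)
```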
This project extended our work with automotive lifelogging by using in-car sensors to engage drivers in ongoing discoveries about their vehicle, driving environment, and social context throughout the lifecycle of their car. A goal of the design was to extend the contexts of automotive user-interface design by (1) looking inward to the imagined “character” of… Read more »
‘Ambient storytelling’ — part of the design philosophy of USC’s Mobile and Environmental Media Lab — represents a departure from the customization algorithms familiar to discussions of pervasive computing. Rather than thinking about how a car can play the role of a glorified butler, anticipating its driver’s every need, we reposition the car as a co-participant in an…