The work I did as an undergraduate with Dr. Harriet Oster (a former Ph.D. student advised by Paul Ekman) and at VicarVision during the summer of 2013 has evolved into MiFace, a tool for generating recognizable facial expressions using a 3D model. The tool is still in the research stage; my collaborator Stephanie Michalowicz and I are hoping to use[…]

Action Unit-driven Avatar

Facial Expression Recognition-driven Avatar

This presentation reviews the work done during my summer 2013 internship at VicarVision, where I worked with their FaceReader automated expression recognition software. A 3D model generated using faceshift was modified and scripted in Maya to conform to the standard Action Units used in coding facial expressions; the model is then driven by the data output by FaceReader.
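The core of a pipeline like this is translating per-frame Action Unit intensities into blendshape weights on the rig. The sketch below is illustrative only: the AU-to-blendshape mapping, function names, and value ranges are assumptions, not FaceReader's or Maya's actual API.

```python
# Hypothetical sketch: mapping FaceReader-style Action Unit intensities
# (assumed normalized to 0.0-1.0) onto blendshape weights for a Maya rig.
# The AU codes follow FACS; the blendshape target names are made up.

AU_TO_BLENDSHAPE = {
    "AU1": "browInnerUp",   # inner brow raiser
    "AU4": "browDown",      # brow lowerer
    "AU12": "mouthSmile",   # lip corner puller
    "AU25": "jawOpen",      # lips part
}

def au_frame_to_weights(au_intensities):
    """Convert one frame of AU intensities to blendshape weights.

    au_intensities: dict mapping AU codes (e.g. "AU12") to values in [0, 1].
    Returns a dict of blendshape target name -> weight, clamped to [0, 1].
    AUs with no rig target are ignored.
    """
    weights = {}
    for au, value in au_intensities.items():
        target = AU_TO_BLENDSHAPE.get(au)
        if target is not None:
            weights[target] = max(0.0, min(1.0, value))
    return weights

# Inside Maya, each weight would then be keyed per frame, roughly:
#   cmds.setAttr("blendShapeNode." + target, weight)

frame = {"AU12": 0.8, "AU25": 0.3, "AU99": 0.5}
print(au_frame_to_weights(frame))
```

A table-driven mapping like this keeps the rig-specific names in one place, so the same driver script can be pointed at a different model by swapping the dictionary.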


NYU Senior Project

A 3D Model of Infant Facial Expressions Senior project for B.A. in Applied General Studies My B.A. degree required a capstone project or internship. I decided to work on a project that applied design and programming to the infant facial expression fieldwork studies I had been pursuing since 2011. It centers on the development of a three-dimensional[…]

3D Infant Model

Motion Capture Infant Avatar Project

This video documents the proof-of-concept development of a 3D infant avatar that displays facial expressions retargeted from an actor (me!) to the avatar. The base 3D model for the avatar was purchased from TurboSquid. Motion capture of my facial expressions was done using faceshift software and a Kinect for Xbox 360. Faceshift was also used[…]