A very fun project for my Human-Computer Interaction grad class at NYU Poly. I unrolled a mesh bun form from the drugstore and put two pairs of conductive fabric inside, one pair connected to a strand of white LEDs and the other to a strand of red ones. I rolled it all back up, soldered some parts to[…]
This presentation reviews the work done for my summer 2013 internship at VicarVision, working with their FaceReader automated expression recognition software. A 3D model, generated using Faceshift and then modified and scripted in Maya to conform to the standard Action Units used in coding facial expressions, is driven by the data output by FaceReader.
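To give a flavor of the Maya-side scripting, here is a minimal sketch of how per-frame Action Unit intensities could be keyframed onto a blendshape rig. This is an illustration rather than the internship code: the CSV export, the node name `faceShape`, the target names, and the assumption that intensities are already normalized to 0–1 are all mine.

```python
# Hypothetical sketch: drive Maya blendshape weights from exported AU data.
# Assumes a CSV with one row per frame of Action Unit intensities (0..1) and
# a blendShape node named "faceShape" whose targets implement those AUs.
import csv
import maya.cmds as cmds

# Assumed mapping from Action Unit columns to blendshape target names.
AU_TO_TARGET = {
    "AU1": "innerBrowRaiser",
    "AU4": "browLowerer",
    "AU12": "lipCornerPuller",
    "AU26": "jawDrop",
}

def apply_au_csv(path, blendshape="faceShape"):
    """Set one keyframe per CSV row (one row per frame) on each mapped target."""
    with open(path, newline="") as f:
        for frame, row in enumerate(csv.DictReader(f), start=1):
            for au, target in AU_TO_TARGET.items():
                weight = float(row.get(au, 0.0))
                cmds.setKeyframe(
                    "%s.%s" % (blendshape, target),
                    time=frame,
                    value=weight,
                )

apply_au_csv("facereader_aus.csv")
```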
A 3D Model of Infant Facial Expressions
Senior project for BA in Applied General Studies
My B.A. degree required a capstone project or internship. I decided to work on a project that applied design and programming to the infant facial expression fieldwork studies I had been pursuing since 2011. It centers on the development of a three-dimensional[…]
This video documents the proof-of-concept development of a 3D infant avatar that displays facial expressions retargeted from an actor (me!) to the avatar. The base 3D model for the avatar was purchased from TurboSquid. Motion capture of my facial expressions was done using Faceshift software and a Kinect for Xbox 360. Faceshift was also used[…]
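As a rough sketch of what the retargeting step amounts to: the capture software reports per-frame weights for its own blendshape set, and those get remapped (and sometimes rescaled) onto the avatar's differently named targets. Everything below is illustrative, the source names, the avatar names, and the gain values are assumptions, not Faceshift's actual output format.

```python
# Hypothetical sketch of blendshape retargeting: remap one frame of capture
# weights onto the avatar's own target names, clamping to the valid range.
SOURCE_TO_AVATAR = {
    "BrowsU_C": ("browRaise", 1.0),
    "JawOpen": ("mouthOpen", 0.9),   # assume the avatar's jaw travels slightly less
    "MouthSmile_L": ("smileLeft", 1.0),
    "MouthSmile_R": ("smileRight", 1.0),
}

def retarget_frame(source_weights):
    """Convert one frame of capture weights into avatar weights in [0, 1]."""
    avatar = {}
    for src, (dst, gain) in SOURCE_TO_AVATAR.items():
        w = source_weights.get(src, 0.0) * gain
        avatar[dst] = max(0.0, min(1.0, w))
    return avatar

print(retarget_frame({"JawOpen": 0.5, "MouthSmile_L": 0.8}))
```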
Poster for Smart Fabrics 2013, San Francisco
This concept poster was accepted for presentation at the spring Smart Fabrics expo. There are plenty of devices that help monitor physical health, but mental wellness is a space that has received comparatively little attention.
This was the third project for my MoCap class at NYU, and the one with the most original content. Two other students and I designed the actor's motion sequences. The narrative and characterization are all my own.
Done for a motion capture class at New York University with Professor Chris Bregler, using Vicon iQ and Autodesk MotionBuilder.