IEEE Computer Society has published an article with short writeups on the winners of the IEEE/IBM Watson 2015 Student Challenge. Here’s to MiFace for being the only app that got a graphic included!
The work I did as an undergraduate with Dr. Harriet Oster — a former Ph.D. student advised by Paul Ekman — and at VicarVision during the summer of 2013 has evolved into MiFace, a tool for generating recognizable facial expressions using a 3D model. The project is still in the research stage; my collaborator Stephanie Michalowicz and I are hoping to use[…]
Two very talented NYU MBA students, Anmar El-Khalil and Alex Dillon (also studying medicine, whew!), gave me the chance to help out with prototyping a Glass application and Bluetooth wristband as part of an integrated patient management system concept.
PosturePulse, body-LAB’s first wearable tech product, is launching today on Kickstarter!
This presentation reviews the work done during my summer 2013 internship at VicarVision, working with their FaceReader automated expression recognition software. A 3D model generated using Faceshift was modified and scripted in Maya to conform to the standard Action Units used in coding facial expressions, and is driven by the data output by FaceReader.
A 3D Model of Infant Facial Expressions
Senior project for B.A. in Applied General Studies
My B.A. degree required a capstone project or internship. I decided to work on a project that applied design and programming to the infant facial expression fieldwork studies I had been pursuing since 2011. It centers on the development of a three-dimensional[…]
This video documents the proof-of-concept development of a 3D infant avatar that displays facial expressions retargeted from an actor (me!) to the avatar. The base 3D model for the avatar was purchased from TurboSquid. Motion capture of my facial expressions was done using Faceshift software and a Kinect for Xbox 360. Faceshift was also used[…]