Thursday, July 30, 2009

Face Tracking for Video Analysis

Is it Hollywood or is it for real?

This is one of the questions I have wondered about ever since watching the first Terminator movie, where Arnold, aka Mr. Governor, identified his victim by analyzing video images. I am glad to report that, with no casualty count, there is indeed similar technology at work at PARC.

I covered some of the details in the entry Video Analysis and Responsive Mirror part II. Now I've got Maurice Chu, who worked on and modeled this technology, to share a clip.

Face tracking on a video feed



Maurice's explanation: The clip demonstrates the processing of video to track the face and its parts. In particular, the six face parts are the left and right eyes, the left and right eyebrows, the nose, and the mouth. The output of the face tracker is a cloud of points representing the locations of the six face parts in image coordinates, which can be used to determine the 3D orientation of the face relative to the camera. Among the challenges the algorithm overcomes are people putting on glasses, eye blinking, and other deformable movements such as the lips. The algorithm runs in real time, currently at about 10 fps, is people-generic, and requires no initial calibration.
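Maurice's write-up doesn't say how the tracker gets from those 2D point clouds to a 3D face orientation, but for the curious, here is a rough sketch of one common way to do that last step: take one representative image point per face part, match it against a generic 3D head model, and solve for the head pose with OpenCV's solvePnP. Everything in this sketch is my own illustrative assumption (the model coordinates, the pinhole-camera guess, the use of OpenCV), not the actual PARC algorithm.

```python
import numpy as np
import cv2

# Hypothetical 3D reference positions (in mm, head-centered frame) for one
# representative point of each tracked face part on a generic head model.
# These numbers are rough placeholders, not measurements from the PARC system.
MODEL_POINTS = np.array([
    [-35.0,  35.0, -30.0],   # left eye center
    [ 35.0,  35.0, -30.0],   # right eye center
    [-40.0,  55.0, -30.0],   # left eyebrow center
    [ 40.0,  55.0, -30.0],   # right eyebrow center
    [  0.0,   0.0,   0.0],   # nose tip (model origin)
    [  0.0, -50.0, -25.0],   # mouth center
], dtype=np.float64)

def head_orientation(image_points, frame_width, frame_height):
    """Estimate the face's 3D orientation relative to the camera.

    image_points: (6, 2) array of pixel coordinates, one point per face part,
    in the same order as MODEL_POINTS. Returns a 3x3 rotation matrix mapping
    the head frame into the camera frame, plus the translation vector.
    """
    image_points = np.asarray(image_points, dtype=np.float64)

    # Assume a simple pinhole camera: focal length roughly the frame width,
    # principal point at the image center, negligible lens distortion.
    focal = float(frame_width)
    camera_matrix = np.array([
        [focal, 0.0,   frame_width / 2.0],
        [0.0,   focal, frame_height / 2.0],
        [0.0,   0.0,   1.0],
    ])
    dist_coeffs = np.zeros(4)

    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")

    # Convert the Rodrigues rotation vector into a rotation matrix; this
    # matrix encodes the head's yaw/pitch/roll relative to the camera.
    rotation, _ = cv2.Rodrigues(rvec)
    return rotation, tvec
```

Since the tracker's raw output is a cloud of points per part rather than single landmarks, one would in practice collapse each part's cloud to a centroid (or feed all points into solvePnP against a denser model) before running a pose solve like this.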

P@P
