Paper: IEEE PAMI (1996) "Task-specific gesture analysis in real-time using interpolated views"

T. Darrell, I. A. Essa, and A. P. Pentland, "Task-specific gesture analysis in real-time using interpolated views," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 12, pp. 1236-1242, Dec. 1996.


Hand and face gestures are modeled using an appearance-based approach in which patterns are represented as a vector of similarity scores to a set of view models defined in space and time. These view models are learned from examples using unsupervised clustering techniques. A supervised learning paradigm is then used to interpolate view scores into a task-dependent coordinate system appropriate for recognition and control tasks. We apply this analysis to the problem of context-specific gesture interpolation and recognition, and demonstrate real-time systems which perform these tasks.
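The core idea in the abstract, representing an input as a vector of similarity scores to stored view models and then interpolating those scores into a task coordinate, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: it assumes normalized correlation as the similarity metric and plain least squares as the score-to-coordinate interpolator, and the function names are hypothetical.

```python
import numpy as np

def view_scores(patch, templates):
    """Similarity of an image patch to each stored view template.
    Normalized correlation is one plausible choice of metric (an
    assumption here, not necessarily the paper's)."""
    p = (patch - patch.mean()) / (patch.std() + 1e-8)
    scores = []
    for t in templates:
        tt = (t - t.mean()) / (t.std() + 1e-8)
        scores.append(float((p * tt).mean()))
    return np.array(scores)

def fit_interpolator(score_vectors, task_coords):
    """Learn a linear map from score vectors to task coordinates.
    Least squares stands in for the paper's supervised
    interpolation stage."""
    X = np.hstack([score_vectors, np.ones((len(score_vectors), 1))])
    w, *_ = np.linalg.lstsq(X, task_coords, rcond=None)
    return w

def interpolate(w, scores):
    """Map a new score vector into the task-dependent coordinate."""
    return float(np.append(scores, 1.0) @ w)

# Usage sketch: treat three random patches as the learned view
# models, assign each a task coordinate, and recover it by
# interpolating similarity scores.
rng = np.random.default_rng(0)
templates = [rng.standard_normal((8, 8)) for _ in range(3)]
S = np.array([view_scores(t, templates) for t in templates])
w = fit_interpolator(S, np.array([0.0, 1.0, 2.0]))
coord = interpolate(w, view_scores(templates[1], templates))
```

In a real system the templates would come from unsupervised clustering of example frames, and inputs between two views would yield intermediate score vectors, which the learned map sends to intermediate task coordinates.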
