Techniques for performance-based facial animation
M.Sanchez, "Techniques for performance-based, real-time facial animation", PhD, 2006. (Supervisor: Dr Steve Maddock). [pdf (author's copy)]
PhD Abstract: The purpose of this research project has been to construct a real-time facial animation system that reproduces a wide range of facial skin motion effects using information captured from real performers. The large-scale deformation of facial skin is driven by marker-based optical motion capture, through geometry-warping algorithms developed specifically for this purpose. Smaller effects such as wrinkling and buckling are addressed through shading techniques applied to the warped facial geometry, which evaluate a model of fine-scale tissue behaviour that is itself built from data captured from real subjects. The synthetic vision framework required to capture such data is also introduced in this thesis. Additional aspects concerning the analysis and reusability of motion capture data are also considered, enabling the system to apply data collected from one individual to different physiognomies. The real-time nature of the animation system is reinforced by implementing some of its time-critical components in dedicated hardware (programmable graphics cards) and by preprocessing the facial geometry to make better use of that hardware. Finally, recommendations are made for future work of this nature.
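The warping algorithms themselves are detailed in the thesis rather than on this page. Purely as an illustration of how sparse marker data can drive a dense facial mesh, the sketch below implements a generic radial-basis-function (RBF) warp in Python. It is a minimal stand-in under assumed names and constants, not the thesis's method: the function name, the Gaussian kernel, and the `sigma` and regularisation values are all choices made for this example.

```python
# Illustrative sketch only: an RBF warp that spreads sparse mocap marker
# displacements over a dense facial mesh. This is a generic stand-in for
# the geometry-warping algorithms developed in the thesis; every name and
# constant below is an assumption made for illustration.
import numpy as np

def rbf_warp(markers_rest, markers_frame, vertices, sigma=0.05):
    """Deform `vertices` so points at `markers_rest` move to `markers_frame`.

    markers_rest  : (m, 3) marker positions on the neutral face
    markers_frame : (m, 3) captured marker positions for this frame
    vertices      : (n, 3) rest-pose mesh vertices
    sigma         : kernel width (in mesh units) controlling falloff
    """
    disp = markers_frame - markers_rest  # (m, 3) per-marker offsets

    # Gaussian kernel between every pair of rest markers: (m, m)
    d2 = np.sum((markers_rest[:, None, :] - markers_rest[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))

    # Solve K @ w = disp for per-marker weights (one column per axis);
    # a small regulariser keeps the system well conditioned.
    w = np.linalg.solve(K + 1e-8 * np.eye(len(markers_rest)), disp)  # (m, 3)

    # Evaluate the kernel from every vertex to every marker: (n, m)
    d2v = np.sum((vertices[:, None, :] - markers_rest[None, :, :]) ** 2, axis=-1)
    Kv = np.exp(-d2v / (2.0 * sigma ** 2))

    return vertices + Kv @ w  # warped mesh, (n, 3)
```

Calling `rbf_warp` once per captured frame yields only the large-scale deformation; the fine-scale wrinkling mentioned in the abstract would be layered on in shading rather than geometry. Note also that the per-frame solve is only m-by-m for m markers, and the vertex-to-marker kernel depends only on the rest pose, so a hardware-oriented implementation in the spirit described above could precompute it once and reuse it every frame.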
Publications
- Sanchez, M., J. Edge, S. Maddock (2006), "Performance-driven facial animation", Eurographics 2006 Animations (Showcase DVD), Eurographics 2006, Vienna, Austria, 4-8 September 2006. (video)
- Maddock, S., J. Edge, M. Sanchez (2005), "Movement realism in computer facial animation", Workshop on Human-animated Characters Interaction, 6 September 2005, Napier University, Edinburgh, UK (part of HCI 2005: The Bigger Picture, The 19th British HCI Group Annual Conference), 4 pages. (pdf)
- Sanchez, M., J.D. Edge, S.C. Maddock (2004), "Realistic Performance-driven Facial Animation using Hardware Acceleration", Department of Computer Science Research Memorandum CS-04-10, University of Sheffield. [ResMem, pdf]
- Edge, J., M. Sanchez, S. Maddock (2004), "Using motion capture data to animate visual speech", Symposium on Language, Speech and Gesture for Expressive Characters, 30-31 March 2004, part of the AISB 2004 Convention: Motion, Emotion and Cognition, University of Leeds, UK, 29 March - 1 April 2004. (pdf)
- Edge, J., M. Sanchez, S. Maddock (2004), "Animating speech from motion fragments", Department of Computer Science Research Memorandum CS-04-02, University of Sheffield. (pdf)
- Sanchez Lorenzo, M., J.D. Edge, S. King, S. Maddock (2003), "Use and Re-use of Facial Motion Capture Data", Proc. Vision, Video and Graphics 2003, University of Bath, 10-11 July 2003, pp. 135-142. (pdf)
- Sanchez, M. and S. Maddock (2003), "Planar bones for MPEG-4 facial animation", Proc. EGUK 2003, University of Birmingham, UK, 3-5 June 2003. (pdf)
Downloads
- Facial expressions represented using planar bones (avi)
- Facial mocap retargeted to the Bhikku model (mpg)
- Playing with BIDS (mpg)
- An interactive demo that allows experimental deformations like those in the BIDS video (Win32 zip)
- Mocap Retarget (mpg)
- Interactive modelling and animation (mpeg)
- Playing a mocap fragment (mpeg)
- Interactive modelling and animation - Real-time expressive wrinkling (avi)
- Performance-driven facial animation (mpg)
Invited talks