Animating Responsive Characters with Dynamic Constraints in Near-Unactuated Coordinates

Abstract

This paper presents a technique to enhance a kinematically controlled virtual character with a generic class of dynamic responses to small perturbations. Given an input motion sequence, our technique can synthesize reactive motion to arbitrary external forces in a style customized to the input motion. Our method re-parameterizes the motion degrees of freedom based on the joint actuations in the input motion. By enforcing the equations of motion only in the less-actuated coordinates, our approach can create physically responsive motion based on kinematic pose control without explicitly computing the joint actuations. We demonstrate the simplicity and robustness of our technique by showing a variety of examples generated with the same set of parameters. Our formulation focuses on perturbations that significantly disrupt the upper-body poses and dynamics but have limited effect on the whole-body balance state.
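To give a rough sense of the idea described above, the following sketch (not the authors' implementation) illustrates one simplified reading of it: the "near-unactuated" subspace is chosen here as the degrees of freedom with the smallest mean inverse-dynamics torques in the input motion, and the solver enforces the equations of motion only in that subspace while tracking the reference (kinematic) accelerations elsewhere. All function names, the subspace-selection heuristic, and the least-squares formulation are assumptions made for illustration; the paper's actual re-parameterization and solver differ.

import numpy as np

def unactuated_basis(torque_traj, k):
    """Pick the k DOFs with the smallest mean |torque| over the input motion
    and return an n-by-k selection matrix P spanning that subspace.
    (Heuristic stand-in for the paper's re-parameterization.)"""
    mean_torque = np.mean(np.abs(torque_traj), axis=0)   # (n,)
    idx = np.argsort(mean_torque)[:k]                    # least-actuated DOFs
    n = torque_traj.shape[1]
    P = np.zeros((n, k))
    P[idx, np.arange(k)] = 1.0
    return P

def responsive_accel(M, c, f_ext, qddot_ref, P, w_track=1.0):
    """Solve for accelerations that satisfy the equations of motion projected
    into the near-unactuated coordinates,
        P^T (M qddot + c) = P^T f_ext   (no actuation in that subspace),
    while staying close to the reference accelerations qddot_ref.
    Formulated as an equality-constrained least-squares (KKT) solve."""
    n, k = P.shape
    A = P.T @ M                       # (k, n) constraint matrix
    b = P.T @ (f_ext - c)             # (k,)  constraint right-hand side
    # minimize w_track * ||qddot - qddot_ref||^2  subject to  A qddot = b
    H = 2.0 * w_track * np.eye(n)
    g = -2.0 * w_track * qddot_ref
    KKT = np.block([[H, A.T], [A, np.zeros((k, k))]])
    rhs = np.concatenate([-g, b])
    sol = np.linalg.solve(KKT, rhs)
    return sol[:n]                    # responsive accelerations

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, k = 12, 6                                  # DOF count, subspace size (toy values)
    torque_traj = rng.normal(size=(200, n)) * np.linspace(0.1, 2.0, n)
    P = unactuated_basis(torque_traj, k)
    M = np.eye(n) + 0.1 * np.diag(rng.random(n))  # toy SPD mass matrix
    c = 0.01 * rng.normal(size=n)                 # toy Coriolis/gravity term
    f_ext = np.zeros(n); f_ext[0] = 5.0           # a small push on one DOF
    qddot_ref = np.zeros(n)                       # reference accelerations from the input motion
    print(responsive_accel(M, c, f_ext, qddot_ref, P))

In this toy setup the character follows the input motion exactly until a force arrives; the force then propagates only through the coordinates that the input motion barely actuates, which mirrors the abstract's claim that responses can be generated without explicitly computing joint actuations.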

Paper

Animating Responsive Characters with Dynamic Constraints in Near-Unactuated Coordinates
Yuting Ye, C. Karen Liu
ACM Trans. Graph. (SIGGRAPH Asia 2008) 27(5), Article 112.
Author preprint · ACM DL (Author-Izer service) · BibTeX · Video demo (37.1 MB)

Acknowledgements

The authors would like to thank Satoru Ishigaki, Wei Liu, and Shuang Hao for their help in collecting motion data. This work was supported by NSF grant CCF-CISE 0742303.

Project Members

Yuting Ye
C. Karen Liu