Yuting Ye
Ph.D.
Researcher
yutingye.public AT gmail.com

Yi Zhou, Chenglei Wu, Zimo Li, Chen Cao, Yuting Ye, Jason Saragih, Hao Li, Yaser Sheikh. 2020. Fully Convolutional Mesh Autoencoder using Efficient Spatially Varying Kernels. NeurIPS
[ Project       Paper     Code ]
Mesh convolution is challenging due to irregular local connectivity. We proposed a novel convolution operator with globally shared weights and spatially varying local coefficients.
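The idea of shared weight bases blended by per-vertex coefficients can be sketched roughly as follows. This is a hedged illustration under my own assumptions, not the paper's implementation; the names `vertex_conv`, `weight_bases`, and `coeffs` are invented for this sketch:

```python
import numpy as np

def vertex_conv(features, neighbors, weight_bases, coeffs):
    """Toy spatially varying mesh convolution.

    features:     (V, C_in)  per-vertex input features
    neighbors:    list of neighbor index lists, one per vertex (irregular degree)
    weight_bases: (M, C_in, C_out) globally shared weight matrices
    coeffs:       per-vertex (degree, M) arrays of local mixing coefficients
    """
    V = features.shape[0]
    C_out = weight_bases.shape[2]
    out = np.zeros((V, C_out))
    for v in range(V):
        for j, n in enumerate(neighbors[v]):
            # Blend the shared bases with coefficients local to this edge,
            # then apply the blended matrix to the neighbor's feature.
            W = np.tensordot(coeffs[v][j], weight_bases, axes=1)  # (C_in, C_out)
            out[v] += features[n] @ W
    return out
```

Because only the small coefficient arrays vary per vertex while the weight bases are shared globally, the parameter count stays independent of the mesh's irregular connectivity.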

Sophie Jörg, Yuting Ye, Michael Neff, Franziska Mueller, Victor Zordan. 2020. Virtual hands in VR: motion capture, synthesis, and perception. SIGGRAPH and SIGGRAPH Asia Courses. (Materials available upon request)
In this course, we cover state-of-the-art methods in generating hand motions in virtual reality. Topics include a survey of motion capture hardware and algorithms, hand animation with or without physics, and how hands are perceived in VR.

Shangchen Han, Beibei Liu, Randi Cabezas, Christopher D Twigg, Peizhao Zhang, Jeff Petkau, Tsz-Ho Yu, Chun-Jung Tai, Muzaffer Akbay, Zheng Wang, Asaf Nitzan, Gang Dong, Yuting Ye, Lingling Tao, Chengde Wan, Robert Wang. 2020. MEgATrack: monochrome egocentric articulated hand-tracking for virtual reality. SIGGRAPH
[ Project       Paper     Video ]
Hand tracking technology shipped on Oculus standalone headsets.

Julian Habekost, Takaaki Shiratori, Yuting Ye and Taku Komura. 2020. Learning 3D Global Human Motion Estimation from Unpaired, Disjoint Datasets. BMVC
[ Project       Paper     Video ]
Natural human motions are constrained to maintain contacts and a consistent body size over time. These constraints provide useful information for estimating global human motion from monocular images, and such a prior can be learned independently from a generic motion database.

Yikang Li, Chris Twigg, Yuting Ye, Lingling Tao, Xiaogang Wang. 2019. Disentangling Pose from Appearance in Monochrome Hand Images. ICCV Hands Workshop
[ Project       Paper ]
We explored self-supervised learning to disentangle hand pose from appearance in monochrome images using unpaired data. We showed that the disentanglement makes pose estimation more robust to appearance variations.

Ryan Canales, Aline Normoyle, Yu Sun, Yuting Ye, Massimiliano Di Luca, Sophie Jörg. 2019. Virtual Grasping Feedback and Virtual Hand Ownership. ACM SAP
[ Project       Paper     Video ]
We experimented with different visual styles for grasping virtual objects in VR. We found that showing the real hand poses with object penetration helps accomplish grasping more efficiently, but users prefer physically consistent penetration-free poses.

Lorraine Lin, Aline Normoyle, Alexandra Adkins, Yun Sun, Andrew Robb, Yuting Ye, Max Di Luca, Sophie Joerg. 2019. The Effect of Hand Size and Interaction Modality on the Virtual Hand Illusion. IEEE VR
[ Project       Paper     Video ]
Our VR experiment found that a matching hand size combined with dexterous finger tracking is preferable for most users, but other conditions still provide a fun experience.

Abhronil Sengupta, Yuting Ye, Robert Wang, Chiao Liu, Kaushik Roy. 2019. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures. Frontiers in Neuroscience
[ Project       Paper ]
Deep spiking neural networks are difficult to train. We instead convert deep neural networks from the analog domain to the spiking domain, with a small loss in accuracy.
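The conversion idea can be illustrated with a toy integrate-and-fire neuron whose firing rate over a time window approximates the ReLU activation of the corresponding analog layer. A hedged sketch only, with reset-by-subtraction dynamics; `if_layer`, `T`, and `threshold` are names invented here, not the paper's code:

```python
import numpy as np

def if_layer(inputs, weights, T=200, threshold=1.0):
    """Toy integrate-and-fire layer under rate coding.

    The analog input is injected as a constant current each time step;
    the spike count over T steps, divided by T, approximates a ReLU of
    the weighted input, clipped to [0, 1].
    """
    membrane = np.zeros(weights.shape[1])
    spikes = np.zeros(weights.shape[1])
    for _ in range(T):
        membrane += inputs @ weights       # integrate the input current
        fired = membrane >= threshold
        spikes += fired
        membrane[fired] -= threshold       # reset by subtraction, not to zero
    return spikes / T                      # firing rate in [0, 1]
```

For example, a weighted input of 0.5 crosses a unit threshold every other step, so the neuron fires at rate 0.5, while a negative input never fires, mimicking the ReLU nonlinearity.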

Shangchen Han, Beibei Liu, Robert Wang, Yuting Ye, Chris D. Twigg, Kenrick Kin. 2018. Online Optical Marker-based Hand Tracking with Deep Labels. ACM Trans. Graph. (SIGGRAPH)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video     Data ]
We developed a deep-learning-based marker labeling algorithm for online capture of complex and dexterous hand motions, including hand-hand and hand-object interactions.

Jeongseok Lee, Michael X. Grey, Sehoon Ha, Tobias Kunz, Sumit Jain, Yuting Ye, Siddhartha S. Srinivasa, Mike Stilman, C. Karen Liu. 2018. DART: Dynamic Animation and Robotics Toolkit. The Journal of Open Source Software
[ Project Github     Paper     BibTeX ]

Daniel Zimmermann, Stelian Coros, Yuting Ye, Bob Sumner, Markus Gross. 2015. Hierarchical Planning and Control for Complex Motor Tasks. SCA '15 Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
We presented a planning and control framework that utilizes a hierarchy of dynamic models with increasing complexity and accuracy for complex tasks.

Stéphane Grabli, Kevin Sprout, Yuting Ye. 2015. Feature-Based Texture Stretch Compensation for 3D Meshes. ACM SIGGRAPH Talks.
[ Paper     Preprint     ACM DL     BibTeX ]
Stretching of rigid features on deforming skins usually betrays the synthetic nature of a CG creature. Our method mitigates such effects by re-parameterizing the texture coordinates to compensate for unwanted deformations on the 3D mesh.

Rachel Rose, Yuting Ye. 2015. Multi-resolution Geometric Transfer for Jurassic World. ACM SIGGRAPH Talks.
[ Paper     Preprint     ACM DL     BibTeX ]
We developed an easy-to-use workflow for artists to transfer geometric data between different mesh resolutions. Our tool enables efficient development of highly detailed creature assets such as the dinosaurs in "Jurassic World".

Hao Li, Jihun Yu, Yuting Ye, Chris Bregler. 2013. Realtime Facial Animation with On-the-fly Correctives. ACM Trans. Graph. (SIGGRAPH)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
We developed a realtime facial tracking and retargeting system using an RGBD sensor. Our system produces accurate tracking results by continuously adapting to user specific expressions on-the-fly.

Kiran Bhat, Rony Goldenthal, Yuting Ye, Ronald Mallet, Michael Koperwas. 2013. High Fidelity Facial Animation Capturing and Retargeting With Contours. SCA '13 Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation.
[ Paper  Preprint    BibTeX    Video ]
We developed a facial animation tracking system that utilizes eyelid and lip silhouettes for high-fidelity results.

Sehoon Ha, Yuting Ye, C. Karen Liu. 2012. Falling and Landing Motion Control for Character Animation. ACM Trans. Graph. (SIGGRAPH Asia)
[ Paper     Preprint     ACM DL     BibTeX     Video ]
We developed a general controller that allows a character to fall from a wide range of heights and initial velocities, continuously roll on the ground, and get back on feet.

Yuting Ye, C. Karen Liu. 2012. Synthesis of Detailed Hand Manipulations Using Contact Sampling. ACM Trans. Graph. (SIGGRAPH)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
This work synthesizes detailed and physically plausible hand-object manipulations from captured motions of the full-body and objects. By sampling contact points between the hand and the object, we can efficiently discover complex finger gaits.

Yuting Ye, C. Karen Liu. 2010. Optimal Feedback Control for Character Animation Using an Abstract Model. ACM Trans. Graph. (SIGGRAPH)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
We presented an optimal feedback controller for a virtual character to follow a reference motion under physical perturbations and changes in the environment by replanning long-term goals and adjusting the motion timing on-the-fly.

Yuting Ye, C. Karen Liu. 2010. Synthesis of Responsive Motion Using a Dynamic Model. Computer Graphics Forum (Eurographics)
[ Project     Paper     Preprint     EG     BibTeX     Video ]
We presented a nonlinear dimensionality reduction model for learning responsive behaviors from very few motion capture examples. Our model can synthesize physically plausible motions of a character responding to unexpected perturbations.

Sumit Jain, Yuting Ye, C. Karen Liu. 2009. Optimization-Based Interactive Motion Synthesis. ACM Trans. Graph. (TOG)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
We presented a physics-based approach to synthesizing motions of a responsive virtual character in a dynamically varying environment. A constrained optimization problem that encodes high-level kinematic control strategies is solved at every time step.

Yuting Ye, C. Karen Liu. 2008. Animating Responsive Characters with Dynamic Constraints in Near-Unactuated Coordinates. ACM Trans. Graph. (SIGGRAPH Asia)
[ Project     Paper     Preprint     ACM DL     BibTeX     Video ]
We developed a novel algorithm to animate physically responsive virtual characters by combining kinematic pose control with dynamic constraints in the joint actuation space.


Sumit Jain, Yuting Ye, C. Karen Liu. 2007. Optimization-based Interactive Motion Synthesis for Virtual Characters. ACM SIGGRAPH Sketches and Posters. (Third place in Student Research Competition)
[ Sketch    Poster  ACM DL   BibTeX     Video ]

Theses


Simulation of characters with natural interactions
Ph.D. thesis
Georgia Institute of Technology, 2012

A Momentum-based Bipedal Balance Controller
Master's project
University of Virginia, 2006
Presentation (ppt, 1.09M)        Video (zip, 3.4M)

An Interactive 2D Vector Graphics Editing System
Undergraduate thesis
Peking University, 2004