Tools · 2018

Motion Capture System for VR

Low-fidelity motion capture using HTC Vive for rapid character animation iteration


Tech Stack

Unity3D · Inverse Kinematics · Kalman Filter · Animation · VR/AR

Overview

Professional motion capture systems are expensive and require dedicated studio space, creating a bottleneck in the animation pipeline for VR and AR projects. This project implemented a low-fidelity but rapid motion capture system using consumer VR hardware — HTC Vive headset, controllers, and body trackers — to drive humanoid 3D character models from the user's motion in real time.

Process & Approach

The system maps tracking data from the Vive's lighthouse-tracked devices to a humanoid character rig using inverse kinematics. Raw tracking data is filtered through a Kalman filter to reduce jitter while preserving the responsiveness needed for natural-feeling animation. The captured motion can be recorded, played back, and exported for use in VR, AR, or traditional 3D applications. The focus was on speed of iteration rather than production-quality fidelity — allowing animators to quickly block out movement, test ideas, and evaluate animations in-context.
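The jitter-reduction step can be illustrated with a minimal sketch (my assumption, not the project's actual code): a scalar random-walk Kalman filter applied independently to each axis of each tracked device, where the process-noise parameter `q` and measurement-noise parameter `r` embody the smoothness-vs-latency trade-off discussed above.

```python
class ScalarKalman:
    """Hypothetical per-axis smoothing filter for lighthouse tracking data."""

    def __init__(self, q=0.001, r=0.1, x0=0.0):
        self.q = q    # process noise: higher = more responsive, less smooth
        self.r = r    # measurement noise: higher = smoother, more latency
        self.x = x0   # current state estimate
        self.p = 1.0  # estimate covariance

    def update(self, z):
        # Predict: random-walk model, so uncertainty grows by q each frame.
        self.p += self.q
        # Update: blend prediction with measurement z via the Kalman gain k.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x


# One filter per axis of each tracked device, fed every tracking frame.
fx = ScalarKalman()
smoothed = [fx.update(z) for z in [1.02, 0.98, 1.01, 0.99, 1.00]]
```

In practice a constant-velocity state model would track fast hand motion with less lag than this position-only sketch, at the cost of two state variables per axis.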

Key Features

  • Real-time body tracking from HTC Vive HMD, controllers, and trackers
  • Kalman filter for noise reduction on tracking data
  • IK-driven humanoid character animation
  • Motion recording and playback system
  • Export pipeline for VR, AR, and 3D applications
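The recording and playback feature listed above might look roughly like the following sketch (a hypothetical illustration, not the project's code): frames are stored as timestamped dictionaries of tracker poses and resampled on playback by linear interpolation between the two nearest frames.

```python
import bisect
from dataclasses import dataclass, field


@dataclass
class MocapRecorder:
    """Hypothetical record/playback store for sparse tracker poses."""
    times: list = field(default_factory=list)   # monotonically increasing timestamps
    frames: list = field(default_factory=list)  # {tracker_role: (x, y, z)} per frame

    def record(self, t, poses):
        self.times.append(t)
        self.frames.append(poses)

    def sample(self, t):
        # Clamp queries outside the recorded range to the first/last frame.
        if t <= self.times[0]:
            return self.frames[0]
        if t >= self.times[-1]:
            return self.frames[-1]
        i = bisect.bisect_right(self.times, t)
        t0, t1 = self.times[i - 1], self.times[i]
        w = (t - t0) / (t1 - t0)
        lerp = lambda a, b: tuple(av + w * (bv - av) for av, bv in zip(a, b))
        return {role: lerp(self.frames[i - 1][role], self.frames[i][role])
                for role in self.frames[i]}


rec = MocapRecorder()
rec.record(0.0, {"head": (0.0, 1.7, 0.0)})
rec.record(1.0, {"head": (2.0, 1.7, 0.0)})
midpoint = rec.sample(0.5)  # interpolated head pose at t = 0.5
```

Interpolating positions linearly is fine for a sketch; orientations would need quaternion slerp to avoid artifacts on playback.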

Technical Challenges

Consumer VR tracking hardware has limited body coverage compared to professional mocap suits — only the head, hands, and (with optional trackers) feet are tracked. Inferring convincing full-body motion from these sparse inputs required tuning the IK solver and adding procedural animation for the spine, shoulders, and secondary motion. The Kalman filter parameters needed careful tuning to balance smoothness against latency.
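The spine inference described above can be sketched as follows (a simplified assumption about the approach, not the actual solver): with only the head and hip tracked, intermediate spine joints are placed by interpolating between the two, with a small bend offset that peaks mid-spine so the column does not read as perfectly rigid.

```python
def procedural_spine(hip, head, n_joints=4, bend=0.05):
    """Hypothetical placement of untracked spine joints between two
    tracked endpoints (hip and head), each given as an (x, y, z) tuple."""
    joints = []
    for i in range(1, n_joints + 1):
        w = i / (n_joints + 1)
        # Linear interpolation between the tracked hip and head positions.
        pos = [h + w * (d - h) for h, d in zip(hip, head)]
        # Parabolic forward bend: zero at both ends, maximal at the midpoint.
        pos[2] += bend * 4.0 * w * (1.0 - w)
        joints.append(tuple(pos))
    return joints


# Three inferred spine joints between a hip at the origin and a head 1 m up.
spine = procedural_spine((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), n_joints=3)
```

A production version would also blend in shoulder orientation from the controller positions and layer secondary motion on top, as the text notes.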

Impact & Learnings

The system reduced the animation iteration cycle from hours (with traditional keyframe or outsourced mocap) to minutes, enabling rapid prototyping of character animations for XR projects. It proved particularly valuable for testing spatial interactions and evaluating animation quality directly within VR environments.