Ti5 Robotics uses Xsens inertial motion capture to speed humanoid robot motion modeling by 30% and boost complex action success by 40%+, enabling industrial-grade precision, including ±2 mm placement in narrow aisles.
Challenge: Ti5 needed a faster path to robust humanoid robot motion development, avoiding long cycles of manual modeling and tuning, while maintaining performance that holds up in real industrial environments.
Solution: Ti5 integrated Xsens inertial motion capture into a closed-loop R&D workflow, connecting motion data capture to motion mapping, control development, testing, and optimization.
Key outcomes:
- ~30% reduction in the modeling cycle for core robotic motions
- 40%+ higher success rates for complex actions such as precise grasping and gait switching
- Material placement precision within ±2 mm in narrow workshop aisles
Ti5 Robotics is a leading Chinese robotics company developing high-precision humanoid robot hardware. Its flagship “T” series includes industrial collaborative models and commercial service robots. The focus is simple: flexible motion, real-world adaptability, and precise task execution.
These robots are built for real environments. Think material handling and assembly in smart manufacturing, customer-facing roles like retail guidance and venue reception, and R&D platforms for universities and research labs.
Why motion capture
Ti5 chose motion capture to start from real human movement. It provides high-quality, multidimensional motion trajectories, including the small adjustments skilled workers make without thinking.
With motion capture in the loop, Ti5 can build complex motions faster, reduce modeling errors, and iterate using data feedback. This helps the robot move naturally while still meeting industrial-grade precision targets.
Why Xsens
Ti5 learned about Xsens through peer recommendations and technical research, then selected Xsens as the motion capture foundation for humanoid robot development.
The decision was driven by trust in Xsens’ high-quality, precise, and reliable motion data, and its stability in demanding environments. Ti5 highlights: “Xsens’ low integration complexity enables rapid embedding into our robotics development workflow without complex adaptation or debugging.”
How Ti5 uses Xsens in their humanoid robot workflow
Xsens helps Ti5 connect motion capture to robot control development in a closed-loop process, so motion data is not only captured, but directly improves robot behavior.
Technical workflow:
- Capture: Xsens inertial sensor modules are deployed on reference human models and at key robot joint locations to capture motion synchronously.
- Process: Data acquisition, data cleaning and fusion via Xsens algorithms, import into Ti5’s motion control platform, motion mapping and algorithm training, prototype testing and optimization.
- Real-time link: Motion data is synchronized to the robot control system through a dedicated interface, with latency controlled at the millisecond level.
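To make the capture-to-control loop above concrete, here is a minimal sketch of the kind of retargeting step such a pipeline needs: captured joint orientations (quaternions) are mapped to robot joint targets, and stale samples are dropped to respect a millisecond-level latency budget. This is a hypothetical illustration, not Ti5’s actual integration or the Xsens SDK; the joint names, mapping table, and 5 ms budget are assumptions for the example.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class JointSample:
    name: str        # captured human joint, e.g. "RightElbow"
    quat: tuple      # unit quaternion (w, x, y, z) orientation
    t: float         # capture timestamp, seconds

def quat_to_pitch(q):
    """Extract the pitch angle (rotation about Y) from a unit quaternion."""
    w, x, y, z = q
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    return math.asin(s)

# Hypothetical human-to-robot joint mapping with per-joint scaling,
# standing in for the motion-mapping step described above.
JOINT_MAP = {
    "RightElbow": ("r_elbow_pitch", 1.0),
    "RightKnee": ("r_knee_pitch", 0.9),
}

def retarget(samples, now, max_latency=0.005):
    """Map captured joint orientations to robot joint angle targets,
    skipping samples older than the latency budget (5 ms here) so the
    robot never acts on stale motion data."""
    targets = {}
    for s in samples:
        if now - s.t > max_latency:
            continue  # stale sample: drop it instead of commanding old motion
        if s.name in JOINT_MAP:
            robot_joint, scale = JOINT_MAP[s.name]
            targets[robot_joint] = scale * quat_to_pitch(s.quat)
    return targets

if __name__ == "__main__":
    now = time.time()
    stream = [
        JointSample("RightElbow", (0.9659, 0.0, 0.2588, 0.0), now - 0.001),  # fresh, ~30 deg
        JointSample("RightKnee", (1.0, 0.0, 0.0, 0.0), now - 0.050),         # too old, dropped
    ]
    print(retarget(stream, now))
```

In a real deployment the sample stream would arrive over the dedicated interface mentioned above, and the targets would feed the robot’s joint controllers; the staleness check is one simple way to honor a millisecond-level latency requirement.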
The results Ti5 reports with Xsens
Ti5 points to measurable impact in both development speed and robot performance: “We reduced the modeling cycle for core robotic motions by approximately 30%, significantly lowering R&D investment.” A shorter modeling cycle means the team can run more iterations, stabilize behaviors faster, and expand the motion library across tasks sooner.
Ti5 also reports a 40% or greater increase in success rates for complex actions such as precise grasping and gait switching. These motions are sensitive; small timing or coordination errors often turn into failures.
Finally, Ti5 highlights faster robot response to environmental changes, supported by low-latency motion data synchronization. This helps the robot adjust more confidently when conditions shift in real facilities.
In an industrial material handling trial, Ti5 used Xsens to capture skilled workers performing load-bearing walking and precise placement. After implementation, Ti5 reports improved obstacle avoidance and material placement precision within ±2 mm in narrow workshop aisles, supporting precision manufacturing requirements where small errors can result in rework and downtime.
What’s next
In the next 1 to 2 years, Ti5 plans to scale deliveries of industrial-grade humanoids and expand into 10+ commercial service scenarios. They expect richer motion data to help robots learn more complex movements as deployments grow.
Over the next 3 to 5 years, Ti5 aims to build a universal humanoid platform for cross-industry use. As multi-sensor fusion and motion capture advance, they expect stronger full-scenario adaptability across indoor and outdoor tasks.
Takeaway
When you are building humanoid robots for the real world, motion is not just a feature, it is the foundation. Ti5 Robotics uses Xsens motion capture to turn skilled human movement into robot-ready motion intelligence, helping shorten development cycles, improve success rates on complex actions, and reach industrial-grade precision where it counts.
FAQ
Why use motion capture for humanoid robot training?
Motion capture provides high-quality, multidimensional motion trajectories based on how skilled workers actually move. That data helps robotics teams reproduce complex motion logic faster, reduce trial-and-error in manual modeling, and iterate quickly using real feedback.
What results can motion capture improve in industrial humanoids?
Motion capture can improve development speed, motion realism, and task success on complex behaviors that require coordinated joint timing, balance, and transitions. In industrial settings, this can translate into more consistent execution, higher success rates on complex actions, and better performance in constrained environments.
How does Xsens integrate into robot control workflows?
Ti5 integrates Xsens by capturing motion data on reference human models, processing the data through cleaning and fusion, and importing it into their motion control platform for motion mapping and algorithm training. The workflow closes the loop through prototype testing and optimization, with real-time synchronization to the robot control system and millisecond-level latency control.