
Human Machine Interaction

Human-Machine Interaction (HMI) is the study of interactions between humans and machines. It is a multidisciplinary field drawing on Human-Computer Interaction (HCI), Human-Robot Interaction (HRI), robotics, Artificial Intelligence (AI), humanoid robots and exoskeleton control.

Overview

Inertial Measurement Units (IMUs) and HMI

Xsens Inertial Measurement Units (IMUs) provide accurate and reliable 3D orientation, 3D acceleration, 3D rate of turn and 3D magnetic field data. For HMI purposes, an IMU can be used as a stand-alone product or combined with multiple synchronized IMUs. The MTw Awinda consists of a number of 3D motion trackers of your choice plus a Software Development Kit (SDK). The SDK gives access to the 3D orientation, 3D acceleration, 3D rate of turn and 3D magnetic field data of every motion tracker. Using a dedicated sensor fusion algorithm, Xsens calculates a highly accurate 3D orientation for each motion tracker, and a proprietary wireless radio protocol keeps the trackers time-synchronized to within 10 microseconds. This makes the MTw Awinda a flexible solution: you choose how many MTws to integrate into your application.
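The exact fusion algorithm Xsens uses is proprietary, but the general idea of combining rate-of-turn and acceleration data into a drift-corrected orientation can be illustrated with a minimal single-axis complementary filter. The Python sketch below is only an illustration of that principle; the function name, the 0.98 blend factor and the sample values are assumptions, and a real Xsens tracker additionally fuses magnetometer data to estimate full 3D orientation.

import math

def complementary_filter(gyro_rate, accel, prev_angle, dt, alpha=0.98):
    # Integrate the rate of turn (rad/s) to propagate the previous angle,
    # then blend in the roll angle implied by the measured gravity vector
    # to suppress gyroscope drift. Single axis only, for illustration.
    gyro_angle = prev_angle + gyro_rate * dt
    accel_angle = math.atan2(accel[1], accel[2])  # roll from accelerometer (rad)
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Hypothetical 100 Hz samples from one tracker: (gyro x in rad/s, accel in m/s^2)
samples = [(0.010, (0.0, 0.1, 9.8)), (0.012, (0.0, 0.2, 9.8)), (0.009, (0.0, 0.2, 9.8))]
angle = 0.0
for gyro_x, accel in samples:
    angle = complementary_filter(gyro_x, accel, angle, dt=0.01)
print("estimated roll (rad):", angle)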


Motion Capture and HMI

Xsens MVN Analyze provides motion capture of human movement without constraining freedom of movement and without occlusion problems. Set-up and calibration are fast, so data acquisition can start within 10 minutes. The MVN Analyze software can output and live-stream human kinematic data as well as raw motion tracker data, so this data can be used for real-time control of robots, exoskeletons and humanoids. Low-latency live streaming of joint positions, accelerations, angular velocities, 3D orientations and the center of mass is directly available within the software.
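As a sketch of how such a live stream could be consumed on the receiving side, the Python snippet below opens a UDP socket and hands each incoming datagram to a user-supplied handler. This is a generic listener, not the MVN network streamer API itself: the port number 9763 and the assumption that data arrives as UDP datagrams should be checked against the network streamer settings and packet format documented for your MVN Analyze version.

import socket

UDP_HOST, UDP_PORT = "0.0.0.0", 9763  # assumed port; configure to match MVN Analyze

def receive_stream(handle_packet):
    # Receive raw datagrams from the motion capture PC and forward the payload.
    # Decoding joint positions, orientations, etc. depends on the packet format
    # selected in MVN Analyze and is left to handle_packet.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((UDP_HOST, UDP_PORT))
    while True:
        payload, _addr = sock.recvfrom(8192)
        handle_packet(payload)

if __name__ == "__main__":
    receive_stream(lambda payload: print(len(payload), "bytes received"))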


Applications

The MTw Awinda and Xsens MVN Analyze allow the user to (live) stream 3D kinematic or IMU data to their own application. This data enables typical HMI techniques such as posture recognition, motion classification, motion segmentation, motion learning models, motion synchronization, command recognition and activity level recognition, as in the sketch below.
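As a minimal sketch of one such technique, the rule-based classifier below labels a streamed frame as standing, sitting or lying from two joint angles. The angle names, thresholds and example frames are purely illustrative assumptions; in practice, posture recognition on MVN data would use validated thresholds or a trained model over many more features.

def classify_posture(trunk_inclination_deg, knee_flexion_deg):
    # Toy rule-based posture recognition on two streamed joint angles.
    # Thresholds are illustrative, not validated.
    if trunk_inclination_deg > 60.0:
        return "lying"
    if knee_flexion_deg > 70.0:
        return "sitting"
    return "standing"

# Hypothetical frames: (trunk inclination, knee flexion) in degrees
for frame in [(5.0, 10.0), (12.0, 85.0), (80.0, 20.0)]:
    print(classify_posture(*frame))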


Cases for Human Machine Interaction

RIF e.V. – The Institute for Research and Transfer applies MVN BIOMECH to analyse manual work processes and to simulate hybrid work processes with direct human-robot collaboration.
Beaming – Combining virtual reality, tracking and teleoperation technology.
Robo Sally – Remote-controlled bomb disposal robot.
Lower Limb Prostheses – Study of the effects of intent recognition errors on neural control of powered lower limb prostheses.
Mahru – Real-time control of a robot with Xsens MVN by the Korea Institute of Science and Technology (KIST).
Humanoid Rollin' Justin – MTi helps in predicting a thrown ball’s trajectory, by the DLR’s Institute of Robotics and Mechatronics.
FLAME – Stabilization of a walking robot by the Delft University of Technology.
