The rapid development of computer vision and artificial intelligence in recent years has expanded the application areas of autonomous robots. However, an autonomous system may not provide a sufficient degree of reliability for tasks of increased complexity and in dynamic environments, so human intelligence is frequently employed for high-level robot decision making and control. Nonetheless, human presence at the robot site is undesirable or impossible in many cases, such as space or undersea exploration, hazardous material handling, disarmament of improvised explosive devices, or work in nuclear facilities during emergencies. Moreover, the way a user interfaces with a robotic system imposes additional requirements for highly sensitive tasks in which a failure could have catastrophic consequences. Finally, tedious operation performed by a human operator, and the resulting possible degradation of work quality, also contributes to the system requirements list.
The aforementioned constraints and requirements have motivated extensive discussion and numerous studies. Our project is likewise motivated to overcome these problems.
Inertial Motion Capture Based Reference Generation
In this work, the Xsens MVN full-body inertial motion capture (IMC) system was used to control a KUKA youBot mobile manipulator. Xsens MVN consists of 17 inertial motion trackers attached to the body with straps and provides real-time full-body kinematic data. The KUKA youBot is an omnidirectional mobile platform on mecanum wheels that accommodates a five-degrees-of-freedom robot arm, a serial chain of five revolute joints with a two-finger gripper as the end effector. Both the IMC system and the mobile manipulator communicate with the main control computer wirelessly.
Fig. 2. Inertial motion capture based teleoperation system consisting of the gesture recognition and reference generation subsystems.
The user kinematic data is acquired by the IMC system at a 50 Hz sampling rate. To create an intuitive mapping between the body segments and the mobile manipulator parts, we decided to use the right-hand position and orientation to generate the robot end-effector position and orientation references. Inverse kinematics for the manipulator arm is computed at each sampling time to convert the hand position and orientation to robot joint angles. The body center of mass (CoM) position and orientation are used to provide position and orientation references for the omnidirectional base.
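The per-sample conversion from hand pose to joint angles can be sketched as an iterative inverse-kinematics update. The text does not specify which IK method was used, so the damped least-squares step below is an illustrative assumption (a common, singularity-robust choice); the manipulator Jacobian is supplied by the caller.

```python
import numpy as np

SAMPLE_RATE_HZ = 50  # IMC sampling rate stated in the text

def dls_ik_step(q, pose_error, jacobian, damping=0.05):
    """One damped-least-squares IK iteration (illustrative; the actual
    solver used in the system is not stated in the text).

    q          -- current joint angles of the arm
    pose_error -- desired minus current end-effector pose (task-space)
    jacobian   -- callable returning the manipulator Jacobian at q
    """
    J = jacobian(q)
    # dq = J^T (J J^T + lambda^2 I)^{-1} e  (damped pseudoinverse)
    dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(J.shape[0]),
                               pose_error)
    return q + dq
```

At each 50 Hz sample the hand pose from the IMC suit would define `pose_error`, and one or a few such steps track the reference in real time.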
Besides reference generation and mobile manipulator control, a user should be able to perform high-level robot teleoperation. For this purpose, high-level control of the mobile manipulator based on left-arm gesture recognition was developed. The primary objective of a gesture recognition system for robots is to make a robot understand human body language (or commands associated with it), bridging the gap between the human and the machine. A gesture recognition system incorporating inertial motion capture applies the natural user interface concept to machine control. This is expected to ease the user learning process, allow tasks to be performed with the required level of accuracy, and provide stable performance in tedious and tiring tasks.
Fig. 3. Implemented gestures: neutral (a), hand sideways (b), hand on waist (c), hand up (d), hand forward (e), hand on chest (f).
Six distinct left-arm gestures were selected for providing high-level commands to the teleoperation system: hand forward, hand on chest, hand on waist, hand up, hand sideways, and neutral. Classification accuracy on the testing set is 95.6 percent.
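The text does not state which classifier was trained on the left-arm data, so the sketch below uses a nearest-centroid rule purely for illustration: each gesture is represented by the mean feature vector (e.g. left-arm joint angles) of its training samples, and a new sample is assigned to the nearest centroid.

```python
import numpy as np

# The six gestures listed in the text
GESTURES = ("neutral", "hand sideways", "hand on waist",
            "hand up", "hand forward", "hand on chest")

def classify_gesture(features, centroids):
    """Nearest-centroid classification (an assumed method, not the one
    reported in the text). `centroids` maps gesture name -> mean feature
    vector of left-arm measurements from training data."""
    return min(centroids,
               key=lambda g: np.linalg.norm(features - centroids[g]))
```

A real pipeline would also window and filter the 50 Hz IMC stream before classification; those details are omitted here.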
A set of experiments was designed to evaluate the performance of the system. The main experiment, a real-time teleoperation scenario, was designed to test both the reference trajectory generation and the gesture recognition system. According to the experimental scenario, a human user guides the robot through a course with static obstacles and then performs a pick-and-place task using the manipulator's end effector.
The six gestures were associated with particular commands; some of them toggle system states such as "Manipulator On/Off". The arm gestures and their corresponding commands are as follows:
• Hand forward gesture activates/deactivates the mobile platform.
• Hand up gesture activates/deactivates the manipulator; deactivation moves the manipulator to its initial position.
• Hand on chest gesture halts the manipulator at the last acquired position or resumes it from that position.
• Hand sideways gesture allows a user to switch between different manipulator motion modes.
• Hand on waist gesture is used to exit a selected mode.
• Neutral gesture does nothing, so that a user can rest his/her left hand.
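The gesture-to-command logic above amounts to a small state machine with toggled states. The sketch below follows the bullet list directly; the attribute names, the motion-mode names, and the homing placeholder are assumptions for illustration only.

```python
class TeleopStateMachine:
    """Gesture-to-command logic following the list above (field names and
    mode names are illustrative assumptions, not from the text)."""

    MOTION_MODES = ("mode_a", "mode_b")  # hypothetical manipulator motion modes

    def __init__(self):
        self.platform_on = False
        self.manipulator_on = False
        self.manipulator_halted = False
        self.mode_index = 0
        self.mode_selected = False

    def handle(self, gesture):
        if gesture == "hand forward":          # toggle mobile platform
            self.platform_on = not self.platform_on
        elif gesture == "hand up":             # toggle manipulator
            self.manipulator_on = not self.manipulator_on
            if not self.manipulator_on:
                self.move_to_initial_position()
        elif gesture == "hand on chest":       # halt/resume at last position
            self.manipulator_halted = not self.manipulator_halted
        elif gesture == "hand sideways":       # cycle motion modes
            self.mode_index = (self.mode_index + 1) % len(self.MOTION_MODES)
            self.mode_selected = True
        elif gesture == "hand on waist":       # exit the selected mode
            self.mode_selected = False
        # "neutral" intentionally does nothing, letting the left hand rest

    def move_to_initial_position(self):
        pass  # placeholder: would command the arm to its home configuration
```

Because recognized gestures arrive as discrete events, a debounce (e.g. requiring several consecutive identical classifications) would normally sit between the classifier and this state machine.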
A graphical user interface was developed to allow the user to visualize the recognized gestures. During the experiment, the operator performed a total of 25 gestures, all of which were recognized, giving 100 percent accuracy with no latency perceived by the user. The experimental task was accomplished successfully, with the system performing as expected. The user reported that the system was very intuitive and easy to learn, and stated that he would prefer the IMC interface over one consisting of a joystick and keyboard.
Please visit our YouTube channel to watch other ARMS Laboratory videos.