Vision-based Gesture Tracking for Teleoperating Mobile Manipulators


Title: Vision-based Gesture Tracking for Teleoperating Mobile Manipulators
Authors: Tianyu Wang (University of Leeds); Yuhui Wan (University of Leeds); Christopher Peers (University of Leeds); Jingcheng Sun (University of Leeds); Chengxu Zhou (University of Leeds)
Year: 2022
Citation: Wang, T., Wan, Y., Peers, C., Sun, J., Zhou, C. (2022). Vision-based Gesture Tracking for Teleoperating Mobile Manipulators. UKRAS22 Conference “Robotics for Unconstrained Environments” Proceedings, 52-53. doi: 10.31256/Bj3Qo8Y

Keywords: teleoperation, hand gesture recognition, mobile manipulator

Abstract:

In unconstrained environments, teleoperation is already found in consumer-level robots. Motion capture technology has been shown to narrow the skill gap between end users and robot teleoperation; however, the equipment, such as motion capture suits, is increasingly expensive and complex. This paper presents a vision-based method that uses a single camera to reduce costs and make teleoperation accessible to consumers. Hand coordinates are estimated from the camera images with Google MediaPipe, a hand pose estimation package, and teleoperation commands are composed of these coordinates and hand gestures. We conduct a simulation study in PyBullet on the Husky robotic platform equipped with a manipulator and demonstrate the control framework through various gestures.
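
The hand-tracking stage described in the abstract can be illustrated with a minimal Python sketch. The snippet below is not the authors' implementation: it uses Google MediaPipe Hands to extract the 21 hand landmarks from webcam frames, reads the normalised wrist coordinate, and applies a hypothetical open-hand/fist rule (comparing fingertip and PIP-joint heights) to derive a simple move/stop command. The paper's actual gesture-to-command mapping and the PyBullet Husky-manipulator interface are not reproduced here.

# Minimal sketch (not the authors' code): MediaPipe hand landmarks -> simple command.
# The gesture rule (open hand vs. fist) is an illustrative assumption.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def fingers_extended(landmarks):
    """Count roughly how many fingers are extended by comparing each
    fingertip's y-coordinate with its PIP joint (image y grows downward)."""
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    return sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            wrist = lm[0]                      # normalised [0, 1] image coordinates
            # Hypothetical mapping: open hand -> move, fist -> stop; the wrist
            # position could drive an end-effector or base velocity target.
            command = "move" if fingers_extended(lm) >= 4 else "stop"
            print(f"wrist=({wrist.x:.2f}, {wrist.y:.2f}) gesture={command}")
        if cv2.waitKey(1) & 0xFF == 27:        # Esc to quit
            break
cap.release()

In a full teleoperation pipeline along the lines of the paper, the printed command and wrist coordinate would instead be forwarded to the simulated Husky mobile manipulator in PyBullet as base and arm targets.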
