Exploring Rotated Object Detection Models for Antipodal Robotic Grasping

Title: Exploring Rotated Object Detection Models for Antipodal Robotic Grasping
Authors: Valerija Holomjova (University of Aberdeen); Pascal Meißner (University of Aberdeen)
Year: 2022
Citation: Holomjova, V., & Meißner, P. (2022). Exploring Rotated Object Detection Models for Antipodal Robotic Grasping. UKRAS22 Conference “Robotics for Unconstrained Environments” Proceedings, 62-63. doi: 10.31256/Sp7Gn7W

Grasping
Deep Learning for Robotics
Perception for Grasping
Computer Vision for Automation

Abstract:

Current deep learning approaches used by robotic grasping systems to predict multiple valid grasps across various objects from images have achieved strong results, but they often stem from object detectors that were originally designed for predicting horizontal bounding boxes. Since 2D grasp poses are more naturally represented by oriented bounding boxes, in this paper we explore the suitability of three top-performing rotated object detectors, as they are composed of modules tailored for encoding rotated object features more precisely. The performance of the oriented detectors is compared against an effective grasp detection model architecture from the literature on two publicly available grasping datasets. Results show that the oriented detectors obtain comparable grasp accuracy scores on both datasets, whilst being more capable of producing confident and diverse sets of grasps. Code is available at https://github.com/valerijah/exploring_rotated_object_detection_models.
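For readers unfamiliar with the representation referred to in the abstract, the sketch below illustrates how a 2D antipodal grasp (centre, rotation, gripper opening width, jaw size) maps onto the (cx, cy, w, h, θ) oriented bounding box that rotated object detectors predict. Parameter names, units, and the angle convention are illustrative assumptions, not taken from the paper's code.

```python
import math
from dataclasses import dataclass


@dataclass
class Grasp:
    """Antipodal grasp in the common 5-parameter rectangle form (assumed convention)."""
    x: float       # grasp centre, image x coordinate (pixels)
    y: float       # grasp centre, image y coordinate (pixels)
    theta: float   # gripper rotation in the image plane (radians)
    width: float   # gripper opening width (pixels)
    height: float  # jaw/finger size perpendicular to the opening (pixels)


def grasp_to_obb(g: Grasp) -> tuple:
    """Encode a grasp as the (cx, cy, w, h, angle) oriented bounding box
    used by rotated object detectors; angle conventions vary per detector."""
    return (g.x, g.y, g.width, g.height, g.theta)


def obb_corners(cx, cy, w, h, theta):
    """Corner points of the oriented box, e.g. for visualising a predicted grasp."""
    c, s = math.cos(theta), math.sin(theta)
    half = [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]
    # Rotate each corner offset and translate to the box centre.
    return [(cx + dx * c - dy * s, cy + dx * s + dy * c) for dx, dy in half]


if __name__ == "__main__":
    g = Grasp(x=160.0, y=120.0, theta=math.pi / 6, width=60.0, height=20.0)
    print(grasp_to_obb(g))
    print(obb_corners(*grasp_to_obb(g)))
```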

Download PDF