Towards a Dataset of Activities for Action Recognition in Open Fields
Title: Towards a Dataset of Activities for Action Recognition in Open Fields
Authors: Alexander Gabriel (Lincoln Centre for Autonomous Systems, University of Lincoln); Nicola Bellotto (Lincoln Centre for Autonomous Systems, University of Lincoln); Paul Baxter (Lincoln Centre for Autonomous Systems, University of Lincoln)
Year: 2019
Citation: Gabriel, A., Bellotto, N., & Baxter, P. (2019). Towards a Dataset of Activities for Action Recognition in Open Fields. UK-RAS19 Conference: “Embedded Intelligence: Enabling & Supporting RAS Technologies” Proceedings, 64-67. doi: 10.31256/UKRAS19.17

Abstract:

In an agricultural context, having autonomous robots that can work side-by-side with human workers provides a range of productivity benefits. For this to be achieved safely and effectively, these robots require the ability to understand a range of human behaviors in order to facilitate task communication and coordination. The recognition of human actions is a key part of this, and is the focus of this paper. Available datasets for action recognition generally feature controlled lighting and framing while recording subjects from the front. They mostly reflect good recording conditions but fail to model the data a robot will have to work with in the field, such as varying distance and lighting conditions. In this work, we propose a set of recording conditions, gestures, and behaviors that better reflect the environment an agricultural robot might find itself in, and we record a dataset with a range of sensors that demonstrates these conditions.