Smart wheelchairs: Visual perception pipeline to improve prediction of user intention

Title: Smart wheelchairs: Visual perception pipeline to improve prediction of user intention
Authors: Tomos Fearn (Aberystwyth University); Frederic Labrosse (Aberystwyth University); Patricia Shaw (Aberystwyth University)
Year: 2022
Citation: Fearn, T., Labrosse, F., & Shaw, P. (2022). Smart wheelchairs: Visual perception pipeline to improve prediction of user intention. UKRAS22 Conference “Robotics for Unconstrained Environments” Proceedings, 18-19. doi: 10.31256/Ad3Ko8T

Keywords: robotics, wheelchairs, semantic mapping, ROS

Abstract:

Wheelchairs aid people with physical disabilities by assisting with mobility, thus improving their independence. Autonomous assistance on wheelchairs is limited to prototypes that provide ‘smart functionality’ by completing tasks such as docking or terrain adaptation. The biggest constraint is navigating within dynamic environments, such as the home. This paper describes a data pipeline to automate the wheelchair navigation process: classifying an object, estimating the user’s intention from a verbal command (e.g. “take me to the fridge”) and navigating towards a goal. Object locations are registered within a map whilst contextual metadata is calculated. A combination of object classification confidence and the number of object instances is used to calculate the uniqueness of every identifiable object, which assists in predicting the user’s intention. For example, if a “go to the fridge” request is received, the wheelchair knows that the fridge is located within the kitchen, and therefore drives to the kitchen and then to the fridge. Results show that utilising contextual data reduces the likelihood of false-positive object detections being registered by the navigation pipeline, making the system more likely to interpret the user’s intention accurately.
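The abstract does not give the exact formula for the uniqueness score, only that it combines classification confidence with the number of registered instances of each object. The sketch below is one minimal, assumed interpretation: average detection confidence divided by instance count, so that a single high-confidence fridge scores as more unique than three frequently seen chairs. The function name and weighting are illustrative, not taken from the paper.

from collections import defaultdict

def uniqueness_scores(detections):
    """detections: (label, confidence) pairs accumulated in the semantic map."""
    confidences = defaultdict(list)
    for label, conf in detections:
        confidences[label].append(conf)
    scores = {}
    for label, confs in confidences.items():
        mean_conf = sum(confs) / len(confs)
        # Fewer registered instances -> higher uniqueness (assumed weighting).
        scores[label] = mean_conf / len(confs)
    return scores

# One fridge vs. three chairs: the fridge comes out markedly more unique.
print(uniqueness_scores([
    ("fridge", 0.92),
    ("chair", 0.85), ("chair", 0.78), ("chair", 0.81),
]))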
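The “drive to the kitchen and then the fridge” behaviour amounts to resolving a spoken goal hierarchically through the semantic map. A possible sketch, again with hypothetical data structures (the paper does not publish its map format), resolves an object label first to its containing room and then to the object’s pose:

# Hypothetical semantic map: each registered object carries the room it was
# observed in and its pose in map coordinates (structure is illustrative).
SEMANTIC_MAP = {
    "fridge": {"room": "kitchen", "pose": (4.2, 1.5)},
    "bed":    {"room": "bedroom", "pose": (9.0, 3.1)},
}
ROOM_POSES = {"kitchen": (3.0, 1.0), "bedroom": (8.0, 2.5)}

def resolve_goal(command_object):
    """Turn a spoken target into an ordered list of navigation waypoints:
    first the room containing the object, then the object itself."""
    entry = SEMANTIC_MAP[command_object]
    return [ROOM_POSES[entry["room"]], entry["pose"]]

# "take me to the fridge" -> drive to the kitchen, then to the fridge.
print(resolve_goal("fridge"))  # [(3.0, 1.0), (4.2, 1.5)]

In a ROS deployment these waypoints would typically be dispatched in sequence as navigation goals, but that wiring is omitted here.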
