Hand-Eye-Object Tracking for Human Intention Inference

Samuel Adebayo*, Seán McLoone, Joost C. Dessing

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)
21 Downloads (Pure)

Abstract

Optimising human-robot interaction in the performance of collaborative tasks is a challenging problem. An important aspect of this problem is the robot's understanding of human intentions. Empowering robots with accurate intention inference capabilities sets the stage for more natural, safe and efficient interactions, and for greater confidence in the human-robot interaction domain. Intentions can be deduced by observing human cues during interaction with the environment, but there is currently no clear-cut method for doing so effectively. Here, we present a novel method for intention inference based on the integration of three visual cues, namely hand movement, eye fixation, and object interaction, coupled with a bidirectional LSTM neural network for classifying human intention. Experimental studies evaluating our approach against two-cue alternatives confirm its utility.
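The abstract describes classifying intention from a time series of three fused visual cues using a bidirectional LSTM. A minimal sketch of that architecture in PyTorch is shown below; all dimensions, feature layouts, and names are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn


class HandEyeObjectBiLSTM(nn.Module):
    """Hypothetical sketch of a bidirectional LSTM intention classifier.

    Assumes the three cues are concatenated into one per-frame feature
    vector, e.g. hand position (3), eye fixation point (2), and an
    object-interaction encoding (3), giving feat_dim = 8. These choices
    are assumptions for illustration only.
    """

    def __init__(self, feat_dim=8, hidden=64, n_intentions=4):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True,
                            bidirectional=True)
        # Bidirectional output concatenates forward and backward states.
        self.head = nn.Linear(2 * hidden, n_intentions)

    def forward(self, x):             # x: (batch, time, feat_dim)
        out, _ = self.lstm(x)         # out: (batch, time, 2 * hidden)
        return self.head(out[:, -1])  # classify from the final time step


model = HandEyeObjectBiLSTM()
seq = torch.randn(2, 30, 8)  # two 30-frame cue sequences
logits = model(seq)          # shape: (2, 4), one score per intention class
```

Classifying from the final time step is one common choice; pooling over time or per-frame prediction would be equally plausible variants.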
Original language: English
Pages (from-to): 174-179
Number of pages: 6
Journal: IFAC-PapersOnLine
Volume: 55
Issue number: 12
DOIs
Publication status: Published - 10 Aug 2022
Event: 6th IFAC International Conference on Intelligent Control and Automation Sciences - Cluj-Napoca, Romania
Duration: 13 Jul 2022 - 15 Jul 2022
Conference number: 6
https://icons2022.utcluj.ro/
