Egocentric activity monitoring and recovery

Ardhendu Behera, David Hogg, Anthony Cohn

Research output: Contribution to conference › Paper › peer-review

29 Citations (Scopus)


This paper presents a novel approach for real-time egocentric activity recognition in which component atomic events are characterised in terms of binary relationships between parts of the body and manipulated objects. The key contribution is to summarise, within a histogram, the relationships that hold over a fixed time interval. This histogram is then classified into one of a number of atomic events. The relationships encode both the types of body parts and objects involved (e.g. wrist, hammer) together with a quantised representation of their distance apart and the normalised rate of change in this distance. The quantisation and classifier are both configured in a prior learning phase from training data. An activity is represented by a Markov model over atomic events. We show the application of the method in the prediction of the next atomic event within a manual procedure (e.g. assembling a simple device) and the detection of deviations from an expected procedure. This could be used for example in training operators in the use or servicing of a piece of equipment, or the assembly of a device from components. We evaluate our approach ('Bag-of-Relations') on two datasets: 'labelling and packaging bottles' and 'hammering nails and driving screws', and show superior performance to existing Bag-of-Features methods that work with histograms derived from image features. Finally, we show that the combination of data from vision and inertial (IMU) sensors outperforms either modality alone.
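The descriptor the abstract describes can be sketched in a few lines: for each (body part, object) pair, quantise the distance between the tracked positions and the normalised rate of change of that distance, and count the resulting relation tuples over a fixed time window. The following is an illustrative sketch only, not the authors' implementation; all names, bin counts, and the normalisation choice are assumptions.

```python
from collections import Counter
from itertools import product

import numpy as np

# Hypothetical entity labels; the paper tracks body parts (e.g. wrist)
# and manipulated objects (e.g. hammer) in egocentric video.
BODY_PARTS = ["left_wrist", "right_wrist"]
OBJECTS = ["hammer", "nail"]

def quantise(value, edges):
    """Map a continuous value to a bin index given bin edges
    (in the paper, bin edges are learned in a prior training phase)."""
    return int(np.digitize(value, edges))

def bag_of_relations(tracks, dist_edges, rate_edges):
    """Summarise pairwise body-part/object relations over a time window
    as a histogram -- a sketch of the 'Bag-of-Relations' idea.

    tracks: dict mapping entity name -> (T, 2) array of image positions.
    Returns a Counter keyed by (part, object, distance bin, rate bin).
    """
    hist = Counter()
    for part, obj in product(BODY_PARTS, OBJECTS):
        p = np.asarray(tracks[part], dtype=float)
        q = np.asarray(tracks[obj], dtype=float)
        dist = np.linalg.norm(p - q, axis=1)            # distance per frame
        rate = np.gradient(dist) / (dist.max() + 1e-9)  # normalised rate (assumed)
        for d, r in zip(dist, rate):
            # one relation = types involved + quantised distance and rate
            hist[(part, obj, quantise(d, dist_edges), quantise(r, rate_edges))] += 1
    return hist
```

The resulting histogram would then be fed to a classifier to label the window as one atomic event, and a Markov model over the sequence of atomic events supports prediction of the next event and detection of deviations from the expected procedure.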
Original language: English
Publication status: Published - 2013
Event: 11th Asian Conference on Computer Vision - Daejeon, Korea, Republic of
Duration: 5 Nov 2012 - 9 Nov 2012


Conference: 11th Asian Conference on Computer Vision
Country/Territory: Korea, Republic of


