TY - JOUR
T1 - A novel fusion framework for robust human tracking by a service robot
AU - Gupta, Meenakshi
AU - Kumar, Swagat
AU - Behera, Laxmidhar
AU - Subramanyam, Venkatesh K.
PY - 2017/8
Y1 - 2017/8
N2 - This work investigates the problem of robust vision-based human tracking by a human-following robot using point-based features such as SURF. The problem is challenging owing to failures arising from variation in illumination, changes in pose, size or scale, camera motion, and partial or full occlusion. While point-based features provide robust detection against photometric and geometric distortions, tracking these features over subsequent frames becomes difficult, as the number of matching points between a pair of images drops quickly with even slight changes in the target's attributes owing to the above-mentioned variations. The problem of robust human tracking by the robot is solved by proposing a multi-tracker fusion framework that combines multiple trackers to ensure long-term tracking of the target. This fusion framework also creates a dynamic template pool of target features that is updated over time. The interaction between the first two trackers is used to update the template pool of the target's attributes, while the last tracker is used to estimate the location of the target in case of full occlusion. The working of the framework is demonstrated by combining a SURF-based mean-shift tracker, an optical-flow tracker, and a Kalman filter to provide robust tracking over a long time. The efficacy of the resulting tracker is demonstrated through rigorous testing on a variety of video datasets.
AB - This work investigates the problem of robust vision-based human tracking by a human-following robot using point-based features such as SURF. The problem is challenging owing to failures arising from variation in illumination, changes in pose, size or scale, camera motion, and partial or full occlusion. While point-based features provide robust detection against photometric and geometric distortions, tracking these features over subsequent frames becomes difficult, as the number of matching points between a pair of images drops quickly with even slight changes in the target's attributes owing to the above-mentioned variations. The problem of robust human tracking by the robot is solved by proposing a multi-tracker fusion framework that combines multiple trackers to ensure long-term tracking of the target. This fusion framework also creates a dynamic template pool of target features that is updated over time. The interaction between the first two trackers is used to update the template pool of the target's attributes, while the last tracker is used to estimate the location of the target in case of full occlusion. The working of the framework is demonstrated by combining a SURF-based mean-shift tracker, an optical-flow tracker, and a Kalman filter to provide robust tracking over a long time. The efficacy of the resulting tracker is demonstrated through rigorous testing on a variety of video datasets.
KW - Online template generation
KW - Optical flow
KW - Region growing algorithm
KW - SURF-based mean-shift algorithm
KW - Template-update
KW - Visual tracking
UR - http://www.mendeley.com/research/novel-fusion-framework-robust-human-tracking-service-robot
UR - http://www.scopus.com/inward/record.url?scp=85020264135&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85020264135&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2017.05.001
DO - 10.1016/j.robot.2017.05.001
M3 - Article (journal)
SN - 0921-8890
VL - 94
SP - 134
EP - 147
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
ER -