Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/127313
Title: Bayesian Human Motion Intentionality Prediction in urban environments
Authors: Ferrer, Gonzalo; Sanfeliu, Alberto
Keywords: Crowd analysis; Pattern recognition; Human motion prediction
Issue Date: 2014
Publisher: Elsevier
Citation: Pattern Recognition Letters 44: 134-140 (2014)
Abstract: Human motion prediction in indoor and outdoor scenarios is a key issue for human-robot interaction and, more generally, for intelligent robot navigation. In the present work, we propose a new human motion intentionality indicator, called the Bayesian Human Motion Intentionality Prediction (BHMIP), which is a geometry-based long-term predictor. Two variants of the Bayesian approach are proposed: the Sliding Window BHMIP and the Time Decay BHMIP. The main advantages of the proposed methods are a simple and easily scalable formulation, portability to unknown environments with little learning effort, low computational complexity, and performance that exceeds other state-of-the-art approaches. The system only requires training to obtain the set of destinations that configure a scene, i.e., the salient positions people normally walk towards. The BHMIP is compared with other well-known methods for long-term prediction using the Edinburgh Informatics Forum pedestrian database and the Freiburg People Tracker database. (An illustrative sketch of the destination-posterior idea appears after the record fields below.)
Publisher version (URL): http://dx.doi.org/10.1016/j.patrec.2013.08.013
URI: http://hdl.handle.net/10261/127313
DOI: 10.1016/j.patrec.2013.08.013
Identifiers: ISSN 0167-8655
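
The abstract describes the BHMIP as a geometry-based Bayesian predictor that maintains a posterior over a learned set of scene destinations, with a sliding-window and a time-decay variant. The Python sketch below illustrates that general idea under loose assumptions only: the function names (heading_likelihood, destination_posterior), the von Mises-style angular weighting, and the parameters kappa, window and decay are hypothetical choices made for illustration, not the authors' formulation from the paper.

```python
import math

def heading_likelihood(pos, prev_pos, dest, kappa=4.0):
    """Score how well the step prev_pos -> pos points toward dest.

    Uses a von Mises-style weight on the angle between the observed step
    direction and the direction to the destination. kappa is an illustrative
    concentration parameter, not a value taken from the paper.
    """
    step_angle = math.atan2(pos[1] - prev_pos[1], pos[0] - prev_pos[0])
    dest_angle = math.atan2(dest[1] - prev_pos[1], dest[0] - prev_pos[0])
    return math.exp(kappa * math.cos(step_angle - dest_angle))

def destination_posterior(trajectory, destinations, mode="sliding",
                          window=8, decay=0.9):
    """Posterior over destinations given an observed 2-D trajectory.

    mode="sliding" uses only the last `window` steps (sliding-window idea);
    mode="decay" down-weights older steps exponentially (time-decay idea).
    Both are illustrative stand-ins for the paper's two BHMIP variants.
    """
    steps = list(zip(trajectory[:-1], trajectory[1:]))
    if mode == "sliding":
        steps = steps[-window:]
        weights = [1.0] * len(steps)
    else:
        weights = [decay ** (len(steps) - 1 - i) for i in range(len(steps))]

    # Uniform prior over destinations; accumulate weighted log-likelihoods.
    log_post = {d: 0.0 for d in destinations}
    for (prev_pos, pos), w in zip(steps, weights):
        for d in destinations:
            log_post[d] += w * math.log(heading_likelihood(pos, prev_pos, d))

    # Normalise to a proper probability distribution.
    m = max(log_post.values())
    unnorm = {d: math.exp(v - m) for d, v in log_post.items()}
    z = sum(unnorm.values())
    return {d: v / z for d, v in unnorm.items()}

if __name__ == "__main__":
    dests = [(10.0, 0.0), (0.0, 10.0)]
    track = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.1), (3.0, 0.2), (4.0, 0.2)]
    print(destination_posterior(track, dests, mode="decay"))
```

In this toy example the trajectory heads roughly along the x-axis, so the posterior concentrates on the destination at (10, 0); swapping mode between "sliding" and "decay" changes only how past steps are weighted, mirroring the two variants named in the abstract.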
Appears in Collections: (IRII) Artículos
Files in This Item:
File: Bayesian Human Motion.null  Size: 613,47 kB  Format: Unknown

WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.