Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/30404
DC Field | Value | Language
dc.contributor.author | Moreno-Noguer, Francesc | -
dc.contributor.author | Sanfeliu, Alberto | -
dc.contributor.author | Samaras, Dimitris | -
dc.date.accessioned | 2010-12-17T08:02:22Z | -
dc.date.available | 2010-12-17T08:02:22Z | -
dc.date.issued | 2006 | -
dc.identifier.citation | IEEE International Conference on Robotics and Automation: 4081-4087 (2006) | -
dc.identifier.isbn | 0780395050 | -
dc.identifier.uri | http://hdl.handle.net/10261/30404 | -
dc.description | Presented at ICRA 2006, held in Orlando (USA). | -
dc.description.abstract | Robotics applications based on computer vision algorithms are highly constrained to indoor environments where conditions may be controlled. The development of robust visual algorithms is necessary for improving the capabilities of many autonomous systems in outdoor and dynamic environments. In particular, this paper proposes a tracking algorithm robust to several artifacts that may be found in real-world applications, such as lighting changes, cluttered backgrounds and unexpected target movements. In order to deal with these difficulties, the proposed tracking methodology integrates several Bayesian filters. Each filter estimates the state of a particular object feature which is conditionally dependent on another feature estimated by a distinct filter. This dependence provides improved representations of the target, allowing it to be segmented out from the background of the image. We describe the updating procedure of the Bayesian filters by a ‘hypotheses generation and correction’ scheme. The main difference with respect to previous approaches is that the dependence between filters is considered during the feature observation, i.e., in the ‘hypotheses correction’ stage, instead of when generating the hypotheses. This proves to be much more effective in terms of accuracy and reliability. | -
dc.description.sponsorship | This work was supported by the project 'Integration of robust perception, learning, and navigation systems in mobile robotics' (J-0929). | -
dc.description.sponsorship | This work was supported by CICYT project DPI2004-05414 from the Spanish Ministry of Science and Technology, and by grants from the U.S. Department of Justice (2004-DD-BX-1224), the Department of Energy (MO-068) and the National Science Foundation (ACI-0313184). | -
dc.language.iso | eng | -
dc.publisher | Institute of Electrical and Electronics Engineers | -
dc.rights | openAccess | -
dc.subject | Bayesian methods | -
dc.subject | Object detection | -
dc.subject | Computer vision | -
dc.title | Integration of dependent Bayesian filters for robust tracking | -
dc.type | conference paper | -
dc.identifier.doi | 10.1109/ROBOT.2006.1642329 | -
dc.description.peerreviewed | Peer Reviewed | -
dc.relation.publisherversion | http://dx.doi.org/10.1109/ROBOT.2006.1642329 | -
dc.contributor.funder | Ministerio de Ciencia y Tecnología (España) | -
dc.identifier.funder | http://dx.doi.org/10.13039/501100006280 | es_ES
dc.type.coar | http://purl.org/coar/resource_type/c_5794 | es_ES
item.openairetype | conference paper | -
item.grantfulltext | open | -
item.cerifentitytype | Publications | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
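
The abstract above outlines the core idea of the paper: several Bayesian filters, each estimating one object feature, whose conditional dependence is introduced in the ‘hypotheses correction’ (observation) stage rather than when hypotheses are generated. The sketch below is only a minimal illustration of that scheme, assuming generic bootstrap particle filters and invented 1-D feature and observation models; none of the class, function or variable names come from the paper.

# Illustrative sketch (not the authors' code): two particle filters whose
# dependence is applied in the "hypotheses correction" stage, as described
# in the abstract. All models and names here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

class ParticleFilter:
    """Minimal bootstrap particle filter for a 1-D feature."""
    def __init__(self, n_particles, init_mean, init_std):
        self.particles = rng.normal(init_mean, init_std, n_particles)
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def generate_hypotheses(self, motion_std):
        # Hypotheses generation: propagate particles with a random-walk model.
        self.particles += rng.normal(0.0, motion_std, self.particles.shape)

    def correct_hypotheses(self, log_likelihood):
        # Hypotheses correction: reweight particles by the observation model.
        logw = log_likelihood(self.particles)
        w = np.exp(logw - logw.max())
        self.weights = w / w.sum()
        # Systematic resampling to avoid weight degeneracy.
        idx = np.searchsorted(np.cumsum(self.weights),
                              (rng.random() + np.arange(len(w))) / len(w))
        idx = np.minimum(idx, len(w) - 1)  # guard against rounding at the edge
        self.particles = self.particles[idx]
        self.weights.fill(1.0 / len(w))

    def estimate(self):
        return float(np.average(self.particles, weights=self.weights))

# Two dependent filters: here one tracks an appearance-like feature and the
# other a position-like feature whose observation model is conditioned on
# the first filter's current estimate.
f_appearance = ParticleFilter(200, init_mean=0.0, init_std=1.0)
f_position = ParticleFilter(200, init_mean=0.0, init_std=1.0)

def run_step(z_appearance, z_position):
    # Generation stage: each filter propagates its own hypotheses independently.
    f_appearance.generate_hypotheses(motion_std=0.1)
    f_position.generate_hypotheses(motion_std=0.3)

    # Correction of the first filter: plain Gaussian log-likelihood.
    f_appearance.correct_hypotheses(
        lambda x: -0.5 * ((x - z_appearance) / 0.2) ** 2)

    # Correction of the second filter: its observation noise depends on the
    # appearance estimate, so the dependence between filters enters here,
    # in the correction stage, not in the generation stage.
    obs_std = 0.2 + abs(f_appearance.estimate())
    f_position.correct_hypotheses(
        lambda x: -0.5 * ((x - z_position) / obs_std) ** 2)
    return f_appearance.estimate(), f_position.estimate()

if __name__ == "__main__":
    for t in range(5):
        print(run_step(z_appearance=0.1 * t, z_position=np.sin(0.5 * t)))

In this toy setup the coupling between filters appears only when the second filter weights its hypotheses, which schematically mirrors the correction-stage dependence the abstract contrasts with earlier generation-stage approaches.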
Appears in collections: (IRII) Books and parts of books
Files in this item:
File | Description | Size | Format
Integration of dependent Bayesian.pdf | | 1.01 MB | Adobe PDF

NOTE: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.