Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/195680


Title: Event-driven stereo visual tracking algorithm to solve object occlusion

Authors: Camuñas-Mesa, L.; Serrano-Gotarredona, Teresa; Ieng, Sio-Hoi; Benosman, Ryad; Linares-Barranco, Bernabé
Keywords: stereo vision; address event representation (AER); neuromorphic vision; object occlusion; object tracking; event-driven processing
Issue Date: Sep-2018
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Neural Networks and Learning Systems 29(9): 4223-4237 (2018)
Abstract: Object tracking is a major problem for many computer vision applications, but it continues to be computationally expensive. The use of bio-inspired neuromorphic event-driven dynamic vision sensors (DVSs) has heralded new methods for vision processing, exploiting a reduced amount of data and very precise timing resolution. Previous studies have shown these neural spiking sensors to be well suited to implementing single-sensor object tracking systems, although they experience difficulties when solving ambiguities caused by object occlusion. DVSs have also performed well in 3-D reconstruction, in which event-matching techniques are applied in stereo setups. In this paper, we propose a new event-driven stereo object tracking algorithm that simultaneously integrates 3-D reconstruction and cluster tracking, introducing feedback information in both tasks to improve their respective performances. This algorithm, inspired by human vision, identifies objects and learns their position and size in order to solve ambiguities. This strategy has been validated in four different experiments where the 3-D positions of two objects were tracked in a stereo setup even when occlusion occurred. The objects studied in the experiments were: 1) two swinging pens, the distance between which during movement was measured with an error of less than 0.5%; 2) a pen and a box, to confirm the correctness of the results obtained with a more complex object; 3) two straws attached to a fan and rotating at 6 revolutions per second, to demonstrate the high-speed capabilities of this approach; and 4) two people walking in a real-world environment.
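The per-event cluster tracking mentioned in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden illustration of that general idea only, not the authors' algorithm: the names (Cluster, track), the fixed cluster radius, the learning rate alpha, and the timeout are all hypothetical. The paper's method additionally couples two such trackers, one per retina, with event-matching 3-D reconstruction and uses the learned object position and size to resolve occlusions; none of that feedback loop is shown here.

    # Minimal illustrative sketch: each DVS event (x, y, timestamp) is assigned
    # to the nearest existing cluster if it falls within that cluster's radius,
    # otherwise a new cluster is created. Centroids are updated incrementally
    # per event, so no frames are ever built.
    from dataclasses import dataclass

    @dataclass
    class Cluster:
        cx: float                 # centroid x (pixels)
        cy: float                 # centroid y (pixels)
        radius: float = 20.0      # assumed fixed cluster radius (pixels)
        n_events: int = 0
        last_t: float = 0.0       # timestamp of last assigned event (s)

        def contains(self, x, y):
            return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.radius ** 2

        def update(self, x, y, t, alpha=0.05):
            # Exponential update pulls the centroid toward each new event.
            self.cx += alpha * (x - self.cx)
            self.cy += alpha * (y - self.cy)
            self.n_events += 1
            self.last_t = t

    def track(events, timeout=0.1):
        """Assign each event (x, y, t) to a cluster; prune stale clusters."""
        clusters = []
        for x, y, t in events:
            # Drop clusters that have received no events for `timeout` seconds.
            clusters = [c for c in clusters if t - c.last_t < timeout]
            hit = next((c for c in clusters if c.contains(x, y)), None)
            if hit is None:
                hit = Cluster(cx=float(x), cy=float(y), last_t=t)
                clusters.append(hit)
            hit.update(x, y, t)
        return clusters

    # Example: interleaved events from two objects near x = 40 and x = 100
    events = []
    for i in range(200):
        t = i * 1e-3
        events.append((40 + (i % 5), 60, t))   # object 1
        events.append((100, 60 + (i % 5), t))  # object 2
    print([(round(c.cx), round(c.cy)) for c in track(events)])  # two clusters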
Publisher version (URL): https://doi.org/10.1109/TNNLS.2017.2759326
URI: http://hdl.handle.net/10261/195680
DOI: 10.1109/TNNLS.2017.2759326
ISSN: 2162-237X
E-ISSN: 2162-2388
Appears in Collections: (IMSE-CNM) Artículos
Files in This Item:
File: accesoRestringido.pdf | Size: 15,38 kB | Format: Adobe PDF