Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/96393
FINDDD: A fast 3D descriptor to characterize textiles for robot manipulation

Authors: Ramisa, Arnau; Alenyà, Guillem; Moreno-Noguer, Francesc; Torras, Carme
Issue Date: 2013
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE/RSJ International Conference on Intelligent Robots and Systems: 824-830 (2013)
Abstract: Most current depth sensors provide 2.5D range images in which depth values are assigned to a rectangular 2D array. In this paper we take advantage of this structured information to build an efficient shape descriptor that is about two orders of magnitude faster than competing approaches, while showing similar performance in several tasks involving deformable object recognition. Given a 2D patch surrounding a point and its associated depth values, we build the descriptor for that point from the cumulative distances between the surface normals in the patch and a discrete set of normal directions. Integral images make this processing very efficient, allowing descriptors for every pixel of a range image to be computed in a few seconds. The discriminative power of our descriptor, dubbed FINDDD, is evaluated in three different scenarios: recognition of specific cloth wrinkles, instance recognition from geometry alone, and detection of reliable and informed grasping points.
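The integral-image idea summarized in the abstract can be sketched in a few lines: for each of K discrete normal directions, build a per-pixel weight map scoring how close each surface normal is to that direction, precompute one integral image per map, and the descriptor for any rectangular patch then reduces to K constant-time summed-area lookups. This is a hedged simplification, not the authors' implementation: the function names, the dot-product bin weighting, and the rectangular patch shape are assumptions for illustration.

```python
import numpy as np

def integral_image(w):
    # Summed-area table, zero-padded so patch sums need no edge cases.
    ii = np.zeros((w.shape[0] + 1, w.shape[1] + 1))
    ii[1:, 1:] = w.cumsum(axis=0).cumsum(axis=1)
    return ii

def finddd_like_descriptor(normals, centers, y0, x0, h, w):
    # normals: (H, W, 3) unit surface normals from the 2.5D range image.
    # centers: (K, 3) discrete set of unit normal directions (bin centers).
    # Returns an L2-normalized K-bin histogram for patch [y0:y0+h, x0:x0+w].
    H, W, _ = normals.shape
    # Per-pixel, per-bin weight: dot-product similarity, clipped at zero
    # (a hypothetical soft-binning choice, not taken from the paper).
    weights = np.clip(normals.reshape(-1, 3) @ centers.T, 0.0, None)
    weights = weights.reshape(H, W, -1)
    tables = [integral_image(weights[:, :, k]) for k in range(centers.shape[0])]
    # One summed-area lookup per bin: four table reads, independent of patch size.
    desc = np.array([t[y0 + h, x0 + w] - t[y0, x0 + w]
                     - t[y0 + h, x0] + t[y0, x0] for t in tables])
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc
```

Because the integral images are shared across all patches, computing descriptors densely (one per pixel, as the abstract mentions) costs only the K table builds plus four reads per bin per pixel, which is where the speed-up over per-patch recomputation comes from.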
Description: Work presented at IROS, held in Tokyo, November 3-7, 2013.
Publisher version (URL): http://dx.doi.org/10.1109/IROS.2013.6696446
Identifiers: doi: 10.1109/IROS.2013.6696446
issn: 2153-0858
e-issn: 2153-0866
Appears in Collections: (IRII) Artículos
Files in This Item:
File: FINDDD.pdf | Size: 2,03 MB | Format: Adobe PDF

WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.