Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/167220

Robot-aided cloth classification using depth information and CNNs

Authors: Gabás Nova, Antoni; Corona Puyane, Enric; Alenyà, Guillem; Torras, Carme
Keywords: Garment classification; Depth images; Deep learning
Issue Date: 2016
Publisher: Springer Nature
Citation: International Conference on Articulated Motion and Deformable Objects: 16-23 (2016)
Series: Lecture Notes in Computer Science 9756
Abstract: We present a system that classifies garments taken from a pile of clothes. A robot arm extracts a garment and shows it to a depth camera. Using only depth images of a partial view of the garment as input, a deep convolutional neural network is trained to distinguish different types of garments. The robot can rotate the garment about the vertical axis to provide additional views, increasing prediction confidence and resolving confusions. Besides achieving very high classification scores, our system is fast and robust to occlusions, in contrast to previous cloth-classification approaches that match the sensed data against a database.
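The multi-view strategy described in the abstract (rotate the garment, accumulate per-view CNN predictions, stop once confident) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the logit values, the 0.8 threshold, and the simple probability-averaging rule are all assumptions for the example.

```python
import math

def softmax(logits):
    """Convert one view's raw CNN scores into class probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify_multiview(view_logits, confidence=0.8):
    """Average class probabilities over successive rotated views.

    Returns (predicted_class, confidence), stopping early as soon as
    the averaged top-class probability reaches the threshold.
    """
    acc = None
    for n, logits in enumerate(view_logits, start=1):
        p = softmax(logits)
        acc = p if acc is None else [a + b for a, b in zip(acc, p)]
        avg = [a / n for a in acc]
        best = max(range(len(avg)), key=avg.__getitem__)
        if avg[best] >= confidence:
            break  # confident enough: no need to rotate further
    return best, avg[best]

# Hypothetical logits for 4 garment classes over three rotated views
views = [
    [2.0, 0.5, 0.1, -1.0],
    [2.5, 0.3, 0.0, -0.5],
    [3.0, 0.2, -0.2, -1.0],
]
label, conf = classify_multiview(views)
```

Averaging probabilities is only one way to fuse views; the paper's precise fusion and stopping rule may differ, but the sketch shows why extra rotations help: an ambiguous single view is smoothed out by more confident ones.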
Description: Paper presented at the 9th International Conference on Articulated Motion and Deformable Objects (AMDO), held in Palma de Mallorca (Spain), July 13-15, 2016.
Publisher version (URL): https://doi.org/10.1007/978-3-319-41778-3_2
Identifiers: doi: 10.1007/978-3-319-41778-3_2
isbn: 978-3-319-41778-3
Appears in Collections:(IRII) Libros y partes de libros
Files in This Item:
robot-aided.pdf (4,72 MB)


WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.