Please use this identifier to cite or link to this item:

Multi-modal joint embedding for fashion product retrieval

Authors: Rubio Romano, Antonio (CSIC); Yu, LongLong; Simo-Serra, Edgar (CSIC); Moreno-Noguer, Francesc (CSIC)
Keywords: Neural networks; Multi-modal embedding
Issue Date: 2017
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE International Conference on Image Processing: 400-404 (2017)
Abstract: Finding a product in the fashion world can be a daunting task. Every day, e-commerce sites are updated with thousands of images and their associated metadata (textual information), deepening the problem, akin to finding a needle in a haystack. In this paper, we leverage both the images and textual metadata and propose a joint multi-modal embedding that maps both the text and images into a common latent space. Distances in the latent space correspond to similarity between products, allowing us to effectively perform retrieval in this latent space, which is both efficient and accurate. We train this embedding using large-scale real-world e-commerce data by both minimizing the distance between related products and using auxiliary classification networks that encourage the embedding to have semantic meaning. We compare against existing approaches and show significant improvements in retrieval tasks on a large-scale e-commerce dataset. We also provide an analysis of the different metadata.
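The retrieval setup the abstract describes, projecting image and text features into a shared latent space and ranking by similarity, can be sketched minimally as follows. This is a hypothetical illustration, not the authors' architecture: the linear projections, dimensions, and random weights are all placeholders standing in for the paper's trained networks.

```python
import numpy as np

# Minimal sketch of a joint multi-modal embedding for retrieval.
# Assumption: two modality-specific linear maps project image features
# and text features into one latent space; in the paper these would be
# learned neural networks, here they are random illustrative matrices.
rng = np.random.default_rng(0)
IMG_DIM, TXT_DIM, LATENT_DIM = 128, 64, 32  # hypothetical sizes

W_img = rng.standard_normal((LATENT_DIM, IMG_DIM)) / np.sqrt(IMG_DIM)
W_txt = rng.standard_normal((LATENT_DIM, TXT_DIM)) / np.sqrt(TXT_DIM)

def embed(x, W):
    """Project a feature vector into the latent space and L2-normalize,
    so that a dot product equals cosine similarity."""
    z = W @ x
    return z / np.linalg.norm(z)

def retrieve(query_txt, image_bank):
    """Rank images in the bank by similarity to a text query:
    retrieval is a nearest-neighbor search in the latent space."""
    q = embed(query_txt, W_txt)
    scores = [float(q @ embed(img, W_img)) for img in image_bank]
    return np.argsort(scores)[::-1]  # indices, most similar first

# Toy usage: five random "image features" and one "text query".
bank = [rng.standard_normal(IMG_DIM) for _ in range(5)]
ranking = retrieve(rng.standard_normal(TXT_DIM), bank)
```

In training, the projection weights would be fit so that a product's image and its metadata land close together (small latent distance), with auxiliary classifiers on the latent vectors pushing the space toward semantic structure; with random weights, as here, the ranking is arbitrary but the retrieval mechanics are the same.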
Description: Paper presented at the IEEE International Conference on Image Processing (ICIP), held in Beijing (China), September 17-20, 2017.
Identifiers: DOI: 10.1109/ICIP.2017.8296311; ISBN: 978-1-5090-2176-5
Appears in Collections: (IRII) Books and book parts

Files in This Item:
modalretrieval.pdf (1.14 MB, format unknown)


WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.