Please use this identifier to cite or link to this item:

Categorizing object-actions relations from semantic scene graphs

Authors: Aksoy, Eren Erdal; Abramov, Alexey; Wörgötter, Florentin; Dellen, Babette (CSIC)
Keywords: Object-action categorization; Pattern recognition
Issue Date: 2010
Citation: IEEE International Conference on Robotics and Automation: 398-405 (2010)
Abstract: In this work we introduce a novel approach for detecting spatiotemporal object-action relations, leading to both action recognition and object categorization. Semantic scene graphs are extracted from image sequences and used to find the characteristic main graphs of the action sequence via an exact graph-matching technique, thus providing an event table of the action scene, from which object-action relations can be extracted. The method is applied to several artificial and real action scenes containing limited context. The central novelty of this approach is that it is model free and requires no a priori representations of either objects or actions. Essentially, actions are recognized without requiring prior object knowledge, and objects are categorized solely based on the role they exhibit within an action sequence. The approach is thus grounded in the affordance principle, which has recently attracted much attention in robotics, and it provides a way forward for trial-and-error learning of object-action relations through repeated experimentation. It may therefore be useful for recognition and categorization tasks, for example in imitation learning in developmental and cognitive robotics.
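The pipeline sketched in the abstract — compress a sequence of scene graphs into its "main graphs" via exact matching, then tabulate object relations per main graph — can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the touching-relation encoding, and the toy "pick up cup" sequence are illustrative assumptions.

```python
# Hedged sketch: reduce a sequence of semantic scene graphs (here, sets of
# undirected "touching" edges between segments) to the frames where the
# relational structure changes, then build a per-object-pair event table.

def main_graphs(scene_graphs):
    """Keep only graphs that differ from the previous one (exact matching)."""
    keyframes = []
    prev = None
    for t, g in enumerate(scene_graphs):
        canon = frozenset(frozenset(edge) for edge in g)  # order-free edges
        if canon != prev:
            keyframes.append((t, canon))
            prev = canon
    return keyframes

def event_table(keyframes, objects):
    """Row per object pair: touching (1) / not touching (0) in each main graph."""
    table = {}
    pairs = [(a, b) for i, a in enumerate(objects) for b in objects[i + 1:]]
    for a, b in pairs:
        table[(a, b)] = [1 if frozenset((a, b)) in g else 0 for _, g in keyframes]
    return table

# Toy sequence (assumed): hand approaches a cup on a table and lifts it off.
frames = [
    [("cup", "table")],                   # cup rests on table
    [("cup", "table")],                   # unchanged frame, will be dropped
    [("cup", "table"), ("hand", "cup")],  # hand touches cup
    [("hand", "cup")],                    # cup lifted off table
]
kf = main_graphs(frames)                  # 3 main graphs survive
et = event_table(kf, ["hand", "cup", "table"])
```

In this encoding, the row pattern of each object pair across main graphs (e.g. `cup-table: 1,1,0`) is what characterizes the action and the roles the objects play in it, independent of any object model.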
Description: Paper presented at ICRA 2010, held in Anchorage (Alaska), May 3-7, 2010.
Appears in Collections: (IRII) Books and book parts

Files in This Item:
File: Categorizing object-actions.pdf (2.06 MB, Adobe PDF)






WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.