Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/55834
Title: Depth-supported real-time video segmentation with the Kinect
Authors: Abramov, Alexey; Pauwels, Karl; Papon, Jeremie; Wörgötter, Florentin; Dellen, Babette
Keywords: Pattern recognition
Calibration
Cameras
Issue date: Jan-2012
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Workshop on Applications of Computer Vision: 457-464 (2012)
Abstract: We present a real-time technique for the spatiotemporal segmentation of color/depth movies. Images are segmented using a parallel Metropolis algorithm implemented on a GPU utilizing both color and depth information, acquired with the Microsoft Kinect. Segments represent the equilibrium states of a Potts model, where tracking of segments is achieved by warping obtained segment labels to the next frame using real-time optical flow, which reduces the number of iterations required for the Metropolis method to reach the new equilibrium state. By including depth information in the framework, true object boundaries can be found more easily, which also improves the temporal coherency of the method. The algorithm has been tested on videos of medium resolution showing human manipulations of objects. The framework provides an inexpensive front end for visual preprocessing of videos in industrial settings and robot labs which can potentially be used in various applications.
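The core idea sketched in the abstract is a Potts-model segmentation relaxed by Metropolis updates: each pixel carries a label, neighboring pixels with the same label lower the energy in proportion to their color/depth similarity, and random label proposals are accepted with the Metropolis rule. The following minimal CPU sketch illustrates that update; the energy form, parameter names (`beta`, `w_depth`), and sequential sweep are illustrative assumptions, not the paper's GPU-parallel implementation.

```python
import numpy as np

def local_energy(labels, color, depth, y, x, lab, w_depth=1.0):
    """Potts-style local energy of assigning label `lab` to pixel (y, x).

    Each 4-neighbor that carries `lab` contributes a negative term whose
    magnitude grows with color/depth similarity, so similar pixels are
    pulled into the same segment.  (Illustrative energy, not the paper's.)
    """
    h, w = labels.shape
    e = 0.0
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == lab:
            diff = abs(float(color[y, x]) - float(color[ny, nx])) \
                 + w_depth * abs(float(depth[y, x]) - float(depth[ny, nx]))
            e -= np.exp(-diff)
    return e

def metropolis_sweep(labels, color, depth, beta=10.0, rng=None):
    """One in-place Metropolis sweep: each pixel proposes to adopt a random
    neighbor's label, accepting downhill moves always and uphill moves
    with probability exp(-beta * dE)."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = labels.shape
    offs = ((-1, 0), (1, 0), (0, -1), (0, 1))
    for y in range(h):
        for x in range(w):
            dy, dx = offs[rng.integers(4)]
            ny, nx = y + dy, x + dx
            if not (0 <= ny < h and 0 <= nx < w):
                continue
            new = labels[ny, nx]
            if new == labels[y, x]:
                continue
            dE = local_energy(labels, color, depth, y, x, new) \
               - local_energy(labels, color, depth, y, x, labels[y, x])
            if dE < 0 or rng.random() < np.exp(-beta * dE):
                labels[y, x] = new
    return labels
```

In the paper's pipeline this relaxation would start not from scratch but from the previous frame's labels warped forward by optical flow, so only a few sweeps are needed per frame; the depth term is what lets the energy separate objects whose colors are similar.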
Description: Paper presented at the WACV held in Breckenridge (USA) from January 9 to 11, 2012.
Publisher's version: http://dx.doi.org/10.1109/WACV.2012.6163000
URI : http://hdl.handle.net/10261/55834
ISBN : 978-1-4673-0233-3
DOI: 10.1109/WACV.2012.6163000
Appears in Collections:(IRII) Libros y partes de libros

Files in This Item:
File: Depth-supported real.pdf (8,15 MB, Adobe PDF)

Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.