Title: Integration of shape and a multihypotheses Fisher color model for figure-ground segmentation in non-stationary environments
Authors: Moreno-Noguer, Francesc; Sanfeliu, Alberto
Keywords: Tracking
Deformable contours
Color adaptation
Particle filters
Pattern recognition: Computer vision
Pattern recognition: Object detection
Computer vision
Pattern recognition systems
Issue Date: 2004
Publisher: Institute of Electrical and Electronics Engineers
Citation: 17th International Conference on Pattern Recognition: pp. 771-774 (2004)
Abstract: In this paper, a new technique for figure-ground segmentation in image sequences of scenes with varying illumination conditions is proposed. The sets of color points of both the target and the background are modelled with Mixtures of Gaussians (MoG), whose optimal number of components is initialized automatically. Based on Linear Discriminant Analysis (LDA), a new color space that maximizes the foreground/background class separability is presented. Moreover, there is no need to assume that the viewing conditions change gradually over time, because the method works with multiple hypotheses about the next state of the color distribution (some considering small changes and others more abrupt variations). The hypothesis that generates the best object segmentation and the shape information from the previous iteration are fused to accurately detect the object boundary, in a stage called 'sample concentration', introduced as a final step to the classical CONDENSATION algorithm.
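The core idea behind the Fisher color model in the abstract can be illustrated with the classical two-class Fisher discriminant: project pixel colors onto the direction that maximizes between-class separation relative to within-class scatter. The sketch below uses synthetic stand-in pixel data (the cluster means, scales, and the single-Gaussian-per-class simplification are assumptions for illustration, not the paper's MoG-based method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: RGB pixel samples for a reddish foreground
# object and a bluish background (the paper instead models each class
# with a Mixture of Gaussians).
fg = rng.normal(loc=[200.0, 80.0, 60.0], scale=15.0, size=(500, 3))
bg = rng.normal(loc=[60.0, 120.0, 180.0], scale=20.0, size=(500, 3))

# Two-class Fisher discriminant direction: w = Sw^{-1} (m_fg - m_bg),
# where Sw is the pooled within-class scatter.
m_fg, m_bg = fg.mean(axis=0), bg.mean(axis=0)
Sw = np.cov(fg, rowvar=False) + np.cov(bg, rowvar=False)
w = np.linalg.solve(Sw, m_fg - m_bg)

# Project all pixels onto the 1D Fisher color axis and threshold at the
# midpoint between the two projected class means.
fg_scores, bg_scores = fg @ w, bg @ w
thr = 0.5 * (fg_scores.mean() + bg_scores.mean())
accuracy = ((fg_scores > thr).mean() + (bg_scores < thr).mean()) / 2
print(f"separation accuracy on training pixels: {accuracy:.2f}")
```

On well-separated color clusters like these, the 1D projection classifies nearly all pixels correctly; the paper extends this idea to a discriminant color space for MoG-modelled classes that is re-estimated as illumination changes.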
Description: International Conference on Pattern Recognition (ICPR), 2004, Cambridge (United Kingdom)
URI: http://hdl.handle.net/10261/30402
ISBN: 0769521282
DOI: http://dx.doi.org/10.1109/ICPR.2004.1333886
Appears in Collections:(IRII) Comunicaciones congresos
Files in This Item:
doc1.pdf (504,7 kB, Adobe PDF)

WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.