Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/30547
Title: Neural learning methods yielding functional invariance

Authors: Ruiz de Angulo, Vicente; Torras, Carme
Keywords: Neural learning; Regularization; Functional invariance; Input noise addition; Weight-decay
Issue Date: 2004
Publisher: Elsevier
Citation: Theoretical Computer Science 320(1): 111-121 (2004)
Abstract: This paper investigates the functional invariance of neural network learning methods incorporating a complexity reduction mechanism, such as a regularizer. By functional invariance we mean the property of producing functionally equivalent minima as the size of the network grows, when the smoothing parameters are fixed. We study three different principles on which functional invariance can be based, and try to delimit the conditions under which each of them acts. We find that, surprisingly, some of the most popular neural learning methods, such as weight-decay and input noise addition, exhibit this interesting property.
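
The two regularizers named in the abstract can be made concrete with a short sketch. The following Python example is an illustration only, not the authors' code: the network size H, learning rate, decay coefficient lam, and noise level sigma are arbitrary assumptions. It trains a one-hidden-layer network on a toy regression task using both L2 weight decay and Gaussian input noise addition.

    # Minimal sketch (illustrative, not the paper's experiments): one hidden
    # layer of tanh units trained with the two regularizers the abstract
    # names, L2 weight decay and input noise addition.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D regression data: y = sin(x) plus observation noise.
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

    # Hyperparameters (assumptions): H is the network size whose growth the
    # functional-invariance property refers to; lam and sigma are the fixed
    # "smoothing parameters" of the abstract.
    H, lr, lam, sigma = 20, 0.01, 1e-3, 0.1
    W1 = rng.standard_normal((1, H)) * 0.5
    b1 = np.zeros(H)
    W2 = rng.standard_normal((H, 1)) * 0.5
    b2 = np.zeros(1)

    for epoch in range(2000):
        # Input noise addition: perturb the inputs with fresh Gaussian
        # noise on every pass through the data.
        Xn = X + sigma * rng.standard_normal(X.shape)

        # Forward pass.
        h = np.tanh(Xn @ W1 + b1)      # hidden activations, shape (N, H)
        pred = h @ W2 + b2             # network output, shape (N, 1)
        err = pred - y

        # Backward pass for the mean squared error loss.
        N = X.shape[0]
        g_pred = 2.0 * err / N
        gW2 = h.T @ g_pred
        gb2 = g_pred.sum(axis=0)
        g_h = g_pred @ W2.T * (1.0 - h**2)
        gW1 = Xn.T @ g_h
        gb1 = g_h.sum(axis=0)

        # Weight decay: each weight is additionally shrunk toward zero
        # in proportion to lam (an L2 penalty on the weights).
        W1 -= lr * (gW1 + lam * W1)
        W2 -= lr * (gW2 + lam * W2)
        b1 -= lr * gb1
        b2 -= lr * gb2

    mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
    print("final training MSE:", mse)

In the paper's terms, functional invariance would mean that retraining with a larger H while keeping lam and sigma fixed yields a functionally equivalent minimum, i.e. a network computing essentially the same input-output map.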
Description: A preliminary version of this work was presented at the International Conference on Artificial Neural Networks (ICANN'01), Vienna, August 2001.
Publisher version (URL): http://dx.doi.org/10.1016/j.tcs.2004.03.046
URI: http://hdl.handle.net/10261/30547
DOI: 10.1016/j.tcs.2004.03.046
ISSN: 0304-3975
Appears in Collections: (IRII) Artículos




Files in This Item:
Neural learning methods.pdf (155,62 kB, Adobe PDF)


WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.