Neural learning methods yielding functional invariance
Ruiz de Angulo, Vicente
Torras, Carme
Neural learning
Regularization
Functional invariance
Input noise addition
Weight-decay
A preliminary version of this work was presented at the International Conference on Artificial Neural Networks (ICANN'01), Vienna, August 2001.
This paper investigates the functional invariance of neural network learning methods that incorporate a complexity reduction mechanism, such as a regularizer. By functional invariance we mean the property of producing functionally equivalent minima as the size of the network grows, when the smoothing parameters are held fixed. We study three different principles on which functional invariance can be based, and delimit the conditions under which each of them acts. We find that, surprisingly, some of the most popular neural learning methods, such as weight-decay and input noise addition, exhibit this interesting property.
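The two learning methods named in the abstract can be illustrated concretely. The following sketch is not the paper's method or analysis; it is a hypothetical toy example, on a linear model, of the two complexity-reduction mechanisms discussed: weight-decay (an L2 penalty on the weights, with fixed smoothing parameter `lam`) and input noise addition (perturbing each training input with Gaussian noise of scale `sigma`).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "network" y = X @ w; illustrative only -- the paper's
# results concern general neural networks, not this specific model.
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

w = np.zeros(5)
lam = 1e-2    # weight-decay coefficient (a fixed smoothing parameter)
sigma = 0.1   # standard deviation of the added input noise
lr = 0.05     # gradient-descent step size

for _ in range(500):
    # Input noise addition: perturb the inputs at every step.
    Xn = X + rng.normal(scale=sigma, size=X.shape)
    # Squared-error gradient plus the weight-decay term lam * w.
    grad = Xn.T @ (Xn @ w - y) / len(y) + lam * w
    w -= lr * grad

# Both mechanisms shrink the solution relative to the noiseless,
# unregularized optimum w_true.
print(np.linalg.norm(w) < np.linalg.norm(w_true))
```

Both mechanisms act as smoothers here: the weight-decay term pulls the weights toward zero, and the input noise effectively adds a further ridge-like penalty proportional to `sigma**2`. The names `lam`, `sigma`, and `lr` are this sketch's own; the paper fixes such smoothing parameters while the network size grows.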
2010-12-17T13:18:53Z
2010-12-17T13:18:53Z
2004
article
Theoretical Computer Science 320(1): 111-121 (2004)
0304-3975
http://hdl.handle.net/10261/30547
10.1016/j.tcs.2004.03.046
eng
Postprint
http://dx.doi.org/10.1016/j.tcs.2004.03.046
openAccess
Elsevier