Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/30519

View in other formats: MARC | Dublin Core | RDF | ORE | MODS | METS | DIDL

Title: On-line learning with minimal degradation in feedforward networks

Authors: Ruiz de Angulo, Vicente; Torras, Carme
Keywords: Learning; On-line learning; Incremental learning; Neural networks; Robots
Issue Date: 1995
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Neural Networks 6(4): 657-668 (1995)
Abstract: Dealing with non-stationary processes requires quick adaptation while at the same time avoiding catastrophic forgetting. A neural learning technique that satisfies these requirements, without sacrificing the benefits of distributed representations, is presented. It relies on a formalization of the problem as the minimization of the error over the previously learned input-output (i-o) patterns, subject to the constraint of perfect encoding of the new pattern. This constrained optimization problem is then transformed into an unconstrained one with the hidden-unit activations as variables. The new formulation naturally leads to an algorithm for solving the problem, which we call Learning with Minimal Degradation (LMD). Experimental comparisons of the performance of LMD with back-propagation are provided which, besides showing the advantages of using LMD, reveal the dependence of forgetting on the learning rate in back-propagation. We also explain why overtraining affects forgetting and fault tolerance, which are seen as related problems.
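As an illustration of the reformulation described in the abstract, the following is a minimal sketch in Python/NumPy, assuming a single-hidden-layer network y = W2 · sigmoid(W1 · x). It is not the paper's algorithm: the names (min_norm_updates, lmd_step) are invented here, the degradation measure is a crude proxy (the squared norm of the weight changes, rather than the error over the previously learned patterns), and the unconstrained problem in the hidden-unit activations is solved by random search rather than the paper's procedure. What it shares with LMD is the structural idea: treat the hidden activations h for the new input as free variables, satisfy the constraint (exact encoding of the new pattern) by construction for any h, and spend the remaining freedom minimizing degradation.

    # Minimal sketch (not the paper's notation or procedure): encode one new
    # pattern (x, y) exactly while disturbing the weights as little as possible.
    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def sigmoid_inv(h, eps=1e-6):
        h = np.clip(h, eps, 1.0 - eps)   # keep the logit finite
        return np.log(h / (1.0 - h))

    def min_norm_updates(W1, W2, x, y, h):
        # Smallest (least-squares, rank-one) weight changes that make the
        # network map x -> h -> y exactly, for a given hidden-activation
        # vector h. This closed form is ours, for illustration only.
        a_target = sigmoid_inv(h)
        dW1 = np.outer(a_target - W1 @ x, x) / (x @ x)  # hidden layer hits h
        dW2 = np.outer(y - W2 @ h, h) / (h @ h)         # output layer hits y
        return dW1, dW2

    def lmd_step(W1, W2, x, y, n_candidates=200, rng=None):
        # Unconstrained search over hidden activations: every candidate h
        # already satisfies the encoding constraint via min_norm_updates, so
        # candidates are compared only on degradation. The proxy here is
        # ||dW1||^2 + ||dW2||^2 and the search is random sampling; the paper
        # instead minimizes the error over the previously learned patterns.
        rng = rng if rng is not None else np.random.default_rng(0)
        best = None
        for _ in range(n_candidates):
            h = rng.uniform(0.05, 0.95, size=W1.shape[0])
            dW1, dW2 = min_norm_updates(W1, W2, x, y, h)
            cost = np.sum(dW1**2) + np.sum(dW2**2)
            if best is None or cost < best[0]:
                best = (cost, dW1, dW2)
        _, dW1, dW2 = best
        return W1 + dW1, W2 + dW2

    # Usage: after one step, the new pattern is reproduced exactly.
    rng = np.random.default_rng(1)
    W1, W2 = rng.normal(size=(8, 4)), rng.normal(size=(2, 8))
    x_new, y_new = rng.normal(size=4), np.array([0.3, -0.7])
    W1, W2 = lmd_step(W1, W2, x_new, y_new, rng=rng)
    print(W2 @ sigmoid(W1 @ x_new))  # ~ [0.3, -0.7]

For any candidate h, the two rank-one updates encode the new pattern exactly, so the constraint never has to be enforced explicitly; only the choice among valid h vectors is optimized, which is precisely what turns the constrained problem into an unconstrained one.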
Publisher version (URL): http://dx.doi.org/10.1109/72.377971
URI: http://hdl.handle.net/10261/30519
DOI: 10.1109/72.377971
ISSN: 1045-9227
Appears in Collections: (IRII) Artículos
Files in This Item: doc1.pdf (1.27 MB, Adobe PDF)
