Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/30194
Title: A neuro-fuzzy model for nonlinear plants identification

Authors: Baruch, Ieroham Solomon; Gortcheva, Elena A.; Thomas, Federico; Garrido-Moctezuma, Rubén Alejandro
Keywords: Automation
Issue Date: 1999
Citation: IASTED International Conference on Modelling and Simulation: 326-331 (1999)
Abstract: An improved parallel Recurrent Neural Network (RNN) model and an improved dynamic Backpropagation (BP) method for its learning are proposed. The RNN model is given as a two-layer Jordan canonical architecture for both the continuous- and discrete-time cases. The output layer is of feedforward type; the hidden layer is a recurrent one with self-feedbacks and full forward connections to the inputs. A linearisation of this RNN model is performed, and its stability, observability and controllability conditions are studied. To preserve RNN stability, sigmoid activation functions are introduced in the RNN feedback loops. The paper suggests improving the RNN realisation by using a saturation function instead of a sigmoid one. A new, improved RNN learning algorithm of dynamic BP type, containing a momentum term, is proposed. For the identification of complex nonlinear plants, a fuzzy-rule-based system and a neuro-fuzzy model are proposed. The neuro-fuzzy model is applied to the identification of a mechanical system with friction.
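To make the described architecture concrete, below is a minimal Python sketch of a discrete-time two-layer Jordan-type RNN with diagonal self-feedbacks, a saturation activation, and a BP update with a momentum term. The class name JordanRNN, the weight names B, a, C, and the hyperparameters eta and alpha are illustrative assumptions, not taken from the paper; the gradient update is shown only for the output layer.

```python
import numpy as np

def sat(x):
    # Saturation activation: linear on [-1, 1], clipped outside,
    # used here (as the paper suggests) in place of a sigmoid.
    return np.clip(x, -1.0, 1.0)

class JordanRNN:
    """Sketch of the described two-layer RNN: a recurrent hidden layer
    with self-feedbacks and full forward connections to the inputs,
    followed by a feedforward (linear) output layer."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.B = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input weights
        self.a = rng.uniform(-0.9, 0.9, size=n_hidden)          # self-feedbacks; |a_i| < 1 for stability
        self.C = rng.normal(scale=0.1, size=(n_out, n_hidden))  # output weights
        self.x = np.zeros(n_hidden)                             # hidden state x(k)
        self.dC = np.zeros_like(self.C)                         # momentum buffer

    def step(self, u):
        # x(k+1) = sat(a * x(k) + B u(k)); the saturation bounds the
        # state, which keeps the feedback loops stable.
        self.x = sat(self.a * self.x + self.B @ u)
        return self.C @ self.x                                  # y(k) = C x(k)

    def train_step(self, u, y_target, eta=0.05, alpha=0.9):
        # BP update with a momentum term, shown here only for the
        # output layer, where the gradient of 0.5*||e||^2 w.r.t. C
        # is exactly -e x^T. The paper's full dynamic BP would also
        # adapt the hidden-layer weights B and a through time.
        e = y_target - self.step(u)
        self.dC = alpha * self.dC + eta * np.outer(e, self.x)   # dC(k) = alpha*dC(k-1) - eta*grad
        self.C += self.dC
        return e
```

Feeding a recorded input/output sequence from the plant through train_step sample by sample would drive such a network toward a parallel identification model of the plant, which is the use case the abstract describes.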
Description: IASTED International Conference on Modelling and Simulation (MS), 1999, Philadelphia (USA)
URI: http://hdl.handle.net/10261/30194
Appears in Collections: (IRII) Comunicaciones congresos