Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/133728

Title: Delay-based reservoir computing: Noise effects in a combined analog and digital implementation

Authors: Soriano, Miguel C.; Ortín González, Silvia; Keuninckx, Lars; Appeltant, Lennert; Danckaert, Jan; Pesquera, Luis; Van der Sande, Guy
Issue Date: Feb-2015
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Neural Networks and Learning Systems 26(2): 388-393 (2015)
Abstract: Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as a main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution in the conversion interface between the analog and digital worlds.
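
The scheme described in the abstract (a single nonlinear node with delayed feedback, time-multiplexed into virtual nodes through an input mask, with a finite-resolution conversion interface) can be illustrated with a short sketch. The following Python code is a minimal illustration, not the authors' circuit: the tanh nonlinearity, the parameter values, and the ridge-regression readout are assumptions, and the quantize() helper stands in for the ADC/DAC whose resolution the paper varies.

    import numpy as np

    # Hypothetical sketch of delay-based reservoir computing: a single
    # nonlinear node with delayed feedback, time-multiplexed into N
    # "virtual nodes" by a random input mask, plus uniform quantization
    # emulating a finite-resolution analog/digital conversion interface.

    rng = np.random.default_rng(0)

    N = 50                                    # number of virtual nodes along the delay line
    mask = rng.choice([-1.0, 1.0], size=N)    # random binary input mask

    def quantize(x, bits):
        """Uniform quantizer on [-1, 1] emulating an ADC/DAC of the given resolution."""
        levels = 2 ** bits
        x = np.clip(x, -1.0, 1.0)
        return np.round((x + 1.0) / 2.0 * (levels - 1)) / (levels - 1) * 2.0 - 1.0

    def reservoir_states(u, bits=8, eta=0.5, gamma=0.05):
        """Drive the delay reservoir with input sequence u; return the state matrix."""
        T = len(u)
        x = np.zeros(N)                       # virtual-node states over one delay interval
        states = np.zeros((T, N))
        for t in range(T):
            j_in = gamma * u[t] * mask        # masked (time-multiplexed) input
            for i in range(N):
                # each virtual node is coupled to its own delayed state plus the input;
                # tanh is an assumed stand-in for the circuit's nonlinearity
                x[i] = np.tanh(eta * x[i] + j_in[i])
            x = quantize(x, bits)             # quantization noise at the interface
            states[t] = x
        return states

    # Example: one-step-ahead prediction of a noisy sine with a ridge readout
    u = np.sin(0.2 * np.arange(2000)) + 0.01 * rng.standard_normal(2000)
    X, y = reservoir_states(u[:-1], bits=8), u[1:]
    W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)   # ridge regression
    print("train MSE:", np.mean((X @ W - y) ** 2))

Lowering the bits argument of reservoir_states makes the quantization noise floor visible in the prediction error, mirroring the resolution sweep described in the abstract.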
Publisher version (URL): http://dx.doi.org/10.1109/TNNLS.2014.2311855
URI: http://hdl.handle.net/10261/133728
DOI: 10.1109/TNNLS.2014.2311855
Identifiers: ISSN 2162-2388
Appears in Collections: (IFCA) Artículos
(IFISC) Artículos
Files in This Item:
File: accesoRestringido.pdf | Size: 15.38 kB | Format: Adobe PDF

WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.