Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/133728


Delay-based reservoir computing: Noise effects in a combined analog and digital implementation

Authors: Soriano, Miguel C.; Ortín González, Silvia; Keuninckx, Lars; Appeltant, Lennert; Danckaert, Jan; Pesquera, Luis; Van der Sande, Guy
Publication date: Feb-2015
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Neural Networks and Learning Systems 26(2): 388-393 (2015)
Abstract: Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as a main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution in the conversion interface between the analog and digital worlds.
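The single-node, time-multiplexed scheme summarized in the abstract can be sketched in software as follows. This is a minimal illustrative simulation, not the authors' implementation: the sine nonlinearity, the random input mask, the virtual-node coupling, and all parameter names and values are assumptions, and the final rounding step only mimics a finite-resolution conversion interface of the kind whose quantization noise the paper studies.

```python
import numpy as np

def delay_reservoir(u, n_virtual=50, eta=0.8, gamma=0.5, eps=0.5, seed=0):
    """Sketch of a delay-based reservoir: a single nonlinear node whose
    delayed feedback is sampled at n_virtual points per delay interval
    ('virtual nodes'). Each input sample is time-multiplexed over the
    delay line via a random mask; eps models the node's finite response
    time, which couples neighbouring virtual nodes. All parameters are
    illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    mask = rng.uniform(-1.0, 1.0, n_virtual)   # random input mask
    x = np.zeros(n_virtual)                    # states along one delay loop
    states = np.empty((len(u), n_virtual))
    for t, s in enumerate(u):
        carry = x[-1]                          # last node of previous loop
        for i in range(n_virtual):
            fed_back = x[i]                    # value one delay earlier
            drive = np.sin(eta * fed_back + gamma * mask[i] * s)
            x[i] = (1 - eps) * carry + eps * drive
            carry = x[i]
        states[t] = x
    return states

# Linear readout trained by ridge regression for one-step-ahead
# prediction of a toy sine series (hyperparameters are illustrative).
u = np.sin(0.2 * np.arange(300))
S = delay_reservoir(u)
X, y = S[100:-1], u[101:]                      # drop a washout period
lam = 1e-6
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
nmse = np.mean((X @ w - y) ** 2) / np.var(y)

# Mimic a finite-resolution analog-to-digital interface by rounding
# the states to a fixed number of bits (quantization noise source).
bits = 8
Sq = np.round(S * (2 ** (bits - 1))) / (2 ** (bits - 1))
```

Retraining the readout on `Sq` at decreasing `bits` is one way to probe how quantization in the conversion interface degrades task performance, which is the effect the paper quantifies on its two benchmarks.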
Publisher's version: http://dx.doi.org/10.1109/TNNLS.2014.2311855
Identifiers: ISSN 2162-2388
Appears in collections: (IFCA) Artículos
(IFISC) Artículos
Files in this item:
File: accesoRestringido.pdf (15.38 kB, Adobe PDF)

NOTE: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.