Pattern classification in reservoir computing with dynamical state measures

Date
2022
Authors
Ровнік, Вероніка
Abstract
Reservoir computing is a brain-inspired paradigm for RNN training that has been successfully applied to various computational problems involving sequential data. One such problem is temporal pattern classification, where time series patterns are to be assigned to specific categories. In neuroscience, pattern recognition is known as a brain function important for a range of intelligent behaviors: hearing, speech, vision, music, and motor control. Here we present a reservoir network architecture with phase oscillator models as neurons that solves this task. The proposed computational scheme consists of data encoding, reservoir dynamics tuning, dynamical state decoding, and readout training steps. At the data encoding stage we solve the task of presenting static data to the network as a temporal signal. Next, we find suitable network parameter regimes using dynamical systems theory. At the data decoding stage we extract information about the dynamical state of the oscillator network, i.e., the frequency synchronization emerging given the input. Finally, the readout layer is trained to linearly separate the high-dimensional states of the reservoir in accordance with the pattern recognition task. We evaluate the computational performance of the reservoir architectures by solving an image classification task on the well-known MNIST digit recognition dataset. The designed reservoir computing scheme shows decent performance in the image classification task. We assume that the proposed approach generalizes to other neuron models that exhibit similar properties of intrinsic dynamics. We also presume the possibility of a direct mapping between the theoretical model of the reservoirs and their physical implementation in hardware. Due to the considerably reduced number of trainable parameters, the approach promises to be more power-efficient than recurrent neural networks trained with the conventional backpropagation-through-time algorithm.
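
The pipeline described in the abstract (encode a static image as a temporal drive, let a network of coupled phase oscillators evolve, decode the emergent frequency synchronization, and train a linear readout) can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the exact scheme from the thesis: it assumes a mean-field Kuramoto reservoir, pixel intensities encoded as natural frequencies, time-averaged oscillator frequencies as the decoded state, a scikit-learn logistic-regression readout, and the small 8x8 digits dataset standing in for MNIST to keep it fast.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N = 64                      # one oscillator per pixel of the 8x8 digits (assumed)
K = 1.5                     # global coupling strength (assumed)
dt, steps = 0.05, 400       # Euler step and integration length (assumed)

def reservoir_features(image):
    """Drive a Kuramoto-type phase reservoir with pixel intensities and
    return time-averaged effective frequencies as the reservoir state."""
    omega = 2.0 * np.pi * image            # encode pixels as natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases
    freq_sum = np.zeros(N)
    for _ in range(steps):
        # mean-field Kuramoto coupling: average sine of pairwise phase differences
        coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
        dtheta = omega + K * coupling
        theta = theta + dt * dtheta        # explicit Euler integration
        freq_sum += dtheta
    return freq_sum / steps                # average frequency per oscillator

# 8x8 digits stand in for MNIST here purely to keep the example fast.
X, y = load_digits(return_X_y=True)
X = X / X.max()                            # scale pixel intensities to [0, 1]
features = np.array([reservoir_features(x) for x in X])

X_tr, X_te, y_tr, y_te = train_test_split(features, y, random_state=0)
readout = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("readout accuracy:", readout.score(X_te, y_te))

Only the final logistic-regression readout is trained; the reservoir weights (here, the coupling term) stay fixed, which is what gives reservoir computing its reduced number of trainable parameters relative to backpropagation-through-time training.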
Keywords
Python, MNIST, temporal pattern classification, reservoir computing, RNN, master's thesis