IEEE Access (Jan 2023)
Self-Organizing Multiple Readouts for Reservoir Computing
Abstract
With advancements in deep learning (DL), artificial intelligence (AI) technology has become an indispensable tool. However, DL incurs significant computational costs, making it less viable for edge AI scenarios, so demand is growing for cost-effective alternatives to DL-based approaches. Reservoir computing (RC) has attracted interest because it offers low-cost training, holding great promise for edge AI applications. However, RC's training capability is constrained by its reliance on a single trainable linear layer: the weight connections in the remaining layers stay fixed during training. Moreover, continuous learning tasks are difficult to accomplish owing to catastrophic forgetting in the linear layer. We therefore propose integrating self-organizing multiple readouts to enhance RC's training capability. Our method distributes the training data across multiple readouts, which prevents catastrophic forgetting in the readouts and allows each readout to assimilate new data effectively, improving overall training performance. The self-organizing function, which assigns similar data to the same readout, optimizes the memory utilization of the multiple readouts. Experimental results show that an RC equipped with the proposed multiple readouts solved a continuous learning task by mitigating catastrophic forgetting through this distribution of data across the readouts. The RC also achieved higher accuracy on a sound recognition task than the existing RC paradigm, thanks to ensemble learning across the multiple readouts. Multiple readouts thus effectively enhance the training capability of RC and can contribute to the realization of practical RC applications.
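The idea described above can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's implementation: the class name `MultiReadoutESN`, the leaky echo state reservoir, the k-means-style clustering of reservoir states (standing in for the paper's self-organizing function), and the per-cluster ridge-regression readouts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

class MultiReadoutESN:
    """Sketch: a fixed leaky echo state reservoir with several linear
    readouts; a self-organizing (k-means-style) rule routes each input
    to one readout, so similar data share a readout. Hypothetical names
    and hyperparameters, not the paper's exact architecture."""

    def __init__(self, n_in, n_res, n_out, n_readouts=3, leak=0.3, ridge=1e-3):
        self.Win = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9
        self.W = W                     # reservoir weights stay fixed (untrained)
        self.leak, self.ridge = leak, ridge
        self.n_readouts, self.n_out = n_readouts, n_out
        self.centroids = None          # one centroid per readout
        self.Wout = [None] * n_readouts

    def _state(self, X):
        """Run the fixed reservoir over a sequence; return the final state."""
        x = np.zeros(self.W.shape[0])
        for u in X:
            x = (1 - self.leak) * x + self.leak * np.tanh(self.Win @ u + self.W @ x)
        return x

    def fit(self, sequences, labels):
        states = np.array([self._state(s) for s in sequences])
        # Self-organizing assignment: cluster reservoir states so that
        # similar inputs are routed to (and train) the same readout.
        init = rng.choice(len(states), self.n_readouts, replace=False)
        self.centroids = states[init].copy()
        for _ in range(10):
            assign = np.argmin(((states[:, None] - self.centroids[None]) ** 2).sum(-1), axis=1)
            for k in range(self.n_readouts):
                if np.any(assign == k):
                    self.centroids[k] = states[assign == k].mean(axis=0)
        assign = np.argmin(((states[:, None] - self.centroids[None]) ** 2).sum(-1), axis=1)
        # Train each readout only on its own cluster (ridge regression,
        # one-hot targets) -- new data affects only one readout.
        Y = np.eye(self.n_out)[np.asarray(labels)]
        for k in range(self.n_readouts):
            Sk, Yk = states[assign == k], Y[assign == k]
            if len(Sk) == 0:
                continue
            A = Sk.T @ Sk + self.ridge * np.eye(Sk.shape[1])
            self.Wout[k] = np.linalg.solve(A, Sk.T @ Yk)
        return self

    def predict(self, sequences):
        trained = [i for i, w in enumerate(self.Wout) if w is not None]
        preds = []
        for s in sequences:
            x = self._state(s)
            d = ((x - self.centroids) ** 2).sum(-1)
            k = min(trained, key=lambda i: d[i])  # nearest trained readout
            preds.append(int(np.argmax(x @ self.Wout[k])))
        return preds
```

Routing each input to a single readout is what localizes updates: training on new data rewrites only the matched readout's weights, leaving the others (and what they memorized) untouched, which is the mechanism the abstract credits for mitigating catastrophic forgetting.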
Keywords