Abstract

This paper explores the benefit of added noise in increasing the computational complexity of digital recurrent neural networks (RNNs). The physically accepted model of the universe imposes rational-number, stochastic limits on all calculations. An analog RNN subject to those limits computes at the super-Turing complexity level BPP/log∗. In this paper, we demonstrate how noise helps digital RNNs attain super-Turing operation similar to that of analog RNNs. We investigate how limited-precision systems move from non-chaotic behavior at small amounts of noise, through consistency with chaos, to being overwhelmed by noise at large amounts. A Kolmogorov-complexity-based proof shows that an infinite hierarchy of computational classes exists between P, the Turing class, and BPP/log∗. This hierarchy raises the possibility that noise-enhanced digital RNNs could operate at a super-Turing level less complex than BPP/log∗. As the uniform noise increases, the digital RNNs develop positive Lyapunov exponents, indicating that chaos is mimicked. The exponents peak at the accepted values for the logistic and Hénon maps when the noise amplitude equals eight times the least significant bit of the noisy recurrent signals for the logistic digital RNN and four times the least significant bit for the Hénon digital RNN.
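The noise scale reported above (a multiple of the least significant bit of the recurrent signal) is straightforward to reproduce in simulation. The Python sketch below is not the authors' code; it is a minimal illustration, assuming a b-bit uniformly quantized state and the standard derivative-based Lyapunov estimate, of how the exponent of a limited-precision logistic map can be measured as uniform noise grows. All function and parameter names here are hypothetical. For the ideal logistic map at r = 4 the exponent is ln 2 ≈ 0.693, the "accepted value" the abstract refers to.

```python
import numpy as np

def noisy_quantized_logistic_lyapunov(r=4.0, bits=16, noise_lsb=8.0,
                                      n_transient=1_000, n_iters=100_000,
                                      seed=0):
    """Estimate the largest Lyapunov exponent of a b-bit quantized
    logistic map driven by additive uniform noise.

    bits      -- precision of the quantized state (LSB = 2**-bits)
    noise_lsb -- uniform-noise amplitude as a multiple of the LSB
    """
    rng = np.random.default_rng(seed)
    lsb = 2.0 ** -bits
    amp = noise_lsb * lsb
    x = 0.3  # arbitrary initial condition in (0, 1)
    log_sum = 0.0
    for n in range(n_transient + n_iters):
        # Deterministic update, then uniform noise, then quantization
        # to the limited-precision grid (the "digital" part).
        x = r * x * (1.0 - x)
        x += rng.uniform(-amp, amp)
        x = float(np.clip(np.round(x / lsb) * lsb, 0.0, 1.0))
        if n >= n_transient:
            # Average local stretching rate of the deterministic map;
            # the tiny offset guards against log(0) at x = 0.5.
            log_sum += np.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
    return log_sum / n_iters

if __name__ == "__main__":
    # Sweep the noise amplitude in units of the LSB.  Under the
    # abstract's claim, the estimate should rise toward ln 2 ~ 0.693
    # and peak near noise = 8 x LSB for the logistic map.
    for k in (0.0, 1.0, 4.0, 8.0, 16.0):
        lam = noisy_quantized_logistic_lyapunov(noise_lsb=k)
        print(f"noise = {k:>4.1f} x LSB  ->  lambda ~ {lam:.3f}")
```

With zero noise, the quantized trajectory can collapse onto a short cycle, which is exactly the loss of chaos in limited-precision systems that the added noise is meant to counteract.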

Department(s)

Electrical and Computer Engineering

International Standard Serial Number (ISSN)

2643-1564

Document Type

Article - Journal

Document Version

Final Version

File Type

text

Language(s)

English

Rights

© 2023 The Authors. All rights reserved.

Creative Commons Licensing

This work is licensed under a Creative Commons Attribution 4.0 License.

Publication Date

08 Feb 2021
