Noise Quality And Super-Turing Computation In Recurrent Neural Networks
Noise and stochasticity can benefit the performance of neural networks. Recent studies show that noise-enhanced digital recurrent neural networks with optimized noise magnitude are consistent with super-Turing operation, regardless of whether the noise is implemented with true random or sufficiently long pseudo-random number time series. This paper extends that work by providing additional insight into how shortened, repeating pseudo-noise sequences degrade super-Turing operation. Shortening the repeat length of the noise resulted in fewer chaotic output time series, as measured by autocorrelation-detected repetitions in the output. Similar rates of chaos inhibition across different maps, noise magnitudes, and pseudo-noise functions hint at an unknown underlying commonality in noise-induced chaos. Repeat lengths in the chaos-failed outputs were predominantly integer multiples of the noise repeat lengths. Even noise repeat lengths only marginally shorter than the output sequence cause the noise-enhanced digital recurrent neural networks to repeat and thereby fail to be consistent with chaos and super-Turing computation. This implies that noise sequences used to improve neural network operation should be at least as long as any sequence the network produces.
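The mechanism described above can be illustrated with a minimal sketch (this is not the paper's model; the logistic map, precision, noise magnitude, and repeat lengths here are illustrative assumptions): a chaotic map driven by additive pseudo-noise that repeats every `noise_len` steps, with a direct check for repetition in the output whose detected period can then be compared against multiples of `noise_len`.

```python
import numpy as np

def noisy_logistic(r=3.9, eps=1e-3, noise_len=32, n_steps=5000, seed=0):
    """Iterate a logistic map x -> r*x*(1-x) with additive pseudo-noise
    that repeats every `noise_len` steps. All parameter values are
    illustrative choices, not those used in the paper."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-1.0, 1.0, noise_len)  # short, repeating pseudo-noise
    x = np.empty(n_steps)
    x[0] = 0.5
    for n in range(n_steps - 1):
        x[n + 1] = np.clip(r * x[n] * (1 - x[n]) + eps * noise[n % noise_len],
                           0.0, 1.0)
    return x

def detect_period(x, tol=1e-9, max_period=2000):
    """Return the smallest lag p at which the tail of the series repeats
    (within tol), or None if no repetition is found. Only the second half
    of the series is examined, to skip any transient."""
    tail = x[len(x) // 2:]
    for p in range(1, min(max_period, len(tail) // 2) + 1):
        if np.all(np.abs(tail[p:] - tail[:-p]) < tol):
            return p
    return None
```

In finite precision, a repeating noise sequence makes the combined system (state plus noise phase) eventually periodic, so any detected output period should be an integer multiple of `noise_len`, consistent with the observation in the abstract; `detect_period(x) % noise_len == 0` is the check one would run on a "chaos-failed" output.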
E. Redd and T. Obafemi-Ajayi, "Noise Quality And Super-Turing Computation In Recurrent Neural Networks," Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12894 LNCS, pp. 469-478, Springer; International Conference on Artificial Neural Networks, Jan 2021.
The definitive version is available at https://doi.org/10.1007/978-3-030-86380-7_38
Electrical and Computer Engineering
Keywords and Phrases
Chaos; Pseudo-random noise; Recurrent neural networks; Super-Turing
Article - Conference proceedings
© 2023 Springer; International Conference on Artificial Neural Networks, All rights reserved.
01 Jan 2021