
 
 
Large sampling intervals for learning and predicting chaotic systems with reservoir computing
XIE Qing-yan 1, YAN Zi-Xiang 1, GAO Jian 1, ZHAO Hui 2, XIAO Jing-Hua 1 *
1. School of Science, Beijing University of Posts and Telecommunications, Beijing 100876
2.
*Corresponding author
Funding: National Natural Science Foundation of China (NSFC) (Grant No. 62333002), Fundamental Research Funds for the Central Universities (Contract No. 2023RC44), National Natural Science Foundation of China (NSFC) (Grant No. 62371056), Opening Project of State Key Lab of Information Photonics and Optical Communications (Grant No. IPOC2023ZJ02), National Natural Science Foundation of China (Grant No. 62103165), Key Laboratory of Computing Power Network and Information Security (Grant No. 2023ZD038)
Opened online: 20 March 2024
Citation: XIE Qing-yan, YAN Zi-Xiang, GAO Jian. Large sampling intervals for learning and predicting chaotic systems with reservoir computing [OL]. [20 March 2024]. http://en.paper.edu.cn/en_releasepaper/content/4762703
 
 
Reservoir computing is an efficient artificial neural network with low training cost and low hardware overhead. It is widely used in time-sequence information processing, such as waveform classification, speech recognition, and time series prediction. However, in practical applications, researchers can only use limited information from the system for prediction, and the sampling interval cannot be adjusted freely due to the limitations of the actual system. Motivated by this situation, we demonstrate the impact of temporal and spatial sampling intervals on the short-term and long-term prediction capabilities of reservoir computing and compare it with existing numerical methods. We find that, for chaotic systems, reservoir computing can learn and reproduce the systems' states at spatio-temporal sampling intervals almost five times larger than those required by classical numerical methods, such as the fourth-order Runge-Kutta and spectral methods. Our results show the capability of reservoir computing in applications with limited spatio-temporal sampling intervals, and pave the way toward reservoir-based fast numerical simulation methods.
Keywords: Machine learning, Reservoir computing, Sampling interval
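The setup described in the abstract — training a reservoir computer on a chaotic trajectory generated by a classical integrator and reading the prediction out by linear regression — can be sketched minimally as follows. This is not the paper's implementation; the Lorenz system, the fourth-order Runge-Kutta integration, and all reservoir parameters (size, spectral radius, input scaling, ridge penalty) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz_rk4(n_steps, dt=0.02, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with classical fourth-order Runge-Kutta."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        out[i] = s
    return out

# Training data: a chaotic trajectory sampled at the integration step.
data = lorenz_rk4(6000)
train_in, train_out = data[:-1], data[1:]

# Random sparse reservoir, rescaled to a chosen spectral radius (assumed values).
N = 300
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.05)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(N, 3))

# Drive the reservoir with the input sequence and collect its states.
states = np.zeros((len(train_in), N))
r = np.zeros(N)
for i, u in enumerate(train_in):
    r = np.tanh(W @ r + W_in @ u)
    states[i] = r

# Linear readout trained by ridge regression, discarding an initial transient.
washout, lam = 200, 1e-6
X, Y = states[washout:], train_out[washout:]
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y).T

# One-step prediction error along the training trajectory (sanity check).
pred = states @ W_out.T
err = np.sqrt(np.mean((pred[washout:] - train_out[washout:]) ** 2))
print(f"one-step RMSE: {err:.4f}")
```

Only the readout matrix `W_out` is trained, which is what keeps the training cost low; studying how prediction quality degrades as the sampling interval `dt` grows is the kind of experiment the paper describes.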
 
 
 
