
ZIP: NAR-RNN Time-Series Forecasting Code (dataset + training + prediction + forecast comparison)

Uploader: m0_561462172 · Size: 74.92KB · Points required: 1

Resource file list:

nar.zip contains 4 files:
  1. data.mat 266.73KB
  2. nar_predict_multistep.m 656B
  3. nar_train.m 4.13KB
  4. net.mat 6.09KB

Resource description:

A good starting point for beginners — give it a try. You can simply swap in your own dataset, or message me and I will do the replacement for you.
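To swap in your own dataset, save your series as a numeric matrix named `data` in `data.mat`; the training script below reads column 8 (`data(:,8)`). A minimal sketch of the conversion — the file name `my_series.csv` is a hypothetical placeholder, not part of the archive:

```matlab
% Build a data.mat compatible with nar_train.m from your own series.
% 'my_series.csv' is a hypothetical file with one observation per row.
y = readmatrix('my_series.csv');   % column vector of observations
data = zeros(numel(y), 8);         % the script reads data(:,8)
data(:,8) = y;
save('data.mat', 'data');          % nar_train.m then does: load data.mat; y = data(:,8);
```

Alternatively, change `y = data(:,8)` in `nar_train.m` to whichever column holds your series.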
% Solve an Autoregression Time-Series Problem with a NAR Neural Network
% Script generated by Neural Time Series app
% Created 01-Mar-2023 19:33:45
%
% This script assumes this variable is defined:
%   y - feedback time series.
clc
clear
load data.mat
y = data(:,8);
T = tonndata(y,false,false);
% The input and target data must be cell arrays. Do not convert with
% num2cell: a 2-D cell matrix would be interpreted as two separate inputs
% and the network could not be trained.

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt backpropagation.
% The default trainlm function performs poorly on this time series, so the
% Bayesian Regularization algorithm is preferred.
% For most problems, Levenberg-Marquardt (trainlm) is recommended. For small,
% noisy problems, Bayesian Regularization (trainbr) may take longer but can
% find a better solution. For large problems, Scaled Conjugate Gradient
% (trainscg) is recommended, since its gradient computations are more
% memory-efficient than the Jacobian computations of the other two.
% Given the characteristics of the sample data, Bayesian Regularization is
% chosen here (to actually use it, set trainFcn = 'trainbr' above; the line
% above currently selects trainlm).

% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:20;  % delay vector 1:20, i.e. the output uses the previous 20 values
hiddenLayerSize = 10;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Tune the number of hidden neurons against the performance (error) plot.
% Generally, lowering the error by adding hidden nodes is easier to achieve
% than adding hidden layers; note, however, that adding layers can bring a
% larger performance gain than piling more neurons into a single layer, so
% avoid putting too many neurons in one hidden layer.

% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,{},{},T);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Train the Network
[net,tr] = train(net,x,t,xi,ai);
save('net.mat','net')

% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)

% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,{},{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(net,tc,yc)

% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,{},{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
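The contents of the archive's `nar_predict_multistep.m` are not shown above. Based on the closed-loop section of the training script, a multi-step forecast with the saved network might look like the following sketch — the horizon `numSteps` and the continuation logic are assumptions, not the file's actual contents:

```matlab
% Multi-step-ahead forecast with the trained NAR network (sketch).
load net.mat            % loads the 'net' saved by nar_train.m
load data.mat
y = data(:,8);
T = tonndata(y,false,false);

netc = closeloop(net);             % feed predictions back in as inputs
[xc,xic,aic] = preparets(netc,{},{},T);
[yPred,xf,af] = netc(xc,xic,aic);  % closed-loop prediction over the known
                                   % range, plus the final delay states

% Forecast numSteps beyond the end of the data by running the closed-loop
% network on empty inputs, starting from the final delay states:
numSteps = 20;                     % assumed forecast horizon
yFuture = netc(cell(0,numSteps), xf, af);
```

Closed-loop forecasts compound their own errors at each step, so expect `closedLoopPerformance` to be noticeably worse than the open-loop `performance` reported during training.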