Bashar Ali asked, 2022-07-23

Neural Network Toolbox

Hello everybody
 
I am new to neural networks, but I have studied the theory and everything is OK. Now I have come to the practical part. I am using the Neural Network Toolbox in MATLAB and have started using NARX, where x(t) is an Excel file (1 column and 3500 rows) and y(t) is also an Excel file (1 column and 3500 rows). I import these files, train the network, and get the figures; everything is OK. My question is:

After doing all of the above, how do I use this network to predict the value of y for a given x?
 
Here is the resulting script file:
 
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by NTSTOOL
% Created Fri Oct 12 20:48:21 IST 2012
%
% This script assumes these variables are defined:
%
%   EMA - input time series.
%   Close - feedback time series.

inputSeries = tonndata(EMA,false,false);
targetSeries = tonndata(Close,false,false);

% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);

% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};

% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);

% Setup Division of Data for Training, Validation, Testing
% The function DIVIDERAND randomly assigns target values to training,
% validation and test sets during training.
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
% The property DIVIDEMODE set to TIMESTEP means that targets are divided
% into training, validation and test sets according to timesteps.
% For a list of data division modes type: help nntype_data_division_mode
net.divideMode = 'value';  % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Choose a Training Function
% For a list of all training functions type: help nntrain
% Customize training parameters at: net.trainParam
net.trainFcn = 'trainlm';  % Levenberg-Marquardt

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
% Customize performance parameters at: net.performParam
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
% Customize plot parameters at: net.plotParam
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
  'ploterrcorr', 'plotinerrcorr'};

% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);

% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)

% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotregression(targets,outputs)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)

% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)

% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1.  The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys)

 

Deep Learning Toolbox, AI, Data Science, and Statistics, Function Approx

Expert Answer

Kshitij Singh answered, 2024-12-21 21:20:41


I think it is as simple as

 

y = net(x);
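In practice, though, a NARX network has two input series (x and the fed-back y) plus tapped delays, so the call also needs the delay states. A minimal sketch, assuming Xnew and Yknown are hypothetical row vectors of new inputs and the corresponding already-known targets:

% Sketch (not from the original answer): one-step-ahead prediction with
% the trained open-loop net.  Xnew and Yknown are hypothetical new data.
Xc = tonndata(Xnew,false,false);
Yc = tonndata(Yknown,false,false);
[xn,xin,ain] = preparets(net,Xc,{},Yc);
yn = net(xn,xin,ain)            % open-loop predictions of y

For values of y that have not been observed yet, the closed-loop network built further down (closeloop) is the one to use.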
% Here is the resulting script file (lines prefixed with > below are quoted from the script; the rest are my comments):
% Solve an Autoregression Problem with External Input with a NARX Neural
% Network Script generated by NTSTOOL
% Created Fri Oct 12 20:48:21 IST 2012
% This script assumes these variables are defined:
% EMA - input time series.
% Close - feedback time series.
close all, clear all, clc, plt = 0; % Input purge
>inputSeries = tonndata(EMA,false,false);
>targetSeries = tonndata(Close,false,false);
Close is not a good variable name: lower-case close is a MATLAB function (see close all above).
I'll just use X and T below:
X = EMA(:)'; T = Close(:)';  % row vectors
plt = plt+1; figure(plt)
plot(T); hold on
plot(X)
% Create a Nonlinear Autoregressive Network with External Input
>inputDelays = 1:2;
A 0 delay is often needed for the input (but not for the feedback).
>feedbackDelays = 1:2;
Use xcorr, crosscorr, or a corrected version of nncorr (see below) to obtain the lags at significant peak values.
N = length(T);
ZT = zscore(T);
ZX = zscore(X);
autocorrT = nncorr(ZT,ZT,N-1);
crosscorrXTR = nncorr(ZX,ZT,N-1); % Incorrectly symmetric
crosscorrXTL = nncorr(ZT,ZX,N-1); % Incorrectly symmetric
crosscorrXT = [ crosscorrXTL(1:N-1), crosscorrXTR(N:end) ];
lags = -(N-1):N-1;
Enter code to find the POSITIVE lags at significant peaks (yeah, I could have just used crosscorrXTR, but ... well, you understand). One possibility is sketched below.
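A possible way to pick those lags, assuming the Signal Processing Toolbox xcorr is available (just a sketch, not part of the original script):

% Sketch: positive lags where the normalized cross-correlation exceeds a
% rough 95% significance bound.  ZT, ZX and N are defined above.
[cc,lag] = xcorr(ZT,ZX,'coeff');            % positive lags: X leads T
sigthr  = 1.96/sqrt(N);                     % approximate significance level
sigLags = lag(lag > 0 & abs(cc) > sigthr)   % candidate inputDelays
% (Peaks of xcorr(ZT,ZT,'coeff') at positive lags suggest feedbackDelays.)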
>hiddenLayerSize = 10;
Choose an optimal small value ( H <= (N-1)/3 ) by trial and error; one possible search loop is sketched below.
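One possible form of that search, with arbitrary values for the upper bound Hmax and the number of random restarts Ntrials (a sketch, not from the original answer):

% Sketch: trial-and-error search over the number of hidden neurons.
Hmax = 10; Ntrials = 5; bestR2 = -Inf;
for H = 1:Hmax
    for k = 1:Ntrials                       % several random initializations
        neti = narxnet(inputDelays,feedbackDelays,H);
        neti.trainParam.showWindow = false; % keep the training GUI quiet
        [xi,xii,aii,ti] = preparets(neti,inputSeries,{},targetSeries);
        neti = train(neti,xi,ti,xii,aii);
        yi = neti(xi,xii,aii);
        R2 = 1 - perform(neti,ti,yi)/var(cell2mat(ti),1);
        if R2 > bestR2
            bestR2 = R2; bestH = H; bestnet = neti;
        end
    end
end
bestH, bestR2

For a fairer comparison, score the candidates on the validation or test subset rather than on all prepared data as done here.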
>net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
>net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
>net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
Unnecessary. These are defaults
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged,
% while easily customizing it for networks with differing numbers of delays,
% with open loop or closed loop feedback modes.
>[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries, {},targetSeries);
% Setup Division of Data for Training, Validation, Testing
% The function DIVIDERAND randomly assigns target values to training,
% validation and test sets during training.
% For a list of all data division functions type: help nndivide
>net.divideFcn = 'dividerand'; % Divide data randomly
NO. This destroys the uniform time spacing of the series. Use another divide
function. For example:
net.divideFcn = 'divideblock';
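An equivalent, more explicit option (a sketch; the indices reproduce the 70/15/15 split quoted below) is to assign contiguous index blocks yourself:

% Sketch: explicit contiguous division of the prepared timesteps.
Nts = numel(targets);                       % number of prepared timesteps
net.divideFcn = 'divideind';
net.divideParam.trainInd = 1:round(0.70*Nts);
net.divideParam.valInd   = round(0.70*Nts)+1:round(0.85*Nts);
net.divideParam.testInd  = round(0.85*Nts)+1:Nts;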
% The property DIVIDEMODE set to TIMESTEP means that targets are divided
% into training, validation and test sets according to timesteps.
% For a list of data division modes type: help nntype_data_division_mode
>net.divideMode = 'value'; % Divide up every value
>net.divideParam.trainRatio = 70/100;
>net.divideParam.valRatio = 15/100;
>net.divideParam.testRatio = 15/100;
Unnecessary ..
% Choose a Training Function
% For a list of all training functions type: help nntrain
% Customize training parameters at: net.trainParam
>net.trainFcn = 'trainlm'; % Levenberg-Marquardt
Unnecessary ...
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
% Customize performance parameters at: net.performParam
>net.performFcn = 'mse'; % Mean squared error
Unnecessary...
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
% Customize plot parameters at: net.plotParam
>net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
Unnecessary ...
% Train the Network
>[net,tr] = train(net,inputs,targets,inputStates,layerStates);
tr = tr % No semicolon to expose results
% Test the Network
>outputs = net(inputs,inputStates,layerStates);
>errors = gsubtract(targets,outputs);
>performance = perform(net,targets,outputs)
Normalize and obtain the R^2 statistic:
NMSE = performance/var(T,1)
R2 = 1-NMSE
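Since targets and outputs from preparets are cell arrays, the same statistics can also be computed over exactly the prepared timesteps (a sketch):

% Sketch: NMSE and R^2 restricted to the prepared timesteps.
t = cell2mat(targets);  y = cell2mat(outputs);
NMSE = mean((t-y).^2)/var(t,1)
R2   = 1 - NMSE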
% Recalculate Training, Validation and Test Performance
>trainTargets = gmultiply(targets,tr.trainMask);
>valTargets = gmultiply(targets,tr.valMask);
>testTargets = gmultiply(targets,tr.testMask);
>trainPerformance = perform(net,trainTargets,outputs)
>valPerformance = perform(net,valTargets,outputs)
>testPerformance = perform(net,testTargets,outputs)
Training MSE should be adjusted to mitigate the bias of using the same data both to estimate the weights and to evaluate performance.
[ I Ntrn ] = size(Xtrn)                  % Xtrn = training-subset inputs
[ O Ntrn ] = size(Ttrn)                  % Ttrn = training-subset targets
Ntrneq = Ntrn*O                          % No. of training equations
ND = length( [ inputDelays, feedbackDelays ] )
Nw = (ND+1)*H + (H+1)*O                  % No. of estimated weights (H = hiddenLayerSize)
MSEtrna = Ntrneq*MSEtrn/(Ntrneq-Nw)      % MSEtrn = trainPerformance; (Ntrneq-Nw) decreases with H
NMSEtrna = MSEtrna/var(Ttrn,0)
R2trna = 1 - NMSEtrna
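A minimal sketch of how those quantities can be pulled out of the variables the script already defines (tr.trainInd and trainPerformance come from the training run above):

% Sketch: build the training-subset quantities used in the formulas above.
Xtrn   = cell2mat(inputs(1,tr.trainInd));   % training-subset external inputs
Ttrn   = cell2mat(targets(tr.trainInd));    % training-subset targets
MSEtrn = trainPerformance;                  % training MSE from above
H      = hiddenLayerSize;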
% View the Network
>view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotregression(targets,outputs)
%figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
>netc = closeloop(net);
>netc.name = [net.name ' - Closed Loop'];
>view(netc)
>[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
>yc = netc(xc,xic,aic);
>closedLoopPerformance = perform(netc,tc,yc)
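To actually forecast y beyond the last known sample (which is what the original question asks for), one pattern is to capture the network's final delay states and keep simulating the closed-loop net on future x values only. Xfuture below is a hypothetical row vector of future external inputs (a sketch, not from the original answer):

% Sketch: multi-step forecast of future y from future x values.
[yc,xfc,afc] = netc(xc,xic,aic);            % also return the final states
Xf = tonndata(Xfuture,false,false);         % hypothetical future x values
yf = netc(Xf,xfc,afc)                       % forecasts; y is fed back internally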
% Early Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is given y(t+1).
ARE YOU SURE? HAVE YOU DEMONSTRATED THIS?
I THOUGHT THIS COULD ONLY HAPPEN IF THE MINIMUM FEEDBACK DELAY IS ZERO.
HOWEVER, MATLAB SHOULD THROW AN ERROR FOR A ZERO FEEDBACK DELAY.
WHEN I HAVE TIME I WILL CHECK.
% For some applications such as decision making, it would help to have predicted
% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early by removing one delay
% so that its minimal tap delay is now 0 instead of 1. The new network returns the
% same outputs as the original network, but outputs are shifted left one timestep.
>nets = removedelay(net);
>nets.name = [net.name ' - Predict One Step Ahead'];
>view(nets)
>[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);
>ys = nets(xs,xis,ais);
>earlyPredictPerformance = perform(nets,ts,ys)
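If the shifted network behaves as the generated comments claim, its final output is already the prediction for the first not-yet-observed timestep. A sketch of reading it off (my reading of removedelay, so verify against your data):

% Sketch: ys has one more timestep than the original open-loop outputs;
% its last element is the predicted y for the step after the last sample.
yNext = cell2mat(ys(end))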
 
 


Matlabsolutions.com provides undivided attention to each Matlab assignment order with a methodical approach to solution. Our network span is not restricted to US, UK and Australia rather extends to countries like Singapore, Canada and UAE. Our Matlab assignment help services include Image Processing Assignments, Electrical Engineering Assignments, Matlab homework help, Matlab Research Paper help, Matlab Simulink help. Get your work done at the best price in industry.