Neural Networks in MATLAB and Their Applications: BP as an Example

1 A prediction problem

Given: a set of reference input and output data (see the attachment).
Task: predict the outputs corresponding to another set of inputs.
Background: omitted.

2 The BP network

3 The newff command in MATLAB

NEWFF Create a feed-forward backpropagation network.

Syntax
  net = newff
  net = newff(PR, [S1 S2 ... SNl], {TF1 TF2 ... TFNl}, BTF, BLF, PF)

Parameters of the newff command

NET = NEWFF creates a new network with a dialog box.
NEWFF(PR, [S1 S2 ... SNl], {TF1 TF2 ... TFNl}, BTF, BLF, PF) takes
  PR  - Rx2 matrix of min and max values for R input elements.
  Si  - Size of ith layer, for Nl layers.
  TFi - Transfer function of ith layer, default = 'tansig'.
  BTF - Backprop network training function, default = 'trainlm'.
  BLF - Backprop weight/bias learning function, default = 'learngdm'.
  PF  - Performance function, default = 'mse'.
and returns an N-layer feed-forward backprop network.

Notes on the parameters

The transfer functions TFi can be any differentiable transfer function such as TANSIG, LOGSIG, or PURELIN. The training function BTF can be any of the backprop training functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.

*WARNING*: TRAINLM is the default training function because it is very fast, but it requires a lot of memory to run. If you get an out-of-memory error when training, try one of the following:
(1) Slow down TRAINLM training, but reduce its memory requirements, by setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
(2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
(3) Use TRAINRP, which is slower but more memory efficient than TRAINBFG.

The learning function BLF can be either of the backpropagation learning functions LEARNGD or LEARNGDM. The performance function PF can be any of the differentiable performance functions such as MSE or MSEREG.
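To make the parameter list above concrete, here is a minimal sketch of a newff call in the old-style syntax documented above; the input ranges, layer sizes and transfer functions are illustrative values only, not the network built later in this document.

% Minimal newff sketch (illustrative values only)
pr  = [0 1; -1 1; 0 10];                               % Rx2 min/max matrix for R = 3 inputs
net = newff(pr, [5 1], {'tansig' 'purelin'}, 'trainlm'); % 5 hidden tansig neurons, 1 purelin output
% If trainlm runs out of memory, the warning above suggests, for example:
% net.trainParam.mem_reduc = 2;                        % slower trainlm, lower memory use
% or passing 'trainbfg' / 'trainrp' as the BTF argument instead.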
4 The train command in MATLAB

TRAIN Train a neural network.

Syntax
  [net, tr, Y, E, Pf, Af] = train(NET, P, T, Pi, Ai, VV, TV)

Description
TRAIN trains a network NET according to NET.trainFcn and NET.trainParam.

Input parameters

TRAIN(NET, P, T, Pi, Ai) takes
  NET - Network.
  P   - Network inputs.
  T   - Network targets, default = zeros.
  Pi  - Initial input delay conditions, default = zeros.
  Ai  - Initial layer delay conditions, default = zeros.
  VV  - Structure of validation vectors, default = [].
  TV  - Structure of test vectors, default = [].

Output parameters

and returns
  NET - New network.
  TR  - Training record (epoch and perf).
  Y   - Network outputs.
  E   - Network errors.
  Pf  - Final input delay conditions.
  Af  - Final layer delay conditions.

Notes
Note that T is optional and need only be used for networks that require targets. Pi and Pf are also optional and need only be used for networks that have input or layer delays.
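A short sketch of a typical train call with plain matrix inputs and targets; the synthetic data, layer sizes, epoch count and goal value are illustrative assumptions, not values from this document.

% Sketch of a basic train call (synthetic data, illustrative settings)
p = rand(3, 20);                              % inputs:  3 elements x 20 samples
t = rand(1, 20);                              % targets: 1 element  x 20 samples
net = newff([min(p,[],2) max(p,[],2)], [5 1], {'tansig' 'purelin'});
net.trainParam.epochs = 50;                   % maximum number of training epochs
net.trainParam.goal   = 1e-3;                 % stop early if the MSE falls below this
[net, tr] = train(net, p, t);                 % tr holds the training record (epoch, perf)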
Input data structures

The cell array format is easiest to describe. It is most convenient for networks with multiple inputs and outputs, and allows sequences of inputs to be presented:
  P  - Ni x TS cell array, each element P{i,ts}  is an Ri x Q matrix.
  T  - Nt x TS cell array, each element T{i,ts}  is a Vi x Q matrix.
  Pi - Ni x ID cell array, each element Pi{i,k}  is an Ri x Q matrix.
  Ai - Nl x LD cell array, each element Ai{i,k}  is an Si x Q matrix.
  Y  - No x TS cell array, each element Y{i,ts}  is a Ui x Q matrix.
  E  - Nt x TS cell array, each element E{i,ts}  is a Vi x Q matrix.
  Pf - Ni x ID cell array, each element Pf{i,k}  is an Ri x Q matrix.
  Af - Nl x LD cell array, each element Af{i,k}  is an Si x Q matrix.

Where:
  Ni = net.numInputs
  Nl = net.numLayers
  Nt = net.numTargets
  ID = net.numInputDelays
  LD = net.numLayerDelays
  TS = number of time steps
  Q  = batch size
  Ri = net.inputs{i}.size
  Si = net.layers{i}.size
  Vi = net.targets{i}.size
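As a sketch of the two input formats, the same data can be passed either as a plain Ri x Q matrix of concurrent vectors or in the cell-array sequence format described above; con2seq and seq2con convert between the two. The network below is a throwaway example, not the one used later.

% Matrix format versus cell-array (sequence) format
p    = rand(3, 20);                           % Ri x Q matrix: 3 inputs, 20 concurrent samples
pseq = con2seq(p);                            % 1 x TS cell array, each element a 3 x 1 column
net  = newff([min(p,[],2) max(p,[],2)], [4 1]);
ymat = sim(net, p);                           % outputs returned as a 1 x 20 matrix
yseq = sim(net, pseq);                        % outputs returned as a 1 x 20 cell array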
5 Implementation

Data processing and preparation
  - Convert the Word data into TXT file format.
  - Read the data with dlmread.
  - Decide whether to normalize the data.
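A sketch of this preparation step; the file name data.txt and the column layout (eight input columns followed by four target columns) are assumptions made for illustration, since the source does not show the exported file.

% Data preparation sketch (file name and column layout are assumptions)
raw = dlmread('data.txt');                    % numeric table exported from the Word document
in  = raw(:, 1:8);                            % eight input columns
out = raw(:, 9:12);                           % four target columns
% Optional max-scaling normalization of each input column:
% in = in ./ repmat(max(in), size(in,1), 1);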
Generating the network
  - Make all the preparations needed to call the newff command.
  - Form the pr matrix.
  - Fix the network structure: the number of layers and the number of neurons in each layer.
  - Choose the transfer function of each layer.
  - Pay attention to the meaning of each parameter.

Training the network
  - Prepare the data needed to call the train command.
  - Determine the input samples.
  - Determine the reference (target) outputs.
  - Set the training parameters (number of epochs): net.trainParam.epochs = 100.
  - Call the training command: net = train(net, p, t).

Simulating the output
  - Call y = sim(net, p) to simulate the network output.
  - Plot the results for comparison (a sketch follows section 6 below).

Viewing the network parameters and weights
  - Type net at the command line to display the network object; its parameters can be referenced and inspected individually.

6 Prediction and analysis

  - Produce the output with sim.
  - Retrain the network and produce the output with sim again.
  - Plot the results for comparison.
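A sketch of the simulation and comparison steps just described; the data here are synthetic stand-ins, whereas in the actual script p, t and net come from the program in the next section.

% Simulate the trained network and compare outputs with targets (synthetic stand-ins)
p = rand(8, 30);  t = rand(4, 30);                       % 8 inputs, 4 targets, 30 samples
net = newff([min(p,[],2) max(p,[],2)], [8 11 4], {'logsig' 'logsig' 'logsig'});
net.trainParam.epochs = 100;
net = train(net, p, t);
y = sim(net, p);                                         % network outputs for the training inputs
figure; plot(t(1,:), '-'); hold on; plot(y(1,:), '--'); hold off;
legend('target (first output)', 'network output (first output)');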
7 Program implementation

clc
clear all
clear net

load data;                      % assumed to provide the training inputs (in) and targets (out)
load data_pre;                  % inputs for the later prediction step

% Split the eight input columns
c1 = in(:,1); c2 = in(:,2); c3 = in(:,3); c4 = in(:,4);
c5 = in(:,5); c6 = in(:,6); c7 = in(:,7); c8 = in(:,8);

% Maximum of each input column
c1_max = max(c1); c2_max = max(c2); c3_max = max(c3); c4_max = max(c4);
c5_max = max(c5); c6_max = max(c6); c7_max = max(c7); c8_max = max(c8);

% Optional max-scaling normalization (left commented out here)
% c1 = c1/c1_max; c2 = c2/c2_max; c3 = c3/c3_max; c4 = c4/c4_max;
% c5 = c5/c5_max; c6 = c6/c6_max; c7 = c7/c7_max; c8 = c8/c8_max;
% in(:,1) = c1; in(:,2) = c2; in(:,3) = c3; in(:,4) = c4;
% in(:,5) = c5; in(:,6) = c6; in(:,7) = c7; in(:,8) = c8;

% Min and max of every input column, used to build the pr matrix for newff
c1_max = max(c1); c1_min = min(c1);
c2_max = max(c2); c2_min = min(c2);
c3_max = max(c3); c3_min = min(c3);
c4_max = max(c4); c4_min = min(c4);
c5_max = max(c5); c5_min = min(c5);
c6_max = max(c6); c6_min = min(c6);
c7_max = max(c7); c7_min = min(c7);
c8_max = max(c8); c8_min = min(c8);

pr = [c1_min,c1_max; c2_min,c2_max; c3_min,c3_max; c4_min,c4_max; ...
      c5_min,c5_max; c6_min,c6_max; c7_min,c7_max; c8_min,c8_max];

p = in;     % network inputs: train expects an 8 x Q matrix (one sample per column);
            % transpose if the samples are stored as rows
t = out;    % network targets (one sample per column)

net = newff(pr, [8 11 4], {'logsig' 'logsig' 'logsig'});
net.trainParam.epochs = 100;
net = train(net, p, t);

y = sim(net, p);
% plot(t);
% figure;
% plot(y);
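The listing above stops after simulating the training inputs. A hedged sketch of the prediction step from section 6 follows; the variable name in_pre for the contents of data_pre is an assumption, since the source never shows what that file holds.

% Prediction sketch (continues the program above; 'in_pre' is an assumed name
% for the new-input matrix inside data_pre, which the source does not show)
y_pre = sim(net, in_pre);                 % transpose in_pre if its samples are stored as rows
figure; plot(y_pre');                     % inspect the predicted outputs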