基于RPROP-SVR混合算法的DRNN网络非线性系统辨识

DRNN Nonlinear System Identification Based on RPROP-SVR Hybrid Algorithm

【Author】 王晓燕

【Supervisor】 张翠芳

【Author Information】 Southwest Jiaotong University, Control Theory and Control Engineering, 2009, Master's thesis

【Abstract】 The identification of nonlinear systems is a persistent difficulty and focus of control theory research. When a diagonal recurrent neural network (DRNN) is used to identify a complex nonlinear system, the dynamic mapping of the system can be realized by adjusting the weights of the internal neurons; the network exhibits strong dynamic mapping capability while keeping the computational cost of weight adjustment small. However, the existing learning algorithms for training DRNNs converge slowly and give unsatisfactory identification accuracy, and this thesis addresses that problem.

First, to overcome the large identification error and slow convergence of the dynamic back-propagation (DBP) algorithm widely used for DRNNs, a Lyapunov function algorithm (LFA) and a genetic algorithm (GA) are adopted as improved training algorithms. Simulation comparisons of DBP, LFA, and GA show that LFA outperforms the other two in both identification error and convergence: its identification error is the smallest and its convergence is the fastest.

Second, to eliminate the influence of gradient magnitude on weight updates in the Lyapunov function algorithm, a local adaptive learning algorithm, resilient back-propagation (RPROP), is applied for the first time to train a DRNN. RPROP uses the gradient only to determine the direction of each weight update, not its size; it achieves high identification accuracy, accelerates convergence, and to some extent overcomes the local-minimum problem.

Third, since the number of hidden-layer nodes is still usually chosen by experience, a hybrid algorithm combining RPROP with support vector regression (SVR), named RPROP-SVR, is proposed for DRNN training: SVR automatically determines the number of hidden nodes, while RPROP trains the network weights. Applied to nonlinear system identification, the new algorithm achieves very good identification results.

Finally, the identification results of RPROP and RPROP-SVR are compared. Simulations show that the two algorithms are essentially equal in identification error, identification accuracy, and convergence speed, which verifies that the RPROP-SVR hybrid can automatically determine the number of hidden nodes. The method can therefore replace manual trial-and-error, determining the network structure automatically in a single pass and saving the considerable experiment time otherwise spent hand-tuning the hidden-layer size.
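The defining feature of the DRNN described above is that each hidden neuron feeds back only to itself, so the recurrent weight matrix is diagonal and can be stored as a vector. A minimal forward-pass sketch under the standard diagonal-recurrence formulation with tanh activations; all names (`drnn_forward`, `W_in`, `w_d`, `w_out`) are illustrative, not taken from the thesis:

```python
import numpy as np

def drnn_forward(x_seq, W_in, w_d, w_out):
    """Run a diagonal recurrent network over a sequence of inputs.

    Each hidden unit has a single self-feedback weight (w_d), i.e. the
    recurrent weight matrix is diagonal, which is what keeps the cost of
    weight adjustment small compared with a fully recurrent network.
    """
    h = np.zeros(w_d.shape)                # hidden state, one value per unit
    ys = []
    for x in x_seq:
        h = np.tanh(W_in @ x + w_d * h)    # elementwise self-recurrence
        ys.append(w_out @ h)               # scalar network output
    return np.array(ys)

# Toy usage with made-up dimensions: 5 time steps, 2 inputs, 3 hidden units.
rng = np.random.default_rng(0)
x_seq = rng.standard_normal((5, 2))
W_in = rng.standard_normal((3, 2))
w_d = 0.5 * rng.standard_normal(3)
w_out = rng.standard_normal(3)
y = drnn_forward(x_seq, W_in, w_d, w_out)
```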
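The sign-only property the abstract attributes to RPROP can be seen in a few lines: each weight keeps its own step size, which grows while the gradient's sign is stable and shrinks when it flips; the gradient's magnitude never enters the update. A minimal sketch of the RPROP- variant; the parameter defaults (η⁺ = 1.2, η⁻ = 0.5, step bounds) follow the algorithm's common description and are assumptions here, not values from the thesis:

```python
import numpy as np

def rprop_minimize(grad_fn, w, n_iter=100, eta_plus=1.2, eta_minus=0.5,
                   delta_init=0.1, delta_min=1e-6, delta_max=50.0):
    """Minimize via RPROP-: updates use only the sign of the gradient."""
    delta = np.full_like(w, delta_init)    # per-weight step size
    prev_grad = np.zeros_like(w)
    for _ in range(n_iter):
        g = grad_fn(w)
        same = prev_grad * g > 0           # sign unchanged -> grow step
        flipped = prev_grad * g < 0        # sign flipped  -> overshot, shrink
        delta[same] = np.minimum(delta[same] * eta_plus, delta_max)
        delta[flipped] = np.maximum(delta[flipped] * eta_minus, delta_min)
        g = np.where(flipped, 0.0, g)      # RPROP-: skip update after a flip
        w = w - np.sign(g) * delta         # magnitude of g is never used
        prev_grad = g
    return w

# Toy usage: minimize f(w) = sum(w**2), whose gradient is 2*w.
w_opt = rprop_minimize(lambda w: 2.0 * w, np.array([3.0, -4.0]))
```

Because only `np.sign(g)` appears in the update, rescaling the loss (and hence the gradient) by any positive constant leaves the trajectory unchanged, which is exactly the insensitivity to gradient magnitude the abstract describes.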

  • 【CLC Number】TP183;O231.2
  • 【Cited By】9
  • 【Downloads】117