
几类离散神经网络模型的动力学分析

Dynamical Behavior Analysis of Several Classes of Discrete-Time Neural Network Models

【Author】 Wang Junping (王军平)

【Advisor】 Ruan Jiong (阮炯)

【Author Information】 Fudan University, Applied Mathematics, 2006, Ph.D.

【摘要 (Abstract)】 This thesis first introduces the origins of several classes of discrete-time neural network models and surveys the state of research on them. Using the Schauder fixed-point principle, it proves the existence of an equilibrium point (i.e., a fixed point) for a class of discrete-time neural network models with generalized input-output functions, and, using the converse Lyapunov function theorem, it gives sufficient conditions for the uniform asymptotic stability of this class of models under time-varying weights. Next, using the anti-integrable limit method, it proves that a class of discrete-time neural network models with periodic input-output functions exhibits chaos in the sense of Devaney when the parameters enter certain ranges, and it discusses period-doubling and saddle-node bifurcations in the corresponding one-dimensional neuron model. Finally, using coincidence degree theory, it studies the existence of periodic solutions of a class of discrete-time Cohen-Grossberg neural network models with constant delays and with time-varying delays, and it gives sufficient conditions for the existence, uniqueness, and global exponential stability of periodic solutions in both cases.

Chapter 1 introduces the research background of several problems in neural network dynamics, describes several important classes of discrete-time neural network models and their applications, explains the importance of studying neural networks with dynamical-systems methods, and outlines the structure of the thesis.

Chapter 2 first introduces a class of non-autonomous discrete-time neural network models with generalized input-output functions, which extends the input-output function of the transiently chaotic neural network model to general monotonically increasing, continuously differentiable functions. Then, using the Schauder fixed-point principle and the converse Lyapunov function theorem, it proves the existence of equilibrium points and the uniform asymptotic stability of the model under time-varying weights. Finally, numerical simulations of several concrete examples are given to illustrate the conclusions.

Chapter 3 first introduces a class of discrete-time neural network models whose input-output function is a sinusoidal periodic function; compared with traditional chaotic neural network models with monotone input-output functions, these models have better memory storage capability and a remarkably larger storage capacity. Then, using criteria for period-doubling and saddle-node bifurcations, it studies the bifurcations of the one-dimensional neuron model with a sinusoidal input-output function. Finally, using the anti-integrable limit method, it proves that this class of high-dimensional discrete-time neural network models exhibits chaos in the sense of Devaney when the parameters enter certain ranges, and it gives two concrete numerical examples to further demonstrate the correctness and effectiveness of the conclusions.

Chapter 4 first introduces coincidence degree theory, an important theoretical foundation for proving the existence of periodic solutions of differential and difference equations. Then, using coincidence degree theory and the Lyapunov function method, it gives sufficient conditions for the existence and global exponential stability of periodic solutions of a class of discrete-time Cohen-Grossberg neural network models with constant delays and with time-varying delays. Finally, numerical simulations of several examples are also given.

Chapter 5 lists some problems in discrete-time neural network models that are under study or to be studied.
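The abstract does not state the exact equations, so the following is only a minimal sketch, assuming a generic one-neuron discrete map with a sinusoidal activation, x(n+1) = k·x(n) + a·sin(x(n)) + c; the parameters k, c, the initial state, and the sweep range of the gain a are all hypothetical. It illustrates the kind of parameter sweep used to detect period-doubling of the sort analyzed for the one-dimensional neuron model in Chapter 3.

```python
# Minimal sketch of a one-neuron discrete-time map with sinusoidal activation:
#   x(n+1) = k*x(n) + a*sin(x(n)) + c   (assumed form, not the thesis's model)
# Sweeping the gain "a" and counting distinct long-run iterates gives rough
# bifurcation-diagram data for spotting period-doubling.

import math

def iterate_neuron(a, k=0.5, c=0.1, x0=0.2, transient=500, keep=50):
    """Iterate the map, discard a transient, and return the retained orbit."""
    x = x0
    for _ in range(transient):
        x = k * x + a * math.sin(x) + c
    orbit = []
    for _ in range(keep):
        x = k * x + a * math.sin(x) + c
        orbit.append(x)
    return orbit

if __name__ == "__main__":
    # 1 distinct value ~ stable fixed point, 2 ~ period-2 after a
    # period-doubling, many ~ higher periods or chaotic behavior.
    for step in range(0, 81):
        a = 1.0 + 0.05 * step
        distinct = len({round(v, 4) for v in iterate_neuron(a)})
        print(f"a = {a:4.2f}  distinct long-run values ~ {distinct}")
```

A full bifurcation diagram would plot the retained orbit against a instead of counting rounded values; the counting here is only a quick textual proxy.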

【Abstract】 In this thesis, we first introduce several classes of discrete-time neural network models and the research progress on neural networks. Using the Schauder fixed-point principle, we prove the existence of an equilibrium (i.e., a fixed point) of a discrete-time neural network with a generalized input-output function; using the converse Lyapunov theorem, we study the uniform asymptotic stability of the equilibrium of this network with time-varying weights and give sufficient conditions that guarantee it. Secondly, by means of the anti-integrable limit method, we rigorously establish the existence of chaos in the sense of Devaney for a special class of discrete-time neural network models with sinusoidal activation functions when some parameters of the systems enter certain regions; moreover, period-doubling and saddle-node bifurcations in the corresponding neuron model are studied. Finally, the existence and global exponential stability of periodic solutions of a discrete-time Cohen-Grossberg neural network with constant and time-varying delays are investigated using the continuation theorem of coincidence degree theory, and sufficient conditions are given that guarantee the existence of an ω-periodic solution to which all other solutions converge globally exponentially.

This thesis is divided into five chapters. In Chapter 1, we introduce the mathematical models and the research progress on discrete-time neural networks, describe several important classes of discrete-time neural networks, and show that it is necessary to analyze the stability and the complex dynamics of the concrete mathematical models.

In Chapter 2, we first introduce a discrete-time neural network model with a generalized input-output function, which extends the input-output function of the transiently chaotic neural network to a class of continuous, differentiable, monotonically increasing functions. We then study the uniform asymptotic stability of the equilibrium of this non-autonomous model. Finally, several examples and numerical simulations are given to illustrate and reinforce our results.

In Chapter 3, we first introduce a specific class of discrete-time neural network models with a sinusoidal activation function. This class of models can embed and retrieve patterns beyond the capability of the conventional network with a monotone activation function and possesses a remarkably larger memory capacity than the conventional associative system. We then obtain sufficient conditions for the existence of period-doubling and saddle-node bifurcations in the one-dimensional neuron model. Finally, by means of the anti-integrable limit method, the existence of chaos in the sense of Devaney is rigorously established for these discrete-time neural network models when some parameters of the systems enter certain regions; several concrete examples with numerical simulations are provided to reinforce the theoretical results.

In Chapter 4, we first introduce the continuation theorem of coincidence degree theory, which is an important theoretical basis for proving the existence of periodic solutions of differential and difference equations. Then the existence and global exponential stability of periodic solutions of a discrete-time Cohen-Grossberg neural network with constant and time-varying delays are investigated using the continuation theorem, and sufficient conditions are given that guarantee the existence of an ω-periodic solution to which all other solutions converge globally exponentially.

At the end of this dissertation, we list some problems for future work, including stability and chaos in more complex neural networks.
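The thesis's delayed Cohen-Grossberg system is not written out in the abstract, so the next block is a minimal sketch under an assumed standard form, x_i(n+1) = x_i(n) - a_i(x_i(n))·(b_i·x_i(n) - Σ_j c_ij·tanh(x_j(n-τ)) - I_i(n)) with an ω-periodic input I_i(n); the amplification function, decay rates, weights, delay, and period below are all hypothetical. It only illustrates numerically what "all other solutions converge to an ω-periodic solution" means: states ω steps apart should nearly coincide after a transient.

```python
# Minimal sketch of a two-neuron discrete-time Cohen-Grossberg network with a
# constant delay tau and an omega-periodic input (assumed form and parameters).

import math

TAU, OMEGA = 2, 6
B = [0.6, 0.7]                        # decay rates b_i (assumed)
C = [[0.10, -0.05], [0.08, 0.12]]     # connection weights c_ij (assumed)

def amplification(x):
    return 0.5 + 0.3 / (1.0 + x * x)  # a_i(x): positive and bounded (assumed)

def periodic_input(i, n):
    return 0.4 * math.sin(2.0 * math.pi * n / OMEGA + i)  # omega-periodic I_i(n)

def step(history, n):
    """One update from the stored history; history[-1] is the current state x(n)."""
    x_now, x_delayed = history[-1], history[-1 - TAU]
    new = []
    for i in range(2):
        feedback = sum(C[i][j] * math.tanh(x_delayed[j]) for j in range(2))
        new.append(x_now[i] - amplification(x_now[i])
                   * (B[i] * x_now[i] - feedback - periodic_input(i, n)))
    return new

if __name__ == "__main__":
    history = [[0.3, -0.2]] * (TAU + 1)   # constant initial history on [-tau, 0]
    for n in range(600):
        history.append(step(history, n))
    # If an omega-periodic solution attracts the orbit, x(n) and x(n - omega)
    # nearly coincide for large n.
    diff = max(abs(a - b) for a, b in zip(history[-1], history[-1 - OMEGA]))
    print("max |x(n) - x(n - omega)| near the end:", diff)
```

With small connection weights, as assumed here, the printed difference should be close to zero, which is the numerical signature of the globally exponentially stable ω-periodic solution that the coincidence-degree conditions are designed to guarantee.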

  • 【Online Publication Contributor】 Fudan University
  • 【Online Publication Year/Issue】 2007, Issue 06
  • 【Classification Code】 TP183; O193
  • 【Cited By】 5
  • 【Downloads】 555