
具有连续或不连续输出函数的神经网络模型的动力学研究

Dynamics Research of Neural Network Models with Continuous or Discontinuous Output Functions

【Author】 Li Liping

【Supervisor】 Huang Lihong

【Author Information】 Hunan University, Applied Mathematics, 2009, PhD

【Chinese Abstract】 This dissertation first studies the dynamical behavior of a class of delayed cellular neural network models with improved signal range, and then analyzes the dynamical properties of several classes of general neural network models with discontinuous activation functions. The dissertation consists of six chapters. Chapter One reviews the history and development of neural network research, and introduces the application background and current state of the classes of neural network models studied here, as well as the contents of this dissertation. Chapter Two presents the basic definitions and lemmas needed in the sequel, mainly concerning matrix theory, set-valued analysis, and differential inclusions. Chapter Three analyzes the equilibrium points of a class of delayed cellular neural networks with improved signal range described by invariant cloning templates. Compared with standard delayed cellular neural networks, the improved model allows unbounded output functions and therefore features lower power consumption, greater cell density, and faster operation. This chapter first uses dissipativity theory to estimate the global attracting set and positive invariant set of the system, thereby determining the regions in which equilibrium points may exist. Then, exploiting the invariance of the state-feedback and delay-feedback templates, the existence, number, and local asymptotic stability of equilibrium points in the saturation regions are studied by constructing suitable iterative mappings. Finally, sufficient conditions for the global exponential stability of this delayed cellular neural network model are given, and the region containing the unique equilibrium point is located explicitly. The results of this chapter extend previous work on these properties of the equilibrium points of standard delayed cellular neural networks. Chapter Four investigates the dynamical behavior of a class of neural network models with discontinuous output functions, which includes the Hopfield neural network models with discontinuous output functions studied in many references. Without requiring the output functions to be bounded, the existence and uniqueness of the state equilibrium point, its global exponential stability, and the global convergence of the corresponding output equilibrium point are studied by means of M-matrix theory, the Leray-Schauder fixed point theorem for set-valued maps, and a generalized Lyapunov functional method. Furthermore, sufficient conditions for convergence of the system in finite time are given; this property is peculiar to neural network models described by differential equations with a discontinuous right-hand side. Chapter Five analyzes a class of delayed neural network models with discontinuous output functions. First, sufficient conditions for the existence and uniqueness of the equilibrium point are discussed. Then, by studying the global asymptotic stability of the zero solution of an equivalent functional differential equation with a discontinuous right-hand side, general criteria are given for the global asymptotic stability of the state equilibrium point and the global asymptotic convergence of the corresponding output equilibrium point. Compared with the existing literature, the main results of this chapter remove the assumptions that the output functions are bounded and monotone, and also relax the strict restrictions on the connection matrices. Chapter Six studies a class of neural network models in a periodic environment that can be described by differential equations with a discontinuous right-hand side. Without requiring the output functions to be continuous, bounded, or monotone nondecreasing, sufficient conditions for the existence of periodic solutions are obtained by using linear matrix inequalities, the Cellina approximate selection theorem for differential inclusions, and the uniform convergence theorem for δ-solutions of differential equations with a discontinuous right-hand side. Finally, the global exponential stability of the periodic solution is proved by further combining the Lyapunov functional method.
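For reference, the class of delayed models described above can be sketched in standard notation (this notation is assumed, not taken from the record, and may differ in detail from the thesis's formulation):

```latex
% Delayed neural network model: D diagonal positive (self-decay),
% A the state-feedback matrix, B the delay-feedback matrix,
% f the (possibly discontinuous, possibly unbounded) output function,
% I the constant external input.
\dot{x}(t) = -D\,x(t) + A f\bigl(x(t)\bigr) + B f\bigl(x(t-\tau)\bigr) + I.

% When f is discontinuous, solutions are understood in the Filippov sense:
% f is replaced by a convex set-valued regularization K[f], turning the
% equation into a differential inclusion
\dot{x}(t) \in -D\,x(t) + A\,K[f]\bigl(x(t)\bigr) + B\,K[f]\bigl(x(t-\tau)\bigr) + I,
% where, for an activation with isolated jump points,
% K[f](x) is componentwise the interval [\min\{f(x^-),f(x^+)\},\,\max\{f(x^-),f(x^+)\}].
```

The set-valued regularization is what makes the machinery of set-valued analysis and differential inclusions mentioned in Chapter Two necessary.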

【Abstract】 In this thesis, the dynamical behaviors of both the so-called improved-signal-range model of cellular neural networks and several classes of neural networks with discontinuous neuron activations are addressed. In Chapter One, the history and development of neural networks are briefly introduced, and the background and motivation of this work are given. In Chapter Two, we list some basic definitions and lemmas concerning matrix theory, set-valued analysis, and differential inclusions. In Chapter Three, the equilibrium points of a class of improved-signal-range cellular neural networks described by invariant cloning templates are considered. Compared with standard delayed cellular neural networks, the output functions of the improved models are allowed to be unbounded, so the improved models feature lower power consumption, greater cell density, and faster operation. This chapter first estimates the global attracting set and positive invariant set in order to determine the potential regions containing equilibrium points. Second, using the invariance of the state-feedback and delay-feedback templates, the existence, number, and local asymptotic stability of equilibrium points are studied in each saturation region by constructing suitable iterative mappings. Finally, a sufficient condition is obtained to ensure the global exponential stability of the system, and the explicit region containing the unique equilibrium point is located. The results obtained extend previous work on these issues for standard delayed cellular neural networks. In Chapter Four, the dynamics of a class of recurrent neural networks whose neuron activations are modeled by discontinuous functions are investigated; the models studied in this chapter include the well-known Hopfield neural network models. Without presuming the boundedness of the activation functions, we establish sufficient conditions for the existence, uniqueness, and global exponential stability of the state equilibrium point and for the global convergence of the output equilibrium point. Furthermore, under certain conditions we prove that the system converges globally in finite time, a special feature of neural networks described by differential equations with a discontinuous right-hand side. The analysis is based on the properties of M-matrices, the set-valued version of the Leray-Schauder fixed point theorem, and a generalized Lyapunov-like approach. Chapter Five addresses a class of delayed neural networks with discontinuous neuron activations. Sufficient conditions are presented for the existence and global asymptotic stability of the state equilibrium point, and the global convergence of the output solutions is also discussed. The activation functions are allowed to be unbounded and nonmonotonic, and the conditions imposed on the connection weight matrix are weaker than in previous work on discontinuous or continuous neural networks. In Chapter Six, we consider a class of discontinuous recurrent neural network models described by a periodic differential system. Without presuming the activation functions to be continuous, bounded, or monotone nondecreasing, a sufficient condition for the existence of periodic solutions is obtained by utilizing linear matrix inequalities, the Cellina approximate selection theorem for differential inclusions, and the uniform convergence theorem for δ-solutions of differential equations with a discontinuous right-hand side. The global exponential stability of the periodic solution is also proved by employing the generalized Lyapunov-like approach.
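The convergence behavior described for discontinuous-activation networks can be illustrated numerically. The sketch below is not the thesis's model or conditions: it simulates a small Hopfield-type system x'(t) = -x(t) + A·sign(x(t)) + I with a discontinuous sign activation by forward Euler integration, using made-up weights A and input I chosen so the interconnections are weak (a diagonal-dominance condition in the spirit of the M-matrix assumptions). The state settles onto the equilibrium x* = A·sign(x*) + I.

```python
import numpy as np

def sign_act(x):
    """Discontinuous output function (the sign nonlinearity)."""
    return np.sign(x)

def simulate(A, I, x0, dt=1e-3, steps=20000):
    """Forward Euler integration of x' = -x + A*sign(x) + I.

    Note: Euler stepping across the discontinuity is only a crude stand-in
    for a proper Filippov-solution concept; it suffices for illustration.
    """
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + A @ sign_act(x) + I)
    return x

# Hypothetical parameters (not from the thesis): weak off-diagonal coupling
# and a constant input that pushes the two neurons to opposite signs.
A = np.array([[0.2, 0.1],
              [0.1, 0.2]])
I = np.array([1.0, -1.0])

x = simulate(A, I, x0=[0.5, 0.5])
print(x)  # converges near the equilibrium [1.1, -1.1]
```

Once the signs of the components stop switching, the dynamics become linear with decay rate 1, so convergence to the equilibrium is exponential; this mirrors, in a toy setting, the global exponential stability results discussed above.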

  • 【Online Publication Contributor】 Hunan University
  • 【Online Publication Year/Issue】 2010, Issue 01