
多稳定神经网络的动力学分析

Dynamical Analysis of Multistable Neural Networks

【Author】 徐芳

【Supervisor】 章毅

【Author Information】 电子科技大学 (University of Electronic Science and Technology of China), Applied Mathematics, 2011, PhD

【摘要】 The brain is a system with the most intricate structure, the most mysterious mechanisms, and the most complete functions. High-level intelligent activities of humans, such as perception, thinking, learning, and memory, are the result of neural networks formed by vast numbers of neurons in the cerebral cortex. Artificial neural networks were proposed by imitating the structure and principles of biological neural networks. Since the 1980s, research on neural networks has again attracted the attention of many scientists, and a large number of new results have been obtained in both theory and applications. Dynamical analysis of neural networks is an important theoretical foundation for their practical application. In general, a network has two stable modes: monostability and multistability. Multistability means that the network possesses multiple stable equilibrium points. It captures the most essential feature of biological neural networks and reveals their internal mechanisms more deeply. When the brain receives an external stimulus, neurons in the cerebral cortex are activated and their activity undergoes subtle changes, which cause the neural network composed of these neurons to produce a certain response to the stimulus. However, little is known about these complex and subtle changes. In 2003, the background neural network model was proposed, offering a new way to approach this problem. However, since the original background neural network model is a coupled nonlinear dynamical system, the division operation in its equations brings great difficulty to theoretical analysis. The theoretical study of this model is still far from complete, and many problems remain to be solved. Therefore, in this dissertation we propose several classes of improved background neural network models and discuss their dynamical behavior. The main contributions of the dissertation are as follows:
(1) An improved background neural network model with a uniform firing rate is proposed and its dynamical behavior is studied. An invariant set of the network is found, sufficient conditions for boundedness are given, and complete convergence of the network is proved.
(2) An improved background neural network model with two subnetworks is proposed and its convergence is analyzed. A mathematical expression for the global attractive set of the network is given; a local stability condition for the equilibrium points is derived using the Jacobian matrix; and complete convergence is rigorously proved by constructing a new energy function.
(3) A class of improved N-dimensional background neural network models is proposed, and four basic dynamical properties of the network are discussed in detail: boundedness, invariance, global attractivity, and complete convergence. Conditions under which the network has an invariant set are derived; mathematical expressions for the invariant set and the global attractive set are given; and complete convergence is proved by constructing a new energy function.
(4) Based on an equivalent form of the original background neural network model, a class of improved background neural network models with infinitely many neurons is proposed. Existence conditions and mathematical expressions for the continuous attractors of the network are given in two cases, namely zero and nonzero background input.
(5) Multistable computation in improved background neural network models with arbitrary exponents is studied. For the one-dimensional network with an arbitrary exponent, conditions for the existence of finitely many equilibrium points and criteria for their stability are given, and complete convergence is proved; for the two-dimensional network with an arbitrary exponent, a local stability condition for the equilibrium points is derived.
(6) The dynamical behavior of a class of two-dimensional neural networks is studied. An invariant set of the network is given and its boundedness is proved; by constructing a closed curve and applying the rotation (winding) number theory of vector fields, a condition is derived under which the network has at least one equilibrium point.
These results should provide a positive impetus for further research on multistable neural networks.
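The record does not reproduce the background neural network equations themselves. Purely as a hedged illustration of why a divisive term complicates the analysis, a rate model with divisive interaction of the general kind referred to above can be sketched as follows; the squared nonlinearity, the parameters \tau, \alpha, \beta, \sigma and the weights w_{ij} are illustrative placeholders, not the dissertation's actual model:

    \tau \dot{x}_i(t) = -x_i(t) + \frac{\bigl(\alpha + \sum_{j=1}^{N} w_{ij}\, x_j(t)\bigr)^{2}}{\sigma^{2} + \beta \sum_{j=1}^{N} x_j(t)^{2}}, \qquad i = 1, \dots, N.

Because every state variable appears in the denominator, each equilibrium condition is a coupled rational equation in all of x_1, \dots, x_N rather than an affine or polynomial one, which is one way to see why improved or equivalent reformulations of the model are attractive for analysis.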

【Abstract】 The brain is a system with the most complex structure, the most mysterious mechanism, and the most complete function. The high-level intelligent activities of humans, such as feeling, thinking, learning, and memory, are the results of neural networks. Artificial neural networks are proposed to simulate the structure and mechanism of biological neural networks. Since the 1980s, artificial neural networks have attracted the interest of many scientists, and significant progress has been made in both theory and applications. The dynamical properties of neural networks play crucial roles in their applications. Generally speaking, the stable modes of neural networks can be divided into two classes: monostability and multistability. A multistable neural network can possess multiple stable equilibria. It characterizes the essential properties of biological neural networks and depicts the internal mechanism of neural biology more deeply. Neurons in the brain cortex are activated when the brain is affected by an external stimulus; a response to that stimulus is then triggered in the neural system composed of a huge number of neurons, and the neural activity evoked by the stimulus changes subtly. However, little is known about these complex and subtle changes. In 2003, the background neural network model was proposed, providing a new theoretical approach to this problem. However, since the model is a coupled nonlinear dynamical system, the division operation in its equations brings many difficulties to theoretical analysis. Moreover, there still exist many unresolved problems about the original model. Therefore, in this dissertation, we propose several classes of improved background neural network models and discuss their dynamical behavior. The main contributions of the dissertation are as follows:
(1) In Chapter 2, a class of improved background neural network models with a uniform firing rate is proposed, and the dynamical behavior of the proposed model is studied. Conditions for boundedness and an invariant set of the networks are established. By constructing a new energy function, complete convergence of the networks is proved.
(2) Chapter 3 focuses on a class of improved background neural networks with two subnetworks, and the convergence of the networks is investigated. The global attractive set of the network is obtained. By using the Jacobian matrix, a local stability condition for the equilibrium points of the network is derived. Complete convergence of the network is rigorously proved by constructing a new energy function.
(3) Chapter 4 presents a class of improved n-dimensional background neural networks. Four basic dynamical properties are discussed in detail: boundedness, invariance, global attractivity, and complete convergence. An invariant set is obtained, and expressions for the invariant set and the global attractive set are given. By using a new energy function, complete convergence of the networks is proved.
(4) In Chapter 5, based on an equivalent model of the original background neural networks, a class of improved background neural network models with a relatively large number of neurons is proposed. In two cases, i.e., zero and nonzero background input, the conditions for the existence of continuous attractors are derived and their representations are obtained.
(5) In Chapter 6, the multistability of one-dimensional and two-dimensional improved background neural network models with arbitrary exponents is studied. For the one-dimensional case, conditions for the existence and stability of equilibrium points are derived and complete convergence is investigated. For the two-dimensional case, a local stability condition for the equilibrium points is also derived.
(6) Finally, in Chapter 7, the dynamical behavior of a class of two-dimensional neural networks is discussed. An invariant set and the boundedness of the networks are given. By using the winding number of the vector field and constructing a closed curve, we obtain a condition under which there exists at least one equilibrium point in the network.
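The energy functions constructed in Chapters 2 through 4 are not stated in this record; the sketch below only records the standard LaSalle-type template that such complete-convergence arguments generally follow, for a system \dot{x} = f(x) whose trajectories are assumed bounded:

    E : \mathbb{R}^{N} \to \mathbb{R} \ \text{continuously differentiable and bounded below},
    \frac{d}{dt} E\bigl(x(t)\bigr) = \nabla E\bigl(x(t)\bigr)^{\top} f\bigl(x(t)\bigr) \le 0, \qquad
    \frac{d}{dt} E\bigl(x(t)\bigr) = 0 \iff f\bigl(x(t)\bigr) = 0.

Under these assumptions every bounded trajectory approaches the equilibrium set; when the equilibria are isolated, each trajectory converges to a single equilibrium, which is what complete convergence refers to.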
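The local stability conditions mentioned in items (2) and (5) are not reproduced here; they are presumably instances of the generic linearization criterion for an equilibrium x^* of \dot{x} = f(x):

    J = Df(x^{*}), \qquad \operatorname{Re}\lambda_{k}(J) < 0 \ \text{for all eigenvalues } \lambda_{k}
    \ \Longrightarrow\ x^{*} \ \text{is locally asymptotically stable},

which in the two-dimensional case reduces to the trace/determinant test \operatorname{tr} J < 0 and \det J > 0.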
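A continuous attractor, as in item (4), is a connected continuum of equilibria that attracts nearby trajectories in the transverse directions. The dissertation's existence conditions are not given in this record; a minimal textbook example, assuming a linear recurrent network rather than the background model, is

    \dot{x} = -x + W x, \qquad W v = v, \quad \operatorname{Re}\lambda < 1 \ \text{for every other eigenvalue } \lambda \ \text{of } W,

in which case the whole line \{\, c v : c \in \mathbb{R} \,\} consists of equilibria and nearby trajectories relax onto it (a line attractor).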
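The existence argument in item (6) rests on the rotation (winding) number of a planar vector field. The general principle, not the dissertation's specific curve, is that if f = (f_1, f_2) does not vanish on a simple closed curve \Gamma and

    \operatorname{ind}_{\Gamma}(f) = \frac{1}{2\pi} \oint_{\Gamma} \frac{f_1\, df_2 - f_2\, df_1}{f_1^{2} + f_2^{2}} \neq 0,

then f has at least one zero, i.e. the network has at least one equilibrium point, inside \Gamma. Combined with the boundedness and invariant-set results stated for that network, this yields existence of an equilibrium inside the constructed curve.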
