
基于微分包含的非光滑动力系统分析及其应用

Analysis of Nonsmooth Dynamical System Based on Differential Inclusion and Its Applications

【Author】 Qin Sitian (秦泗甜)

【Supervisor】 Xue Xiaoping (薛小平)

【Author Information】 Harbin Institute of Technology, Fundamental Mathematics, 2010, Doctoral dissertation

【摘要】 Based on differential inclusions and nonsmooth analysis, this dissertation systematically studies the dynamical properties of four classes of differential inclusions: delayed neural networks with discontinuous activation functions, neural networks of subgradient type, nonsmooth gradient-like systems, and evolution inclusions with Clarke subdifferential in Hilbert space. The results obtained are as follows.

1. The exponential stability and finite-time convergence of a class of delayed neural networks are studied. Existing results on this problem are obtained almost exclusively under the assumption that the activation functions are continuous and bounded; here both properties, exponential stability and finite-time convergence, are proved with activation functions that may be discontinuous and unbounded. First, the existence and uniqueness of the equilibrium point is established by the topological degree theory for set-valued maps. Then, by constructing a Lyapunov function, it is shown that through any initial point the network has a unique global solution and that this solution converges to the equilibrium point at an exponential rate, i.e. the network is exponentially stable. The conclusions of many papers can be viewed as corollaries of this theorem, whose conditions are easy to verify and are also robust. Next, under certain given conditions, it is proved that every trajectory of the network converges to the equilibrium point in finite time, the so-called finite-time convergence, a phenomenon peculiar to discontinuous systems. Two numerical examples illustrate the applicability of these conclusions.

2. The dynamical behavior of a class of neural networks described by subgradient systems is studied. This subgradient system is abstracted from the various neural network models now widely used for solving optimization problems, and full-range cellular neural networks are a special case of it. The existence of global solutions and equilibrium points is proved first, and the stability of the system is then investigated; the stability results available so far for such systems are of quasi-convergence type. Using the nonsmooth Łojasiewicz inequality, the global asymptotic stability is proved: a solution starting from any point converges asymptotically to an equilibrium point. A direct corollary is the asymptotic stability of full-range cellular neural networks, which considerably improves earlier stability results for these networks; in addition, the convergence rate of the solutions can be computed from the Łojasiewicz exponent. A constrained minimization problem related to this system is then studied, and it is shown that the set of (asymptotically) stable equilibrium points of the subgradient system is exactly the set of (strict) minimum points of this constrained minimization problem. Finally, two approximation theorems for the solutions of the subgradient system are given and illustrated by concrete examples.

3. The dynamical behavior of a class of nonsmooth gradient-like systems is studied. The well-known Hopfield neural networks and cellular neural networks can both be regarded as special cases of this class. First, the existence of equilibrium points and global solutions is proved by the homotopy invariance of topological degree and the maximal monotonicity of the subdifferential of a convex function. Then, by constructing a Lyapunov function and arguing by contradiction, the asymptotic stability of the global solutions is obtained. The system is then used to solve the local minimization problem of a class of nonsmooth functions over the set {0,1}^n and a class of nonlinear programming problems, with numerical examples given in detail. Finally, the existence of periodic solutions is studied in three cases: (1) the activation function is bounded; (2) the activation function satisfies a sublinear growth condition; (3) the activation function is C^2 and strictly increasing.

4. The existence of solutions of a class of evolution inclusions in Hilbert space is studied. In recent decades attention has concentrated on the existence of solutions of evolution inclusions with the subdifferential of a convex function; this dissertation treats the more general case of evolution inclusions with the Clarke subdifferential. Compared with the convex subdifferential, the Clarke subdifferential has broader theoretical and practical applications, but it is not maximal monotone, which makes the problem considerably harder. An existence and uniqueness theorem is first proved when the perturbation term is single-valued, together with two very important inequality estimates. Based on these two estimates, and using a continuous selection theorem and the Schauder fixed point theorem, an existence theorem for strong solutions is proved when the perturbation is a lower semicontinuous set-valued map. The existence of extremal solutions is then established via an extremal selection theorem, and on this basis a relaxation-type theorem is obtained: the set of extremal solutions of the evolution inclusion is dense in the set of strong solutions. Finally, these results are applied to two examples of parabolic partial differential equations, and existence theorems for their solutions are proved.
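For orientation, the displays below sketch, in generic form, the two objects behind items 1 and 2 above: the Filippov-type differential inclusion of a delayed network with discontinuous activations, and the subgradient system together with the nonsmooth Łojasiewicz inequality that yields its asymptotic stability. The matrices D, A, B, the delay τ, the input I, the nonsmooth function E, the constant c and the exponent θ are illustrative placeholders and do not reproduce the dissertation's exact hypotheses.

\[
\dot{x}(t) \;\in\; -D\,x(t) + A\,\overline{\mathrm{co}}\,[f(x(t))] + B\,\overline{\mathrm{co}}\,[f(x(t-\tau))] + I,
\qquad
\dot{x}(t) \;\in\; -\partial E\bigl(x(t)\bigr),
\]
\[
\bigl|E(x) - E(x^{*})\bigr|^{\theta} \;\le\; c\,\operatorname{dist}\bigl(0,\ \partial E(x)\bigr)
\quad \text{for } x \text{ near a critical point } x^{*},\qquad \theta \in [1/2, 1),\; c > 0,
\]

where \(\overline{\mathrm{co}}\,[f(\cdot)]\) is the closed convex hull of the values of the discontinuous activation \(f\) (the Filippov regularization) and \(\partial E\) is the subdifferential of the nonsmooth function \(E\).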

【Abstract】 Based on differential inclusions and nonsmooth analysis, this dissertation studies the dynamical properties of delayed neural networks with discontinuous activations, neural networks of subgradient type, nonsmooth gradient-like systems, and evolution inclusions of Clarke subdifferential type in Hilbert space. The main results are as follows.

1. We study the exponential stability and convergence in finite time of a delayed neural network. Most existing stability results for such networks are based on the continuity and boundedness of the activation function. In this dissertation, without assuming continuity or boundedness of the activation function, we prove two kinds of stability for the network: exponential stability and convergence in finite time. First, by topological degree theory for set-valued maps, we obtain the existence and uniqueness of the equilibrium point. Then, by constructing a Lyapunov function, we prove that for any initial value the network has a unique global solution and that this solution converges to the equilibrium point at an exponential rate, i.e. the network is exponentially stable. Many existing results can be regarded as corollaries of this theorem, and the conditions of the theorem are easily testable and robust. Finally, under some mild hypotheses, we prove that every trajectory of the network converges to the equilibrium point in finite time, the so-called convergence in finite time, a phenomenon peculiar to discontinuous systems. Two numerical examples are presented to illustrate the applicability of the results.

2. We study the dynamical behavior of a class of neural networks described by a subgradient system, which can be regarded as a generalization of the neural network models considered in the optimization context; full-range cellular neural networks (FR-CNNs) are a special case. We first prove the existence of global solutions and equilibrium points and then study the stability of the system. Most existing stability results for such systems are of quasi-convergence type. Using the nonsmooth Łojasiewicz inequality, we prove the asymptotic convergence of the trajectories of the subgradient system: starting from any initial point, a trajectory converges to an equilibrium point. As a direct application, this theorem implies the asymptotic stability of FR-CNNs, which greatly improves earlier stability results for FR-CNNs; moreover, the convergence rate of the solutions can be computed from the Łojasiewicz exponent. A constrained minimization problem associated with this network is then studied, and it is proved that the local constrained (strict) minima of the objective function coincide with the (asymptotically) stable equilibrium points of the network. Finally, we present two theorems on the approximation of solutions of the subgradient system, and several examples are given to illustrate them.

3. We study the dynamical behavior of a class of nonsmooth gradient-like systems, of which the well-known Hopfield neural networks and cellular neural networks are special cases. First, by the homotopy invariance of topological degree and the maximal monotonicity of the convex subdifferential, we prove the existence and uniqueness of the global solution and the equilibrium point of the system. Then, by means of a constructed Lyapunov function and an argument by contradiction, we obtain the asymptotic stability of the system. We then apply these results to seeking local minima of nonsmooth functions over {0,1}^n and to a class of nonlinear programming problems, and some examples are presented to show their applicability. Finally, we investigate the existence of periodic solutions of the nonsmooth gradient-like system in three cases: (1) the activation function is bounded; (2) the activation function satisfies a sublinear growth condition; (3) the activation function is C^2 and strictly increasing.

4. We study the existence of solutions of evolution inclusions in Hilbert space. In recent decades attention has focused on evolution inclusions with the convex subdifferential; in this dissertation we study the more general case of evolution inclusions with the Clarke subdifferential. Compared with the convex subdifferential, the Clarke subdifferential has wider applications in theory and practice; however, it is not maximal monotone, which makes evolution inclusions with the Clarke subdifferential more difficult to study. We first obtain the existence and uniqueness of the solution when the perturbation is a single-valued function, together with two important inequality estimates. Based on these two estimates, and using a continuous selection theorem and the Schauder fixed point theorem, we prove an existence theorem for strong solutions when the perturbation is a lower semicontinuous multivalued map. The existence theorem for extremal solutions is then proved by an extremal selection theorem, from which we obtain a relaxation theorem: the extremal solution set is dense in the strong solution set of the evolution inclusion. Finally, we apply these results to two examples of parabolic PDEs and obtain existence theorems for their solutions.
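As a reading aid for item 4, the display below gives a standard abstract form of the evolution inclusion studied there; the symbols \(\varphi\), \(F\), \(H\), \(T\) and \(u_{0}\) are generic, and the exact growth and regularity assumptions are those stated in the dissertation rather than the ones implied here.

\[
-\dot{u}(t) \;\in\; \partial_{C}\varphi\bigl(u(t)\bigr) + F\bigl(t, u(t)\bigr) \quad \text{for a.e. } t \in [0, T],
\qquad u(0) = u_{0} \in H,
\]

where \(\partial_{C}\varphi\) denotes the Clarke subdifferential of a locally Lipschitz function \(\varphi : H \to \mathbb{R}\) on the Hilbert space \(H\) and \(F\) is the perturbation. The single-valued case of \(F\) corresponds to the existence and uniqueness result, the lower semicontinuous multivalued case to the strong solutions, and replacing \(F(t, u)\) by its extreme points \(\operatorname{ext} F(t, u)\) to the extremal solutions, whose density in the strong solution set is the relaxation theorem.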
