
Filter Line Search Methods for Nonlinear Optimization
(Original Chinese title: 非线性优化问题的过滤线搜索方法)

【Author】 Wang Zhujun (王祝君)

【Supervisor】 Zhu Detong (朱德通)

【Author Information】 Shanghai Normal University, Computational Mathematics, 2010, Ph.D. dissertation

【Abstract (translated from the Chinese original)】 Line search methods are one of the basic strategies for ensuring the global convergence of optimization methods, and they are simple and reliable. Computing the search direction and the step size are the key components of a line search method: the design of the search direction affects the convergence rate, while the step size guarantees the convergence of a descent-direction method. In this thesis the step size is determined mainly on the basis of the Armijo criterion, and the extensions of the line search methods are all built on corresponding extensions of the Armijo criterion.

Filter algorithms are generally used to solve constrained optimization problems. Their main idea is that a trial step is accepted as the new iterate if it reduces either the objective function or the constraint violation. The filter technique balances the objective function and the constraints well, and replaces the traditional penalty function approach in guaranteeing the global convergence of optimization algorithms. Filter methods can be used not only in a trust-region sequential quadratic programming (SQP) framework but also in a line search framework. Wächter and Biegler established the global convergence of line search methods based on filter algorithms. This thesis adopts the filter technique proposed by Fletcher and Leyffer and combines the filter method with nonmonotone techniques, the projected reduced Hessian method, the complete projected secant method, the affine-scaling interior-point approach, interior-point barrier methods, and so on, to build a filter line search algorithmic framework, which is then applied to several typical classes of optimization problems. The global convergence and local convergence rates of the algorithms are studied theoretically, and their performance is examined through numerical experiments.

The filter method is a typical method for constrained optimization, while Gould, Toint, and Sainvitu proposed combining a multidimensional filter idea with trust-region methods to solve unconstrained optimization problems. In this thesis, by exploiting the relevant characteristics of an unconstrained problem, we transform it into an equality constrained problem with special structure and, combining the filter line search with Newton's method, nonmonotone techniques, and the MBFGS method (a modified BFGS method), solve the unconstrained problem via the Wächter–Biegler approach for nonlinear equality constrained optimization. Under suitable conditions the proposed algorithms are shown to be globally convergent with a fast local convergence rate, and the numerical results indicate that the new algorithms outperform classical line search methods.

The secant methods proposed by Fontecilla are very successful for nonlinear equality constrained optimization. Approximating the Hessian of the Lagrangian by DFP or BFGS secant updates greatly reduces both storage and computational cost. This thesis combines the secant method with the filter line search method for nonlinear equality constrained optimization: a modified secant method generates the search direction, a filter line search procedure determines the step size, and a second-order correction technique overcomes the Maratos effect. While retaining global convergence, the algorithm achieves a 2-step superlinear convergence rate, and the numerical results show that it is effective.

The reduced Hessian sequential quadratic programming algorithm has proved to be one of the effective methods for larger-scale constrained optimization; it uses only partial information of the Hessian of the Lagrangian, so each iteration is cheap in both computation and memory. This thesis constructs a reduced Hessian filter line search method for nonlinear equality constrained optimization. Under reasonable assumptions the algorithm is shown to be globally convergent with a superlinear convergence rate, and the numerical results demonstrate its feasibility.

Since the filter line search favors the feasibility of inequality constraints, this thesis studies, exploiting the specific structure of bound constraints and linear inequality constraints and combining interior-point projection and affine-scaling projection techniques, the application of filter line search methods to bound constrained and linearly inequality constrained optimization, respectively. Under reasonable assumptions the methods are globally convergent with local superlinear convergence rates, and the numerical results show that the algorithms have practical value.

Many papers propose interior-point methods for inequality constrained optimization, but research based on interior-point methods for effectively solving large-scale optimization problems with mixed nonlinear equality and linear inequality constraints is still scarce. This thesis applies the ideas and techniques of Newton's method and the secant method to the equality constraints, and combines the filter line search method with interior-point projection and affine-scaling projection techniques suited to the inequalities, to solve such problems in an integrated way. The Lagrangian function associated with the equality constraints in the smooth Fletcher penalty function, together with the constraint violation, is used to form the filter pairs, so that the Maratos effect is avoided. We prove that the proposed algorithms are globally convergent with fast local convergence rates, and numerical results are given to show that they are effective and feasible. Finally, the work of this thesis is summarized and directions for future research are proposed.
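For reference, the Armijo criterion and a filter acceptance test of the kind described above can be written as follows. This is a representative textbook form: the constants c_1, β, γ ∈ (0,1) and the infeasibility measure h(x) = ‖c(x)‖ are illustrative choices, and the exact margins and switching conditions used in the thesis may differ.

```latex
% Armijo sufficient-decrease condition along a descent direction d_k
% (c_1 in (0,1)), and a representative Fletcher--Leyffer acceptance test
% with infeasibility measure h(x) = ||c(x)|| and current filter F_k
% (beta, gamma in (0,1)); illustrative form only.
\begin{align*}
  & f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k, \\
  & \text{accept } x_k + \alpha_k d_k \text{ if, for all } (h_j, f_j) \in \mathcal{F}_k:\quad
    h(x_k + \alpha_k d_k) \le \beta\, h_j
    \;\text{ or }\;
    f(x_k + \alpha_k d_k) \le f_j - \gamma\, h_j .
\end{align*}
```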

【Abstract】 Line search methods, which are simple and reliable, are among the basic strategies for ensuring the global convergence of optimization methods. The key steps of such a method are finding the search direction and determining the step size: the search direction governs the speed of convergence, while the step size ensures the convergence of the descent-direction algorithm. In this thesis, the step size is determined by the Armijo criterion, and the extensions of the line search methods are based on corresponding extensions of the Armijo criterion.

Filter algorithms are usually used for constrained optimization problems; their main idea is that trial points are accepted if they improve either the objective function or the constraint violation. The filter technique strikes a good balance between the objective function and the constraint violation, and replaces the traditional penalty function approach in guaranteeing the global convergence of the algorithm. Filter methods can be applied within a trust-region sequential quadratic programming (SQP) framework as well as within a line search framework. Wächter and Biegler proved global convergence results for line search methods based on filter algorithms. This thesis adopts the filter technique proposed by Fletcher and Leyffer and combines the filter method with nonmonotone techniques, the projected reduced Hessian method, the complete projected secant method, the affine-scaling interior-point approach, interior-point barrier methods, and so on. We focus on establishing a filter line search algorithmic framework and applying it to a few typical optimization problems. We analyze the global and local convergence of the proposed algorithms and examine their practical performance through numerical experiments.

Filter methods are traditionally used for constrained problems and were extended creatively by Gould, Toint, and Sainvitu to unconstrained optimization with a multidimensional filter trust-region technique. By exploiting the relevant characteristics of an unconstrained problem, we transform it into an equality constrained minimization with special structure and then combine the filter line search with Newton's method, nonmonotone techniques, and the MBFGS (modified BFGS) method, solving the unconstrained problem via the approach of Wächter and Biegler for nonlinear equality constrained optimization. The global convergence and fast local convergence rates of the proposed algorithms are established under reasonable conditions. Numerical experiments indicate that the proposed methods are superior to classical line search algorithms.

The secant methods proposed by Fontecilla are very successful for nonlinear equality constrained optimization. Using DFP or BFGS secant updates to approximate the Hessian of the Lagrangian greatly reduces storage and computational cost. We combine the secant method with the filter line search method for nonlinear equality constrained optimization: a modified secant method produces the search direction, a filter line search procedure generates the step size, and a second-order correction technique overcomes the Maratos effect. Besides global convergence, we prove that the approach converges locally 2-step superlinearly. The numerical results show the effectiveness of the algorithms.

The reduced Hessian successive quadratic programming algorithm has been shown to be one of the successful methods for large-scale nonlinear constrained optimization; it uses only partial information of the Hessian of the Lagrangian and has small computational and memory requirements per iteration. We present a reduced Hessian filter line search method for nonlinear equality constrained optimization. The global convergence and fast local superlinear convergence rate of the proposed algorithm are established under reasonable conditions, and numerical experiments are reported to show the feasibility of the algorithm.

Since the filter line search favors the feasibility of inequality constraints, we combine interior-point projection and affine-scaling projection techniques, according to the particular structure of box constraints and linear inequality constraints, and study the application of the filter line search method to optimization with simple bounds and with linear inequality constraints, respectively. Under mild conditions, the global convergence and local superlinear convergence of the proposed algorithms are obtained. The numerical results show that the new algorithms have practical value.

Quite a few articles propose interior-point methods for optimization with simple bounds or with linear inequality constraints; however, research on interior-point algorithms for effectively solving large-scale optimization problems with mixed nonlinear equality and linear inequality constraints is scarce. We apply the ideas and techniques of Newton's method and the secant method to the equality constraints, and interior-point projection and affine-scaling projection techniques to the inequality constraints, and solve such problems comprehensively in combination with the filter line search method. The algorithms avoid the Maratos effect by using, as components of the filter, the Lagrangian function associated with the equality constraints in the smooth Fletcher penalty function together with the constraint violation. We show that the proposed algorithms are globally convergent with fast local convergence rates. Numerical results indicate that these algorithms are feasible and effective.

Finally, we summarize the main results of this thesis and give some ideas for future research.
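As a concrete illustration of the acceptance mechanism and the secant updates discussed in the abstract, the sketch below shows a backtracking filter line search step and a plain BFGS update. All names (acceptable_to_filter, filter_line_search, bfgs_update), the constants (beta, gamma, shrink), and the simplified acceptance rule are illustrative assumptions, not the exact algorithms of the thesis.

```python
import numpy as np


def acceptable_to_filter(h_new, f_new, filter_pairs, beta=0.99, gamma=1e-5):
    """Accept the trial pair (h_new, f_new) if it improves on every filter
    entry either in the constraint violation h or in the objective f by a
    small margin (a common Fletcher-Leyffer-style rule; the margins and
    switching conditions used in the thesis may differ)."""
    return all(h_new <= beta * h_j or f_new <= f_j - gamma * h_j
               for (h_j, f_j) in filter_pairs)


def filter_line_search(x, d, f, c, filter_pairs,
                       alpha0=1.0, shrink=0.5, max_backtracks=30):
    """Backtracking line search along a given direction d: the first trial
    point acceptable to the filter becomes the new iterate.  Here f is the
    objective, c returns the equality-constraint residual vector, and
    h(x) = ||c(x)|| measures infeasibility."""
    alpha = alpha0
    for _ in range(max_backtracks):
        x_trial = x + alpha * d
        h_new = float(np.linalg.norm(c(x_trial)))
        f_new = float(f(x_trial))
        if acceptable_to_filter(h_new, f_new, filter_pairs):
            filter_pairs.append((h_new, f_new))   # augment the filter
            return x_trial, alpha
        alpha *= shrink                           # reject and backtrack
    return x, 0.0                                 # no acceptable step found


def bfgs_update(B, s, y):
    """Plain BFGS secant update of the approximation B to the Hessian of the
    Lagrangian, skipped when the curvature condition y^T s > 0 fails; the
    thesis works with DFP/BFGS and modified (MBFGS) variants of this update."""
    Bs, ys = B @ s, float(y @ s)
    sBs = float(s @ Bs)
    if ys <= 1e-12 or sBs <= 1e-12:
        return B
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
```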
