
Studies on Inverse Problem Theory and Algorithm with Sparsity Constraints
(Chinese title: 稀疏约束下反问题理论与算法的研究)

【Author】 焦雨领 (Yuling Jiao)

【Supervisor】 樊启斌 (Qibin Fan)

【Author information】 Wuhan University, Applied Mathematics, 2014, Ph.D.

【摘要 (Abstract)】 In many areas of production and daily life, one needs to invert or extract the relevant information from observed data. Such problems fall within the scope of inverse problems and are usually strongly ill-posed. Classical Tikhonov regularization in Hilbert spaces, with its complete theory and algorithms, is a standard tool for treating problems of this kind. To better capture the features of the quantities being reconstructed, regularization theory and algorithms in Banach spaces have been studied over the past decade or so in signal processing, image processing, medical imaging, non-destructive testing, parameter identification in partial differential equations, machine learning, statistical analysis, finance, bioinformatics, and other fields. Within the framework of Banach-space regularization, and with sparse regularization as its main thread, this thesis discusses modeling and computation in compressed sensing, variable selection, and the identification of Robin coefficients in elliptic partial differential equations.

The thesis is organized as follows. Chapter 1 surveys the existing work in the literature related to our study and briefly describes our research motivation. Chapter 2 proposes a primal-dual active set algorithm with continuation (PDASC) for the ℓ1 and a class of non-convex sparse regularization models arising in compressed sensing and variable selection. For the ℓ1-regularized model we prove, under reasonable conditions, the local one-step convergence of the primal-dual active set algorithm as well as the global convergence of PDASC. For the ℓ0-regularized model we prove, under reasonable assumptions, the uniqueness of the global minimizer and the global convergence of PDASC. We also discuss a posteriori regularization parameter choice rules based on BIC, the discrepancy principle, and a modified discrepancy principle. Chapter 3 considers the inverse problem of identifying the Robin coefficient in elliptic partial differential equations. Taking full account of the physical background and practical significance of the problem, we propose a variational regularization model with a sparsity constraint and discuss the existence and stability of solutions of this non-convex, non-smooth model, an a priori choice of the regularization parameter, and the convergence of its finite element discretization. Algorithmically, we propose a simple and effective lagged Newton method and adopt the classical discrepancy principle as the stopping rule.
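For orientation only, the ℓ1- and ℓ0-regularized models referred to above are most commonly written in the following generic least-squares form; this is a standard textbook formulation, not a verbatim reproduction of the functionals analyzed in the thesis, and the symbols Ψ, y, λ are illustrative.

% Generic sparsity-regularized least-squares models (illustrative; the
% thesis's exact functionals and assumptions may differ).
\begin{align*}
  \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|\Psi x - y\|_2^2 + \lambda \|x\|_1
    &\qquad (\ell_1\ \text{regularization, convex}), \\
  \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|\Psi x - y\|_2^2 + \lambda \|x\|_0
    &\qquad (\ell_0\ \text{regularization, non-convex}),
\end{align*}
% where \Psi is the measurement (design) matrix, y the observed data,
% \lambda > 0 the regularization parameter, and \|x\|_0 the number of
% nonzero entries of x.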

【Abstract】 In many areas of life and production, people need to extract useful information by inverting observed data. These issues belong to the category of inverse problems, which are usually ill-posed. Classical Tikhonov regularization, established in Hilbert spaces with a complete theory and algorithms, is a powerful tool for dealing with such inverse problems. However, in order to preserve some of the main features of the inversion variables, regularization in Banach spaces has been developed over the past ten years in the fields of signal processing, image processing, medical imaging, non-destructive detection, parameter identification in PDEs, machine learning, statistical analysis, finance, bioinformatics, and other areas. This thesis studies several modeling and computational problems arising in compressed sensing, variable selection, and Robin parameter identification in elliptic partial differential equations, with sparse regularization as the main line, under the framework of Banach-space regularization.

The thesis is structured as follows. In Chapter 1 we summarize the related work in the literature and briefly introduce our research motivation. In Chapter 2 we propose a primal-dual active set algorithm with continuation (PDASC) for the ℓ1-regularized and a class of non-convex regularized models in compressive sensing and variable selection. For the ℓ1-regularized model, we prove that PDAS enjoys a local one-step convergence property while PDASC converges globally. For the ℓ0-regularized model, we prove the uniqueness of the global minimizer and the global convergence of PDASC under mild assumptions. We also discuss a posteriori regularization parameter selection rules based on the discrepancy principle, a modified discrepancy principle, and the Bayesian information criterion (BIC). In Chapter 3 we consider the inversion of Robin parameters in elliptic partial differential equations. Taking full account of the physical background and practical significance, we propose a new variational regularization model with a sparsity constraint. We establish the existence and stability of solutions, an a priori regularization parameter choice rule, and the convergence of the finite element discretization, even though the model under consideration is non-convex and non-smooth. We propose a simple but effective lagged Newton algorithm to solve the model and use the discrepancy principle to select the regularization parameter.
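To make the algorithmic idea concrete, below is a minimal NumPy sketch of a primal-dual active set iteration with geometric continuation for the ℓ1-regularized least-squares model; it follows the generic structure of such methods rather than the thesis's own implementation, and all names and numerical choices (pdas_l1, pdasc_l1, Psi, y, lam, rho, max_iter) are illustrative assumptions.

# Minimal sketch of a primal-dual active set method with continuation (PDASC)
# for min_x 0.5*||Psi x - y||^2 + lam*||x||_1. Illustrative only: names,
# tolerances, and the continuation schedule are assumptions, not the thesis code.
import numpy as np


def pdas_l1(Psi, y, lam, x0=None, max_iter=100):
    """Primal-dual active set (PDAS) iterations at a fixed parameter lam."""
    n = Psi.shape[1]
    x = np.zeros(n) if x0 is None else x0.copy()
    d = Psi.T @ (y - Psi @ x)              # dual variable d = Psi^T (y - Psi x)
    prev_active = None
    for _ in range(max_iter):
        z = x + d
        active = np.abs(z) > lam           # active-set estimate from |x + d| > lam
        if prev_active is not None and np.array_equal(active, prev_active):
            break                          # active set unchanged: KKT system satisfied
        prev_active = active
        d_a = lam * np.sign(z[active])     # dual values fixed on the active set
        x = np.zeros(n)
        if active.any():
            Pa = Psi[:, active]
            # restricted normal equations: Pa^T Pa x_A = Pa^T y - lam * sign(z_A)
            # (assumes a small active set with Pa^T Pa invertible)
            x[active] = np.linalg.solve(Pa.T @ Pa, Pa.T @ y - d_a)
        d = Psi.T @ (y - Psi @ x)          # dual update on the inactive set
        d[active] = d_a
    return x, d


def pdasc_l1(Psi, y, lam_min, rho=0.7):
    """Continuation (the 'C' in PDASC): decrease lam geometrically, warm-starting."""
    lam = np.max(np.abs(Psi.T @ y))        # for this lam the solution is x = 0
    x = np.zeros(Psi.shape[1])
    while lam > lam_min:
        lam *= rho
        x, _ = pdas_l1(Psi, y, lam, x0=x)  # warm start from the previous solution
    return x

Under the usual assumption that the active set stays small and the restricted matrix is well conditioned, each inner step reduces to a small linear solve; for the ℓ0 penalty the analogous iteration is typically obtained with hard thresholding (active set {i : |x_i + d_i| > sqrt(2*lam)}) and without the sign term in the restricted solve, again stated here as a sketch rather than as the thesis's algorithm.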

  • 【Online publication contributor】 Wuhan University
  • 【Online publication year and issue】 2014, Issue 09