
Some Researches on Strong Limit Theorems and Their Applications

【Author】 Yang Weiguo

【Supervisor】 Ye Zhongxing

【Author Information】 Shanghai Jiao Tong University, Applied Mathematics, 2007, PhD

【Abstract】 Probability theory is the discipline that studies the statistical laws of large numbers of random phenomena. In mathematical terms, it studies the regularities exhibited in the limit as the number of observations of a random phenomenon tends to infinity. Strong limit theory therefore occupies an important place in probability theory. Since the 1960s, following the full development of the limit theory for sequences of independent random variables, the strong limit theory of various mixing sequences, associated sequences, and martingales has also advanced greatly; Chinese scholars have done much outstanding work in this area and have achieved a degree of international recognition (see [43, 76, 108, 81, 77, 82]). The international literature on strong limit theory is vast: for classical results see the monographs [14, 13, 70, 28, 79], and for recent work see [31, 4, 32, 69, 16, 11]. The entropy theorem of information theory, also known as the Shannon-McMillan theorem or the asymptotic equipartition property (AEP) of an information source, is a fundamental theorem of information theory and the basis of the various coding theorems. For the latest developments on the entropy theorem see [26].

Let {Xn, n ≥ 0} be a sequence of random variables. If

E[f(Xn+1)|X0, …, Xn] = E[f(Xn+1)|Xn] a.s., (1.0.1)

where f is any bounded function, then {Xn, n ≥ 0} is called a Markov chain. If E[f(Xn+1)|Xn] depends on n, then {Xn, n ≥ 0} is called a nonhomogeneous Markov chain; if E[f(Xn+1)|Xn] does not depend on n, it is called a homogeneous Markov chain. If {Xn, n ≥ 0} takes values in a finite or countable state space, it is called a finite or countable Markov chain; if it takes values in a general state space, it is called a Markov chain on a general state space. If

E[f(Xn+1)|X0, …, Xn] = E[f(Xn+1)|Xn, …, Xn−k+1] a.s., (1.0.2)

and this conditional expectation depends on n, then {Xn, n ≥ 0} is called a nonhomogeneous k-th order Markov chain.

A Markov random field is the generalization of a Markov process to multidimensional index sets. Its broad prospective applications have attracted wide interest in physics, probability theory, and information theory. Because Markov random fields exhibit phase transitions, their study is deeper and considerably more difficult. The theory of Markov random fields is one of the important branches of probability theory developed in recent years, and the limit theory of Markov random fields is an important part of it; on strong limit theorems for Markov random fields there are as yet no systematic and deep results. This doctoral thesis advances research in this direction.
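The Markov property and the strong limit behavior discussed above can be illustrated with a small simulation. The following is a minimal sketch, not taken from the thesis: the two-state transition matrix P and the sample size are hypothetical choices for illustration. By the strong law of large numbers for Markov chains, the empirical one-step transition frequencies along a single sample path converge almost surely to the entries of P.

```python
import random

def simulate_chain(P, x0, n, rng):
    """Simulate n steps of a homogeneous Markov chain with transition matrix P."""
    path = [x0]
    for _ in range(n):
        u, x = rng.random(), path[-1]
        cum = 0.0
        for j, pij in enumerate(P[x]):
            cum += pij
            if u < cum:
                path.append(j)
                break
    return path

# Hypothetical 2-state transition matrix (illustrative only).
P = [[0.9, 0.1], [0.2, 0.8]]
path = simulate_chain(P, x0=0, n=100_000, rng=random.Random(1))

# Strong-law estimate of P[0][1]: among steps leaving state 0,
# the fraction that move to state 1.
steps_from_0 = [b for a, b in zip(path, path[1:]) if a == 0]
est = sum(steps_from_0) / len(steps_from_0)
print(f"estimated P(0 -> 1) = {est:.3f}")  # should be close to 0.1
```

The same empirical-frequency argument underlies many of the strong limit theorems for nonhomogeneous chains studied in the thesis, where the transition matrices are allowed to vary with n.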

【Abstract】 Probability theory is a subject which concerns the statistical laws of many kinds of random phenomena, particularly the limit behavior when the number of observations tends to infinity. Hence strong limit theorems form an important part of probability theory. By the 1960s, the limit theory for sequences of independent random variables had been well established. Since then, the limit theorems for mixing sequences and correlated sequences of random variables have been developed greatly. Many Chinese researchers have contributed outstanding work in this field, which has been internationally recognized (cf. [43, 76, 108, 81, 77, 82]). There is a vast number of references on limit theorems in the literature; for the classical results see [14, 13, 70, 28, 79], and some recent references are [31, 4, 32, 69, 16, 11]. The entropy theorem in information theory, which is of central interest in this thesis, is also frequently referred to as the Shannon-McMillan theorem or the asymptotic equipartition property (AEP). It is a fundamental theorem of information theory which lays the foundation of almost all coding theorems. The most recent developments on the entropy theorem can be found in [26].

Let {Xn, n ≥ 0} be a stochastic sequence. If

E[f(Xn+1)|X0, …, Xn] = E[f(Xn+1)|Xn] a.s., (1.0.3)

where f is a bounded function, then {Xn, n ≥ 0} is called a Markov chain. If E[f(Xn+1)|Xn] depends on n, {Xn, n ≥ 0} is called a nonhomogeneous Markov chain; if E[f(Xn+1)|Xn] is independent of n, it is called a homogeneous Markov chain. If {Xn, n ≥ 0} takes values in a finite or countable set, it is called a finite or countable Markov chain; if it takes values in a general state space, it is called a Markov chain on a general state space.
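The AEP mentioned above can be checked empirically in its simplest setting, an i.i.d. binary source. The following is a minimal sketch under illustrative assumptions (the parameter p = 0.3 and the sample size are not from the thesis): it verifies numerically that −(1/n) log p(X1, …, Xn) approaches the source entropy H(p), which is the content of the Shannon-McMillan theorem in the i.i.d. case.

```python
import math
import random

def binary_entropy(p):
    """Entropy H(p) in nats of a Bernoulli(p) source."""
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def empirical_rate(p, n, rng):
    """Compute -(1/n) log p(X1, ..., Xn) for an i.i.d. Bernoulli(p) sample."""
    log_prob = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        log_prob += math.log(p if x == 1 else 1 - p)
    return -log_prob / n

p = 0.3  # illustrative source parameter
h = binary_entropy(p)
rate = empirical_rate(p, n=200_000, rng=random.Random(0))
print(f"H(p) = {h:.4f} nats, empirical rate = {rate:.4f} nats")
```

For dependent sources such as Markov chains and Markov random fields, establishing the analogous almost-sure convergence is precisely the kind of strong limit theorem the thesis investigates.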
