Mathematical Statistics Lecture Videos (38 episodes)
Lecturer: Jun Shao (邵军)
Professor of Statistics, University of Wisconsin-Madison, USA.

Jun Shao was born in 1957. He graduated from the Department of Mathematics at East China Normal University in 1982 and stayed on as a faculty member, then entered the University of Wisconsin-Madison in 1983 to pursue a Ph.D. After receiving his doctorate from UW-Madison in 1987, he taught at Purdue University and the University of Ottawa, and returned to UW-Madison in 1994, where he is now Professor and Chair of the Department of Statistics. He has also served as a Senior Research Fellow at the U.S. national statistics bureau and at Covance, a well-known biopharmaceutical consulting firm (1996-1997, 1998).
In the academic community, Professor Shao served as an Associate Editor of JASA (1993-1996, 1999-2005) and as a Co-editor of the Journal of Multivariate Analysis (2002-2005). He is currently an Associate Editor of Statistica Sinica and a Co-editor of Sankhya. He has also served as President-elect and as a member of the Board of Directors of the International Chinese Statistical Association. Since 1996 he has supervised a number of Ph.D. students in statistics.
Professor Shao has published more than one hundred papers in peer-reviewed English-language journals and has authored several research monographs. Specifically, his main research interests are as follows (a small illustrative sketch of the resampling idea in (1) follows the list):
(1) The jackknife, bootstrap and other resampling methods
(2) Asymptotic theory in linear and nonlinear models
(3) Model selection
(4) Sample surveys (variance estimation, imputation for nonrespondents)
(5) Longitudinal data analysis with missing data/covariates
(6) Medical statistics (bioequivalence, shelf-life estimation, clinical trials)
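To give a concrete sense of topic (1), here is a minimal sketch of nonparametric bootstrap variance estimation for the sample mean. It is not taken from the videos or the books; the data, the statistic, and the number of resamples B are arbitrary illustrative choices.

    import numpy as np

    def bootstrap_variance(data, statistic=np.mean, B=2000, seed=0):
        """Estimate the variance of statistic(data) by resampling data with replacement B times."""
        rng = np.random.default_rng(seed)
        n = len(data)
        # Recompute the statistic on B bootstrap samples drawn with replacement.
        replicates = np.array([statistic(rng.choice(data, size=n, replace=True)) for _ in range(B)])
        # The sample variance of the bootstrap replicates estimates the variance of the statistic.
        return replicates.var(ddof=1)

    # Illustrative data: 50 draws from a standard normal distribution.
    x = np.random.default_rng(1).standard_normal(50)
    print("bootstrap variance of the mean:", bootstrap_variance(x))
    print("classical estimate s^2/n:", x.var(ddof=1) / len(x))

For the sample mean the two numbers should be close, which is one way to sanity-check the resampling step.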
Mathematical Statistics 2nd ed - J. Shao (Springer Texts in Statistics, 2003).pdf

Table of Contents
Preface to the First Edition
Preface to the Second Edition

Chapter 1. Probability Theory
1.1 Probability Spaces and Random Elements
1.1.1 σ-fields and measures
1.1.2 Measurable functions and distributions
1.2 Integration and Differentiation
1.2.1 Integration
1.2.2 Radon-Nikodym derivative
1.3 Distributions and Their Characteristics
1.3.1 Distributions and probability densities
1.3.2 Moments and moment inequalities
1.3.3 Moment generating and characteristic functions
1.4 Conditional Expectations
1.4.1 Conditional expectations
1.4.2 Independence
1.4.3 Conditional distributions
1.4.4 Markov chains and martingales
1.5 Asymptotic Theory
1.5.1 Convergence modes and stochastic orders
1.5.2 Weak convergence
1.5.3 Convergence of transformations
1.5.4 The law of large numbers
1.5.5 The central limit theorem
1.5.6 Edgeworth and Cornish-Fisher expansions
1.6 Exercises
Chapter 2. Fundamentals of Statistics
2.1 Populations, Samples, and Models
2.1.1 Populations and samples
2.1.2 Parametric and nonparametric models
2.1.3 Exponential and location-scale families
2.2 Statistics, Sufficiency, and Completeness
2.2.1 Statistics and their distributions
2.2.2 Sufficiency and minimal sufficiency
2.2.3 Complete statistics
2.3 Statistical Decision Theory
2.3.1 Decision rules, loss functions, and risks
2.3.2 Admissibility and optimality
2.4 Statistical Inference
2.4.1 Point estimators
2.4.2 Hypothesis tests
2.4.3 Confidence sets
2.5 Asymptotic Criteria and Inference
2.5.1 Consistency
2.5.2 Asymptotic bias, variance, and mse
2.5.3 Asymptotic inference
2.6 Exercises
Chapter 3. Unbiased Estimation
3.1 The UMVUE
3.1.1 Sufficient and complete statistics
3.1.2 A necessary and sufficient condition
3.1.3 Information inequality
3.1.4 Asymptotic properties of UMVUE's
3.2 U-Statistics
3.2.1 Some examples
3.2.2 Variances of U-statistics
3.2.3 The projection method
3.3 The LSE in Linear Models
3.3.1 The LSE and estimability
3.3.2 The UMVUE and BLUE
3.3.3 Robustness of LSE's
3.3.4 Asymptotic properties of LSE's
3.4 Unbiased Estimators in Survey Problems
3.4.1 UMVUE's of population totals
3.4.2 Horvitz-Thompson estimators
3.5 Asymptotically Unbiased Estimators
3.5.1 Functions of unbiased estimators
3.5.2 The method of moments
3.5.3 V-statistics
3.5.4 The weighted LSE
3.6 Exercises
Chapter 4. Estimation in Parametric Models
4.1 Bayes Decisions and Estimators
4.1.1 Bayes actions
4.1.2 Empirical and hierarchical Bayes methods
4.1.3 Bayes rules and estimators
4.1.4 Markov chain Monte Carlo
4.2 Invariance
4.2.1 One-parameter location families
4.2.2 One-parameter scale families
4.2.3 General location-scale families
4.3 Minimaxity and Admissibility
4.3.1 Estimators with constant risks
4.3.2 Results in one-parameter exponential families
4.3.3 Simultaneous estimation and shrinkage estimators
4.4 The Method of Maximum Likelihood
4.4.1 The likelihood function and MLE's
4.4.2 MLE's in generalized linear models
4.4.3 Quasi-likelihoods and conditional likelihoods
4.5 Asymptotically Efficient Estimation
4.5.1 Asymptotic optimality
4.5.2 Asymptotic efficiency of MLE's and RLE's
4.5.3 Other asymptotically efficient estimators
4.6 Exercises
Chapter 5. Estimation in Nonparametric Models
5.1 Distribution Estimators
5.1.1 Empirical c.d.f.'s in i.i.d. cases
5.1.2 Empirical likelihoods
5.1.3 Density estimation
5.1.4 Semi-parametric methods
5.2 Statistical Functionals
5.2.1 Differentiability and asymptotic normality
5.2.2 L-, M-, and R-estimators and rank statistics
5.3 Linear Functions of Order Statistics
5.3.1 Sample quantiles
5.3.2 Robustness and efficiency
5.3.3 L-estimators in linear models
5.4 Generalized Estimating Equations
5.4.1 The GEE method and its relationship with others
5.4.2 Consistency of GEE estimators
5.4.3 Asymptotic normality of GEE estimators
5.5 Variance Estimation
5.5.1 The substitution method
5.5.2 The jackknife
5.5.3 The bootstrap
5.6 Exercises
Chapter 6. Hypothesis Tests
6.1 UMP Tests
6.1.1 The Neyman-Pearson lemma
6.1.2 Monotone likelihood ratio
6.1.3 UMP tests for two-sided hypotheses
6.2 UMP Unbiased Tests
6.2.1 Unbiasedness, similarity, and Neyman structure
6.2.2 UMPU tests in exponential families
6.2.3 UMPU tests in normal families
……
Chapter 7. Confidence Sets
References
List of Notation
List of Abbreviations
Index of Definitions, Main Results, and Examples
Author Index
Subject Index

Mathematical Statistics -- Exercises and Solutions (Shao Jun).pdf
Synopsis of Mathematical Statistics: Exercises and Solutions (《数理统计:问题与解答》): This book consists of solutions to 400 exercises, over 95% of which are in my book Mathematical Statistics. Many of them are standard exercises that also appear in other textbooks listed in the references. It is only a partial solution manual to Mathematical Statistics (which contains over 900 exercises). However, the types of exercise in Mathematical Statistics not selected in the current book are (1) exercises that are routine (each exercise selected in this book has a certain degree of difficulty), (2) exercises similar to one or several exercises selected in the current book, and (3) exercises for advanced materials that are often not included in a mathematical statistics course for first-year Ph.D. students in statistics (e.g., Edgeworth expansions and second-order accuracy of confidence sets, empirical likelihoods, statistical functionals, generalized linear models, nonparametric tests, and theory for the bootstrap and jackknife, etc.). On the other hand, this is a stand-alone book, since exercises and solutions are comprehensible independently of their source for likely readers. To help readers not using this book together with Mathematical Statistics, lists of notation, terminology, and some probability distributions are given in the front of the book.

Preface
Notation
Terminology
Some Distributions
Chapter 1. Probability Theory
Chapter 2. Fundamentals of Statistics
Chapter 3. Unbiased Estimation
Chapter 4. Estimation in Parametric Models
Chapter 5. Estimation in Nonparametric Models
Chapter 6. Hypothesis Tests
Chapter 7. Confidence Sets
References
Index