Mathematical Statistics lecture videos (38 lectures)
Lecturer: Jun Shao, Professor of Statistics, University of Wisconsin-Madison.
Jun Shao was born in 1957. He graduated from the Department of Mathematics at East China Normal University in 1982 and stayed on as a faculty member. In 1983 he entered the University of Wisconsin-Madison to pursue a Ph.D., which he completed in 1987. He then taught at Purdue University and the University of Ottawa before returning to UW-Madison in 1994, where he is now Professor of Statistics and chair of the department. In addition, Professor Shao served as a Senior Research Fellow at a national statistical agency in the United States and at the well-known biopharmaceutical consulting company Covance (1996-1997, 1998).
In academia, Professor Shao served as an Associate Editor of JASA (1993-1996, 1999-2005) and as Co-editor of the Journal of Multivariate Analysis (2002-2005). He is currently an Associate Editor of Statistica Sinica and a Co-editor of Sankhya. He has also served as President-elect and as a member of the Board of Directors of the International Chinese Statistical Association, and has supervised many statistics Ph.D. students since 1996.
Professor Shao has published over one hundred papers in peer-reviewed English-language journals and has written several monographs. His main research interests are:
(1) The jackknife, bootstrap, and other resampling methods (a brief sketch follows this list)
(2) Asymptotic theory in linear and nonlinear models
(3) Model selection
(4) Sample surveys (variance estimation, imputation for nonrespondents)
(5) Longitudinal data analysis with missing data/covariates
(6) Medical statistics (bioequivalence, shelf-life estimation, clinical trials)
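For readers new to topic (1), here is a minimal sketch, not taken from the book, of how the jackknife and the bootstrap estimate the variance of a simple statistic (the sample mean). The data, sample size, and number of bootstrap replicates below are purely illustrative assumptions.

```python
# Minimal sketch: jackknife and bootstrap variance estimates for the sample mean.
# The data are simulated here only for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=50)   # hypothetical i.i.d. sample
n = x.size

# Jackknife: recompute the statistic n times, leaving out one observation each time.
loo_means = np.array([np.delete(x, i).mean() for i in range(n)])
v_jack = (n - 1) / n * np.sum((loo_means - loo_means.mean()) ** 2)

# Bootstrap: recompute the statistic on B resamples drawn with replacement.
B = 2000
boot_means = np.array([rng.choice(x, size=n, replace=True).mean() for _ in range(B)])
v_boot = boot_means.var(ddof=1)

print(f"jackknife variance estimate: {v_jack:.4f}")
print(f"bootstrap variance estimate: {v_boot:.4f}")
print(f"usual estimate s^2/n:        {x.var(ddof=1) / n:.4f}")
```

For the sample mean all three estimates agree closely; the appeal of resampling is for statistics whose variance has no simple closed form.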
Mathematical Statistics 2nd ed - J. Shao (Springer Texts in Statistics, 2003).pdf
Table of Contents
Preface to the First Edition
Preface to the Second Edition
Chapter 1. Probability Theory
1.1 Probability Spaces and Random Elements
1.1.1 σ-fields and measures
1.1.2 Measurable functions and distributions
1.2 Integration and Differentiation
1.2.1 Integration
1.2.2 Radon-Nikodym derivative
1.3 Distributions and Their Characteristics
1.3.1 Distributions and probability densities
1.3.2 Moments and moment inequalities
1.3.3 Moment generating and characteristic functions
1.4 Conditional Expectations
1.4.1 Conditional expectations
1.4.2 Independence
1.4.3 Conditional distributions
1.4.4 Markov chains and martingales
1.5 Asymptotic Theory
1.5.1 Convergence modes and stochastic orders
1.5.2 Weak convergence
1.5.3 Convergence of transformations
1.5.4 The law of large numbers
1.5.5 The central limit theorem
1.5.6 Edgeworth and Cornish-Fisher expansions
1.6 Exercises
Chapter 2. Fundamentals of Statistics
2.1 Populations, Samples, and Models
2.1.1 Populations and samples
2.1.2 Parametric and nonparametric models
2.1.3 Exponential and location-scale families
2.2 Statistics, Sufficiency, and Completeness
2.2.1 Statistics and their distributions
2.2.2 Sufficiency and minimal sufficiency
2.2.3 Complete statistics
2.3 Statistical Decision Theory
2.3.1 Decision rules, loss functions, and risks
2.3.2 Admissibility and optimality
2.4 Statistical Inference
2.4.1 Point estimators
2.4.2 Hypothesis tests
2.4.3 Confidence sets
2.5 Asymptotic Criteria and Inference
2.5.1 Consistency
2.5.2 Asymptotic bias, variance, and mse
2.5.3 Asymptotic inference
2.6 Exercises
Chapter 3. Unbiased Estimation
3.1 The UMVUE
3.1.1 Sufficient and complete statistics
3.1.2 A necessary and sufficient condition
3.1.3 Information inequality
3.1.4 Asymptotic properties of UMVUE's
3.2 U-Statistics
3.2.1 Some examples
3.2.2 Variances of U-statistics
3.2.3 The projection method
3.3 The LSE in Linear Models
3.3.1 The LSE and estimability
3.3.2 The UMVUE and BLUE
3.3.3 Robustness of LSE's
3.3.4 Asymptotic properties of LSE's
3.4 Unbiased Estimators in Survey Problems
3.4.1 UMVUE's of population totals
3.4.2 Horvitz-Thompson estimators
3.5 Asymptotically Unbiased Estimators
3.5.1 Functions of unbiased estimators
3.5.2 The method of moments
3.5.3 V-statistics
3.5.4 The weighted LSE
3.6 Exercises
Chapter 4. Estimation in Parametric Models
4.1 Bayes Decisions and Estimators
4.1.1 Bayes actions
4.1.2 Empirical and hierarchical Bayes methods
4.1.3 Bayes rules and estimators
4.1.4 Markov chain Monte Carlo
4.2 Invariance
4.2.1 One-parameter location families
4.2.2 One-parameter scale families
4.2.3 General location-scale families
4.3 Minimaxity and Admissibility
4.3.1 Estimators with constant risks
4.3.2 Results in one-parameter exponential families
4.3.3 Simultaneous estimation and shrinkage estimators
4.4 The Method of Maximum Likelihood
4.4.1 The likelihood function and MLE's
4.4.2 MLE's in generalized linear models
4.4.3 Quasi-likelihoods and conditional likelihoods
4.5 Asymptotically Efficient Estimation
4.5.1 Asymptotic optimality
4.5.2 Asymptotic efficiency of MLE's and RLE's
4.5.3 Other asymptotically efficient estimators
4.6 Exercises
Chapter 5. Estimation in Nonparametric Models
5.1 Distribution Estimators
5.1.1 Empirical c.d.f.'s in i.i.d. cases
5.1.2 Empirical likelihoods
5.1.3 Density estimation
5.1.4 Semi-parametric methods
5.2 Statistical Functionals
5.2.1 Differentiability and asymptotic normality
5.2.2 L-, M-, and R-estimators and rank statistics
5.3 Linear Functions of Order Statistics
5.3.1 Sample quantiles
5.3.2 Robustness and efficiency
5.3.3 L-estimators in linear models
5.4 Generalized Estimating Equations
5.4.1 The GEE method and its relationship with others
5.4.2 Consistency of GEE estimators
5.4.3 Asymptotic normality of GEE estimators
5.5 Variance Estimation
5.5.1 The substitution method
5.5.2 The jackknife
5.5.3 The bootstrap
5.6 Exercises
Chapter 6. Hypothesis Tests
6.1 UMP Tests
6.1.1 The Neyman-Pearson lemma
6.1.2 Monotone likelihood ratio
6.1.3 UMP tests for two-sided hypotheses
6.2 UMP Unbiased Tests
6.2.1 Unbiasedness, similarity, and Neyman structure
6.2.2 UMPU tests in exponential families
6.2.3 UMPU tests in normal families
……
Chapter 7. Confidence Sets
References
List of Notation
List of Abbreviations
Index of Definitions, Main Results, and Examples
Author Index
Subject Index
Mathematical Statistics -- Exercises and Solutions (Shao Jun).pdf
Overview of Mathematical Statistics: Exercises and Solutions: This book consists of solutions to 400 exercises, over 95% of which are in my book Mathematical Statistics. Many of them are standard exercises that also appear in other textbooks listed in the references. It is only a partial solution manual to Mathematical Statistics (which contains over 900 exercises). However, the types of exercise in Mathematical Statistics not selected in the current book are (1) exercises that are routine (each exercise selected in this book has a certain degree of difficulty), (2) exercises similar to one or several exercises selected in the current book, and (3) exercises for advanced materials that are often not included in a mathematical statistics course for first-year Ph.D. students in statistics (e.g., Edgeworth expansions and second-order accuracy of confidence sets, empirical likelihoods, statistical functionals, generalized linear models, nonparametric tests, and theory for the bootstrap and jackknife, etc.). On the other hand, this is a stand-alone book, since exercises and solutions are comprehensible independently of their source for likely readers. To help readers not using this book together with Mathematical Statistics, lists of notation, terminology, and some probability distributions are given in the front of the book.
Preface
Notation
Terminology
Some Distributions
Chapter 1. Probability Theory
Chapter 2. Fundamentals of Statistics
Chapter 3. Unbiased Estimation
Chapter 4. Estimation in Parametric Models
Chapter 5. Estimation in Nonparametric Models
Chapter 6. Hypothesis Tests
Chapter 7. Confidence Sets
References
Index