
Theory of Statistics

James E. Gentle

© 2000–2020 James E. Gentle


Preface: Mathematical Statistics

After teaching mathematical statistics for several years using chalk on a blackboard (and, later, smelly “dry erase markers” on a whiteboard) mostly doing proofs of theorems, I decided to lecture from computer slides that provide an outline of the “big picture”. Rather than spend class time “doing” proofs that are given in standard texts, I decided that time would be better spent discussing the material from a different, higher-level perspective.

While lecturing from canned slides, I cannot, however, ignore certain details of proofs and minutiae of examples. But what details and which minutiae? To be effective, my approach depends on an understanding between students and the instructor, an understanding that is possibly implicit. I lecture; but I ask “what is ... ?” and “why is ... ?”; and I encourage students to ask “what is ... ?” and “why is ... ?”. I adopt the attitude that there are many things that I don’t know, but if there’s something that I wonder about, I’ll admit ignorance and pursue the problem until I’ve attained some resolution. I encourage my students to adopt a similar attitude.

I am completely dissatisfied with a class that sits like stumps on a log when I ask “what is ... ?” or “why is ... ?” during a lecture. What can I say?

After writing class slides (in LaTeX 2ε, of course), mostly in bullet form, I began writing text around the bullets, and I put the notes on the class website. Later I decided that a single document with a fairly extensive subject index (see pages ? through ?) would be useful to serve as a companion for the study of mathematical statistics. The other big deal about this document is the internal links, which of course are not something that can be done with a hardcopy book. (One thing I must warn you about is that there is a (known) bug in the LaTeX package hyperref; if the referenced point happens to occur at the top of a page, the link takes you to the previous page – so if you don’t see what you expect, try going to the next page.)

Much of the present document reflects its origin as classroom notes; it contains incomplete sentences or sentence fragments, and it lacks connective material in some places. (The connective material was (probably!) supplied orally during the lectures.)

Another characteristic of this document that results from the nature of its origin, as well as from the medium itself (electronic), is its length. A long book doesn’t use any more paper or “kill any more trees” than a short book. Usually, however, we can expect the density of “importance” in a short document to be greater than that in a long document. If I had more time I could make this book shorter by prioritizing its content, and I may do that someday.

Several sections are incomplete and several proofs are omitted. Also, I plan to add more examples. I just have not had time to type up the material.

I do not recommend that you print these notes. First of all, they are evolving, so a printed version is just a snapshot. Secondly, the PDF file contains active internal links, so navigation is easy. (For versions without active links, I try to be friendly to the reader by providing page numbers with most internal references.)

This document is directed toward students for whom the theory of statistics is or will become an important part of their lives. Obviously, such students should be able to work through the details of “hard” proofs and derivations; that is, students should master the fundamentals of mathematical statistics. In addition, students at this level should acquire, or begin acquiring, a deep appreciation for the field, including its historical development and its relation to other areas of mathematics and science generally; that is, students should master the fundamentals of the broader theory of statistics. Some of the chapter endnotes are intended to help students gain such an appreciation by leading them to background sources and also by making more subjective statements than might be made in the main body.

It is important to understand the intellectual underpinnings of our science. There are certain principles (such as sufficiency, for example) that guide our approaches to statistical inference. There are various general approaches (see page 239) that we follow. Within these general approaches there are a number of specific methods (see page 240). The student should develop an appreciation for the relations between principles, approaches, and methods.

This book on mathematical statistics assumes a certain amount of background in mathematics. Following the final chapter on mathematical statistics, Chapter 8, there is Chapter 0 on “statistical mathematics” (that is, mathematics with strong relevance to statistical theory) that provides much of the general mathematical background for probability and statistics. The mathematics in this chapter is prerequisite for the main part of the book, and it is hoped that the reader already has command of the material; otherwise, Chapter 0 can be viewed as providing “just in time” mathematics. Chapter 0 grew (and is growing) recursively. Every time I add material, it seems that I need to add some background for the new material. This is obviously a game one cannot win.

Probability theory is the most directly relevant mathematical background, and it is assumed that the reader has a working knowledge of measure-theory-based probability theory. Chapter 1 covers this theory at a fairly rapid pace.

The objective in the discussion of probability theory in Chapter 1, as in that of the other mathematical background, is to provide some of the most relevant material for statistics, which is the real topic of this text. Chapter 2 is also on probability, but the focus is on the applications in statistics. In that chapter, I address some important properties of probability distributions that determine properties of statistical methods when applied to observations from those distributions.

Chapter 3 covers many of the fundamentals of statistics. It provides an overview of the topics that will be addressed in more detail in Chapters 4 through 8.

This document is organized in the order in which I cover the topics (more-or-less!). Material from Chapter 0 may be covered from time to time during the course, but I generally expect this chapter to be used for reference as needed for the statistical topics being covered.

The primary textbooks I have used in the past few years are Shao (2003), Lehmann and Casella (1998), and Lehmann (1986) (the precursor to Lehmann and Romano (2005)). At various places in this document, references are given to the related sections of Shao (2003) (“MS2”), Lehmann and Casella (1998) (“TPE2”), and Lehmann and Romano (2005) (“TSH3”). These texts state all of the important theorems, and in most cases, provide the proofs. They are also replete with examples. Full bibliographic citations for these references, as well as several other resources, are given in the bibliography beginning on page 873.

It is of course expected that the student will read the primary textbook, as well as various other texts, and work through all proofs and examples in the primary textbook. As a practical matter, obviously, even if I attempted to cover all of these in class, there just is not enough class time to do it.

The purpose of this evolving document is not just to repeat all of the material in those other texts. Its purpose, rather, is to provide some additional background material, and to serve as an outline and a handy reference of terms and concepts. The nature of the propositions varies considerably; in some cases, a fairly trivial statement will be followed by a proof, and in other cases, a rather obtuse statement will not be supported by proof. In all cases, the student should understand why the statement is true (or, if it’s not, immediately send me email to let me know of the error!). More importantly, the student should understand why it’s relevant.

Each student should read this and other texts and work through the proofs and examples at a rate that matches the individual student’s understanding of the individual problem. What one student thinks is rather obtuse, another student comprehends quickly, and then the tables are turned when a different problem is encountered. There is a lot of lonely work required, and this is why lectures that just go through the details are often not useful.

It is commonplace for textbooks in mathematics to include examples and exercises without reference to the source of the examples or exercises and yet without implying any claim of originality. (A notable exception is Graham et al. (1994).) My book is not intended to present new and original work, and it follows the time-honored tradition of reusing examples and exercises from long-forgotten sources.

Notation

Adoption of notation is an overhead in communication. I try to minimize that overhead by using notation that is “standard”, and using it locally consistently.

Examples of sloppy notation abound in mathematical statistics. Functions seem particularly susceptible to abusive notation. It is common to see “f(x)” and “f(y)” used in the same sentence to represent two different functions. (These often represent two different PDFs, one for a random variable X and the other for a random variable Y. When I want to talk about two different things, I denote them by different symbols, so when I want to talk about two different PDFs, I often use notation such as “fX(·)” and “fY(·)”. If x = y, which is of course very different from saying X = Y, then fX(x) = fX(y); however, fX(x) ≠ fY(x) in general.)

For a function and a value of a function, there is a certain amount of ambiguity that is almost necessary. I generally try to use notation such as “f(x)” or “Y(ω)” to denote the value of the function f at x or Y at ω, and I use “f”, “f(·)”, or “Y” to denote the function itself (although occasionally, I do use “f(x)” to represent the function — notice the word “try” in this discussion).

If in the notation “f(x)”, “x” denotes an element of a set A, and B ⊆ A (that is, B is a set of the same kinds of elements as A), then “f(B)” does not make much sense. For the image of B under f, I use “f[B]”. I also freely use the notation f⁻¹(y) or f⁻¹[B] to denote the preimage, whether or not f⁻¹ is actually a function; that is, whether or not f is invertible.
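As a small illustration of this image/preimage notation (the particular function here is my own example, not one taken from the text), take f(x) = x² defined on IR. Then

    f[[−1, 2]] = [0, 4],    f⁻¹[[1, 4]] = [−2, −1] ∪ [1, 2],    f⁻¹[{−1}] = ∅;

the preimage notation still makes sense even though f is not invertible (f(−1) = f(1), for instance).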

There are two other areas in which my notation may be slightly different from common notation. First, to designate the open interval between the real numbers a < b, I use the Bourbaki notation “]a, b[”. (I eschew most of the weird Bourbaki notation, however. This particular notation is also used in my favorite book on analysis, Hewitt and Stromberg (1965).) Second, I do not use any special notation, such as boldface, for vectors; thus, x may represent a scalar or a vector.

All vectors are “column vectors”. This is only relevant in vector-vector or vector-matrix operations, and has nothing to do with the way we represent a vector. It is far more convenient to represent the vector x as

    x = (x1, . . . , xd)

than as

        ⎛ x1 ⎞
    x = ⎜ ⋮  ⎟ ,
        ⎝ xd ⎠

and there is certainly no need to use the silly notation

    x = (x1, . . . , xd)ᵀ.

A vector is not a matrix. There are times, however, when a vector may be treated like a matrix in certain operations. In such cases, the vector is treated as a matrix with one column.

Appendix C provides a list of the common notation that I use. The reader is encouraged to look over that list both to see the notation itself and to get some idea of the objects that I discuss.

Solving Problems

The main ingredient for success in a course in mathematical statistics is the ability to work problems. The only way to enhance one’s ability to work problems is to work problems. It is not sufficient to read, to watch, or to hear solutions to problems. One of the most serious mistakes students make in courses in mathematical statistics is to work through a solution that somebody else has done and to think they have worked the problem.

While sometimes it may not be possible to solve a given problem, rather than looking for a solution that someone else has come up with, it is much better to stop with a partial solution or a hint and then sometime later return to the effort of completing the solution. Studying a problem without its solution is much more worthwhile than studying the solution to the problem.

Do you need to see a solution to a problem that you have solved? Except in rare cases, if you have solved a problem, you know whether or not your purported solution is correct. It is like a Sudoku puzzle; although solutions to these are always published in the back of the puzzle book or in a later edition of the medium, I don’t know what these are for. If you have solved the puzzle you know that your solution is correct. If you cannot solve it, I don’t see any value in looking at the solution. It’s not going to make you a better Sudoku solver. (Sudoku is different from crossword puzzles, another of my pastimes. Seeing the solution or partial solution to a crossword puzzle can make you a better crossword solver.) There is an important difference between Sudoku puzzles and mathematical problems. In Sudoku puzzles, there is only one correct solution. In mathematical problems, there may be more than one way to solve a problem, so occasionally it is worthwhile to see someone else’s solution.

The common wisdom (or cliché, depending on your viewpoint) that it takes 10,000 hours to master a field or a skill is probably applicable to statistics. This means working on this stuff for about 40 hours a week for 5 years. This is approximately the amount of work that a student should do for receipt of a PhD degree (preferably in less than 5 years).

Many problems serve as models of “standard operations” for solving other problems. Some problems should become “easy pieces”.

Standard Operations

There are a number of operations and mathematical objects that occur over and over in deriving results or in proving propositions. These operations are sometimes pejoratively called “tricks”. In some sense, perhaps they are; but it is useful to identify these operations outside of the context of a specific application. Some of these standard operations and useful objects are listed in Section 0.0.9 on page 676.

Easy Pieces

I recommend that all students develop a list of easy pieces. These are propositions or examples and counterexamples that the student can state and prove or describe and work through without resort to notes. An easy piece is something that is important in its own right, but also may serve as a model or template for many other problems. A student should attempt to accumulate a large bag of easy pieces. If development of this bag involves some memorization, that is OK, but things should just naturally get into the bag in the process of working problems and observing similarities among problems — and by seeing the same problem over and over.

Some examples of easy pieces are
• State and prove the information inequality (CRLB) for a d-vector parameter. (Get the regularity conditions correct; a common form of the statement is sketched after these lists.)
• Give an example to distinguish the asymptotic bias from the limiting bias.
• State and prove Basu’s theorem.
• Give an example of a function of some parameter in some family of distributions that is not U-estimable.
• A statement of the Neyman-Pearson Lemma (with or without the randomization) and its proof.

Some easy pieces in the background area of “statistical mathematics” are
• Let C be the class of all closed intervals in IR. Show that σ(C) = B(IR) (the real Borel σ-field).
• Define induced measure and prove that it is a measure. That is, prove: If (Ω, F, ν) is a measure space and (Λ, G) is a measurable space, and f is a function from Ω to Λ that is measurable with respect to F/G, then ν ∘ f⁻¹ is a measure with domain G.
• Define the Lebesgue integral for a general Borel function.
• State and prove Fatou’s lemma conditional on a sub-σ-field.

Make your own list of easy pieces.
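As a reminder of the form that the first of these easy pieces takes (this is only a sketch of one common statement, under the Fisher information regularity conditions of Section 2.3.1; getting those conditions and the proof exactly right is the point of the exercise), suppose θ is a d-vector parameter with positive definite Fisher information matrix I(θ), and T(X) is an unbiased estimator of a differentiable real-valued estimand g(θ). Then

    Var(T(X)) ≥ (∂g(θ)/∂θ)ᵀ (I(θ))⁻¹ (∂g(θ)/∂θ),

where ∂g(θ)/∂θ denotes the d-vector of partial derivatives of g with respect to the components of θ.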

Relevance and Boundaries

For any proposition or example, you should have a clear understanding of why the proposition or example is important. Where is it subsequently used? Is it used to prove something else important, or does it justify some statistical procedure?

Propositions and definitions have boundaries; that is, they apply to a specific situation. You should look at the “edges” or “boundaries” of the hypotheses. What would happen if you were to remove one or more assumptions? (This is the idea behind counterexamples.) What would happen if you make stronger assumptions?

“It is clear” and “It can be shown”

I tend to use the phrase “it is clear ...” often. (I only realized this recently, because someone pointed it out to me.) When I say “it is clear ...”, I expect the reader to agree with me actively, not passively.

I use this phrase only when the statement is “clearly” true to me. I must admit, however, sometimes when I read the statement a few weeks later, it’s not very clear! It may require many minutes of difficult reasoning. In any event, the reader should attempt to supply the reasoning for everything that I say is clear.

I also use the phrase “it can be shown ...” in connection with a fact (theorem) whose proof at that point would be distracting, or else whose proof I just don’t want to write out. (In later iterations of this document, however, I may decide to give the proof.) A statement of fact preceded by the phrase “it can be shown” is likely to require more thought or background information than a statement preceded by the phrase “it is clear”, although this may be a matter of judgement.

Study of mathematical statistics at the level appropriate for this document is generally facilitated by reference to a number of texts and journal articles; and I assume that the student does refer to various sources.

My Courses

The courses in mathematical statistics at George Mason University are CSI/STAT 972 and CSI/STAT 973. The prerequisites for these courses include measure-theoretic-based probability theory, such as is covered in CSI/STAT 971. Chapters 0 and 1 address the prerequisite material briefly, and in CSI/STAT 972 some class time is devoted to this material. Although Chapters 1 and 2 are on “probability”, some of their focus is more on what is usually covered in “statistics” courses, such as families of distributions, in particular, the exponential class of families.

My notes on these courses are available at http://mason.gmu.edu/~jgentle/csi9723/

Acknowledgements

A work of this size is bound to have some errors (at least if I have anything to do with it!). Errors must first be detected and then corrected. I would appreciate any feedback – errors, comments, or suggestions. Email me at jgentle@gmu.edu

Fairfax County, Virginia
James E. Gentle
March 20, 2020

Contents

Preface

1 Probability Theory
  1.1 Some Important Probability Definitions and Facts
    1.1.1 Probability and Probability Distributions
    1.1.2 Random Variables
    1.1.3 Definitions and Properties of Expected Values
    1.1.4 Relations among Random Variables
    1.1.5 Entropy
    1.1.6 Fisher Information
    1.1.7 Generating Functions
    1.1.8 Characteristic Functions
    1.1.9 Functionals of the CDF; Distribution “Measures”
    1.1.10 Transformations of Random Variables
    1.1.11 Decomposition of Random Variables
    1.1.12 Order Statistics
  1.2 Series Expansions
    1.2.1 Asymptotic Properties of Functions
    1.2.2 Expansion of the Characteristic Function
    1.2.3 Cumulants and Expected Values
    1.2.4 Edgeworth Expansions in Hermite Polynomials
    1.2.5 The Edgeworth Expansion
  1.3 Sequences of Spaces, Events, and Random Variables
    1.3.1 The Borel-Cantelli Lemmas
    1.3.2 Exchangeability and Independence of Sequences
    1.3.3 Types of Convergence
    1.3.4 Weak Convergence in Distribution
    1.3.5 Expectations of Sequences; Sequences of Expectations
    1.3.6 Convergence of Functions
    1.3.7 Asymptotic Distributions
    1.3.8 Asymptotic Expectation
  1.4 Limit Theorems
    1.4.1 Laws of Large Numbers
    1.4.2 Central Limit Theorems for Independent Sequences
    1.4.3 Extreme Value Distributions
    1.4.4 Other Limiting Distributions
  1.5 Conditional Probability
    1.5.1 Conditional Expectation: Definition and Properties
    1.5.2 Some Properties of Conditional Expectations
    1.5.3 Projections
    1.5.4 Conditional Probability and Probability Distributions
  1.6 Stochastic Processes
    1.6.1 Probability Models for Stochastic Processes
    1.6.2 Continuous Time Processes
    1.6.3 Markov Chains
    1.6.4 Lévy Processes and Brownian Motion
    1.6.5 Brownian Bridges
    1.6.6 Martingales
    1.6.7 Empirical Processes and Limit Theorems
  Notes and Further Reading
  Exercises

2 Distribution Theory and Statistical Models
  2.1 Complete Families
  2.2 Shapes of the Probability Density
  2.3 “Regular” Families
    2.3.1 The Fisher Information Regularity Conditions
    2.3.2 The Le Cam Regularity Conditions
    2.3.3 Quadratic Mean Differentiability
  2.4 The Exponential Class of Families
    2.4.1 The Natural Parameter Space of Exponential Families
    2.4.2 The Natural Exponential Families
    2.4.3 One-Parameter Exponential Families
    2.4.4 Discrete Power Series Exponential Families
    2.4.5 Quadratic Variance Functions
    2.4.6 Full Rank and Curved Exponential Families
    2.4.7 Properties of Exponential Families
  2.5 Parametric-Support Families
  2.6 Transformation Group Families
    2.6.1 Location-Scale Families
    2.6.2 Invariant Parametric Families
  2.7 Infinitely Divisible and Stable Families
  2.8 Families of Distributions with Heavy Tails
  2.9 The Family of Normal Distributions
    2.9.1 Multivariate and Matrix Normal Distribution
    2.9.2 Functions of Normal Random Variables
    2.9.3 Characterizations of the Normal Family of Distributions
  2.10 Generalized Distributions and Mixture Distributions
    2.10.1 Truncated and Censored Distributions
    2.10.2 Mixture Families
    2.10.3 Skewed Distributions
    2.10.4 Flexible Families of Distributions Useful in Modeling
  2.11 Multivariate Distributions
    2.11.1 Marginal Distributions
    2.11.2 Elliptical Families
    2.11.3 Higher Dimensions
  Notes and Further Reading
  Exercises

3 Basic Statistical Theory
  3.1 Inferential Information in Statistics
    3.1.1 Statistical Inference: Point Estimation
    3.1.2 Sufficiency, Ancillarity, Minimality, and Completeness
    3.1.3 Information and the Information Inequality
    3.1.4 “Approximate” Inference
    3.1.5 Statistical Inference in Parametric Families
    3.1.6 Prediction
    3.1.7 Other Issues in Statistical Inference
  3.2 Statistical Inference: Approaches and Methods
    3.2.1 Likelihood
    3.2.2 The Empirical Cumulative Distribution Function
    3.2.3 Fitting Expected Values
    3.2.4 Fitting Probability Distributions
    3.2.5 Estimating Equations
    3.2.6 Summary and Preview
  3.3 The Decision Theory Approach to Statistical Inference
    3.3.1 Decisions, Losses, Risks, and Optimal Actions
    3.3.2 Approaches to Minimizing the Risk
    3.3.3 Admissibility
    3.3.4 Minimaxity
    3.3.5 Summary and Review
  3.4 Invariant and Equivariant Statistical Procedures
    3.4.1 Formulation of the Basic Problem
    3.4.2 Optimal Equivariant Statistical Procedures
  3.5 Probability Statements in Statistical Inference
    3.5.1 Tests of Hypotheses
    3.5.2 Confidence Sets
  3.6 Variance Estimation
    3.6.1 Jackknife Methods
    3.6.2 Bootstrap Methods
    3.6.3 Substitution Methods
  3.7 Applications
    3.7.1 Inference in Linear Models
    3.7.2 Inference in Finite Populations
  3.8 Asymptotic Inference
    3.8.1 Consistency
    3.8.2 Asymp
