
ECONOMETRICS

Bruce E. Hansen
© 2000, 2014
University of Wisconsin
Department of Economics

This Revision: January 3, 2014
Comments Welcome

This manuscript may be printed and reproduced for individual or instructional use, but may not be printed for commercial purposes.

Contents

Preface

1 Introduction
  1.1 What is Econometrics?
  1.2 The Probability Approach to Econometrics
  1.3 Econometric Terms and Notation
  1.4 Observational Data
  1.5 Standard Data Structures
  1.6 Sources for Economic Data
  1.7 Econometric Software
  1.8 Reading the Manuscript
  1.9 Common Symbols

2 Conditional Expectation and Projection
  2.1 Introduction
  2.2 The Distribution of Wages
  2.3 Conditional Expectation
  2.4 Log Differences*
  2.5 Conditional Expectation Function
  2.6 Continuous Variables
  2.7 Law of Iterated Expectations
  2.8 CEF Error
  2.9 Intercept-Only Model
  2.10 Regression Variance
  2.11 Best Predictor
  2.12 Conditional Variance
  2.13 Homoskedasticity and Heteroskedasticity
  2.14 Regression Derivative
  2.15 Linear CEF
  2.16 Linear CEF with Nonlinear Effects
  2.17 Linear CEF with Dummy Variables
  2.18 Best Linear Predictor
  2.19 Linear Predictor Error Variance
  2.20 Regression Coefficients
  2.21 Regression Sub-Vectors
  2.22 Coefficient Decomposition
  2.23 Omitted Variable Bias
  2.24 Best Linear Approximation
  2.25 Normal Regression
  2.26 Regression to the Mean
  2.27 Reverse Regression
  2.28 Limitations of the Best Linear Predictor
  2.29 Random Coefficient Model
  2.30 Causal Effects
  2.31 Expectation: Mathematical Details*
  2.32 Existence and Uniqueness of the Conditional Expectation*
  2.33 Identification*
  2.34 Technical Proofs*
  Exercises

3 The Algebra of Least Squares
  3.1 Introduction
  3.2 Random Samples
  3.3 Sample Means
  3.4 Least Squares Estimator
  3.5 Solving for Least Squares with One Regressor
  3.6 Solving for Least Squares with Multiple Regressors
  3.7 Illustration
  3.8 Least Squares Residuals
  3.9 Model in Matrix Notation
  3.10 Projection Matrix
  3.11 Orthogonal Projection
  3.12 Estimation of Error Variance
  3.13 Analysis of Variance
  3.14 Regression Components
  3.15 Residual Regression
  3.16 Prediction Errors
  3.17 Influential Observations
  3.18 Normal Regression Model
  3.19 CPS Data Set
  3.20 Programming
  3.21 Technical Proofs*
  Exercises

4 Least Squares Regression
  4.1 Introduction
  4.2 Sample Mean
  4.3 Linear Regression Model
  4.4 Mean of Least-Squares Estimator
  4.5 Variance of Least Squares Estimator
  4.6 Gauss-Markov Theorem
  4.7 Residuals
  4.8 Estimation of Error Variance
  4.9 Mean-Square Forecast Error
  4.10 Covariance Matrix Estimation Under Homoskedasticity
  4.11 Covariance Matrix Estimation Under Heteroskedasticity
  4.12 Standard Errors
  4.13 Computation
  4.14 Measures of Fit
  4.15 Empirical Example
  4.16 Multicollinearity
  4.17 Normal Regression Model
  Exercises

5 An Introduction to Large Sample Asymptotics
  5.1 Introduction
  5.2 Asymptotic Limits
  5.3 Convergence in Probability
  5.4 Weak Law of Large Numbers
  5.5 Almost Sure Convergence and the Strong Law*
  5.6 Vector-Valued Moments
  5.7 Convergence in Distribution
  5.8 Higher Moments
  5.9 Functions of Moments
  5.10 Delta Method
  5.11 Stochastic Order Symbols
  5.12 Uniform Stochastic Bounds*
  5.13 Semiparametric Efficiency
  5.14 Technical Proofs*
  Exercises

6 Asymptotic Theory for Least Squares
  6.1 Introduction
  6.2 Consistency of Least-Squares Estimation
  6.3 Asymptotic Normality
  6.4 Joint Distribution
  6.5 Consistency of Error Variance Estimators
  6.6 Homoskedastic Covariance Matrix Estimation
  6.7 Heteroskedastic Covariance Matrix Estimation
  6.8 Summary of Covariance Matrix Notation
  6.9 Alternative Covariance Matrix Estimators*
  6.10 Functions of Parameters
  6.11 Asymptotic Standard Errors
  6.12 t statistic
  6.13 Confidence Intervals
  6.14 Regression Intervals
  6.15 Forecast Intervals
  6.16 Wald Statistic
  6.17 Homoskedastic Wald Statistic
  6.18 Confidence Regions
  6.19 Semiparametric Efficiency in the Projection Model
  6.20 Semiparametric Efficiency in the Homoskedastic Regression Model*
  6.21 Uniformly Consistent Residuals*
  6.22 Asymptotic Leverage*
  Exercises

7 Restricted Estimation
  7.1 Introduction
  7.2 Constrained Least Squares
  7.3 Exclusion Restriction
  7.4 Minimum Distance
  7.5 Asymptotic Distribution
  7.6 Efficient Minimum Distance Estimator
  7.7 Exclusion Restriction Revisited
  7.8 Variance and Standard Error Estimation
  7.9 Misspecification
  7.10 Nonlinear Constraints
  7.11 Inequality Restrictions
  7.12 Constrained MLE
  7.13 Technical Proofs*
  Exercises

8 Hypothesis Testing
  8.1 Hypotheses
  8.2 Acceptance and Rejection
  8.3 Type I Error
  8.4 t tests
  8.5 Type II Error and Power
  8.6 Statistical Significance
  8.7 P-Values
  8.8 t-ratios and the Abuse of Testing
  8.9 Wald Tests
  8.10 Homoskedastic Wald Tests
  8.11 Criterion-Based Tests
  8.12 Minimum Distance Tests
  8.13 Minimum Distance Tests Under Homoskedasticity
  8.14 F Tests
  8.15 Likelihood Ratio Test
  8.16 Problems with Tests of NonLinear Hypotheses
  8.17 Monte Carlo Simulation
  8.18 Confidence Intervals by Test Inversion
  8.19 Power and Test Consistency
  8.20 Asymptotic Local Power
  8.21 Asymptotic Local Power, Vector Case
  8.22 Technical Proofs*
  Exercises

9 Regression Extensions
  9.1 NonLinear Least Squares
  9.2 Generalized Least Squares
  9.3 Testing for Heteroskedasticity
  9.4 Testing for Omitted NonLinearity
  9.5 Least Absolute Deviations
  9.6 Quantile Regression
  Exercises

10 The Bootstrap
  10.1 Definition of the Bootstrap
  10.2 The Empirical Distribution Function
  10.3 Nonparametric Bootstrap
  10.4 Bootstrap Estimation of Bias and Variance
  10.5 Percentile Intervals
  10.6 Percentile-t Equal-Tailed Interval
  10.7 Symmetric Percentile-t Intervals
  10.8 Asymptotic Expansions
  10.9 One-Sided Tests
  10.10 Symmetric Two-Sided Tests
  10.11 Percentile Confidence Intervals
  10.12 Bootstrap Methods for Regression Models
  Exercises

11 NonParametric Regression
  11.1 Introduction
  11.2 Binned Estimator
  11.3 Kernel Regression
  11.4 Local Linear Estimator
  11.5 Nonparametric Residuals and Regression Fit
  11.6 Cross-Validation Bandwidth Selection
  11.7 Asymptotic Distribution
  11.8 Conditional Variance Estimation
  11.9 Standard Errors
  11.10 Multiple Regressors

12 Series Estimation
  12.1 Approximation by Series
  12.2 Splines
  12.3 Partially Linear Model
  12.4 Additively Separable Models
  12.5 Uniform Approximations
  12.6 Runge's Phenomenon
  12.7 Approximating Regression
  12.8 Residuals and Regression Fit
  12.9 Cross-Validation Model Selection
  12.10 Convergence in Mean-Square
  12.11 Uniform Convergence
  12.12 Asymptotic Normality
  12.13 Asymptotic Normality with Undersmoothing
  12.14 Regression Estimation
  12.15 Kernel Versus Series Regression
  12.16 Technical Proofs

13 Generalized Method of Moments
  13.1 Overidentified Linear Model
  13.2 GMM Estimator
  13.3 Distribution of GMM Estimator
  13.4 Estimation of the Efficient Weight Matrix
  13.5 GMM: The General Case
  13.6 Over-Identification Test
  13.7 Hypothesis Testing: The Distance Statistic
  13.8 Conditional Moment Restrictions
  13.9 Bootstrap GMM Inference
  Exercises

14 Empirical Likelihood
  14.1 Non-Parametric Likelihood
  14.2 Asymptotic Distribution of EL Estimator
  14.3 Overidentifying Restrictions
  14.4 Testing
  14.5 Numerical Computation

15 Endogeneity
  15.1 Instrumental Variables
  15.2 Reduced Form
  15.3 Identification
  15.4 Estimation
  15.5 Special Cases: IV and 2SLS
  15.6 Bekker Asymptotics
  15.7 Identification Failure
  Exercises

16 Univariate Time Series
  16.1 Stationarity and Ergodicity
  16.2 Autoregressions
  16.3 Stationarity of AR(1) Process
  16.4 Lag Operator
  16.5 Stationarity of AR(k)
  16.6 Estimation
  16.7 Asymptotic Distribution
  16.8 Bootstrap for Autoregressions
  16.9 Trend Stationarity
  16.10 Testing for Omitted Serial Correlation
  16.11 Model Selection
  16.12 Autoregressive Unit Roots

17 Multivariate Time Series
  17.1 Vector Autoregressions (VARs)
  17.2 Estimation
  17.3 Restricted VARs
  17.4 Single Equation from a VAR
  17.5 Testing for Omitted Serial Correlation
  17.6 Selection of Lag Length in a VAR
  17.7 Granger Causality
  17.8 Cointegration
  17.9 Cointegrated VARs

18 Limited Dependent Variables
  18.1 Binary Choice
  18.2 Count Data
  18.3 Censored Data
  18.4 Sample Selection

19 Panel Data
  19.1 Individual-Effects Model
  19.2 Fixed Effects
  19.3 Dynamic Panel Regression

20 Nonparametric Density Estimation
  20.1 Kernel Density Estimation
  20.2 Asymptotic MSE for Kernel Estimates

A Matrix Algebra
  A.1 Notation
  A.2 Matrix Addition
  A.3 Matrix Multiplication
  A.4 Trace
  A.5 Rank and Inverse
  A.6 Determinant
  A.7 Eigenvalues
  A.8 Positive Definiteness
  A.9 Matrix Calculus
  A.10 Kronecker Products and the Vec Operator
  A.11 Vector and Matrix Norms
  A.12 Matrix Inequalities

B Probability
  B.1 Foundations
  B.2 Random Variables
  B.3 Expectation
  B.4 Gamma Function
  B.5 Common Distributions
  B.6 Multivariate Random Variables
  B.7 Conditional Distributions and Expectation
  B.8 Transformations
  B.9 Normal and Related Distributions
  B.10 Inequalities
  B.11 Maximum Likelihood

C Numerical Optimization
  C.1 Grid Search
  C.2 Gradient Methods
  C.3 Derivative-Free Methods

Preface

This book is intended to serve as the textbook for a first-year graduate course in econometrics. It can be used as a stand-alone text, or be used as a supplement to another text.

Students are assumed to have an understanding of multivariate calculus, probability theory, linear algebra, and mathematical statistics. A prior course in undergraduate econometrics would be helpful, but not required. Two excellent undergraduate textbooks are Wooldridge (2009) and Stock and Watson (2010).

For reference, some of the basic tools of matrix algebra, probability, and statistics are reviewed in the Appendix.

For students wishing to deepen their knowledge of matrix algebra in relation to their study of econometrics, I recommend Matrix Algebra by Abadir and Magnus (2005).

An excellent introduction to probability and statistics is Statistical Inference by Casella and Berger (2002). For those wanting a deeper foundation in probability, I recommend Ash (1972) or Billingsley (1995). For more advanced statistical theory, I recommend Lehmann and Casella (1998), van der Vaart (1998), Shao (2003), and Lehmann and Romano (2005).

For further study in econometrics beyond this text, I recommend Davidson (1994) for asymptotic theory, Hamilton (1994) for time-series methods, Wooldridge (2002) for panel data and discrete response models, and Li and Racine (2007) for nonparametrics and semiparametric econometrics. Beyond these texts, the Handbook of Econometrics series provides advanced summaries of contemporary econometric methods and theory.

The end-of-chapter exercises are important parts of the text and are meant to help teach students of econometrics. Answers are not provided, and this is intentional.

I would like to thank Ying-Ying Lee for providing research assistance in preparing some of the empirical examples presented in the text.

As this is a manuscript in progress, some parts are quite incomplete, and there are many topics which I plan to add. In general, the earlier chapters are the most complete while the later chapters need significant work and revision.

Chapter 1

Introduction

1.1 What is Econometrics?

The term "econometrics" is believed to have been crafted by Ragnar Frisch (1895-1973) of Norway, one of the three principal founders of the Econometric Society, first editor of the journal Econometrica, and co-winner of the first Nobel Memorial Prize in Economic Sciences in 1969. It is therefore fitting that we turn to Frisch's own words in the introduction to the first issue of Econometrica to describe the discipline.

    A word of explanation regarding the term econometrics may be in order. Its definition is implied in the statement of the scope of the [Econometric] Society, in Section I of the Constitution, which reads: "The Econometric Society is an international society for the advancement of economic theory in its relation to statistics and mathematics. Its main object shall be to promote studies that aim at a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems."

    But there are several aspects of the quantitative approach to economics, and no single one of these aspects, taken by itself, should be confounded with econometrics. Thus, econometrics is by no means the same as economic statistics. Nor is it identical with what we call general economic theory, although a considerable portion of this theory has a definitely quantitative character. Nor should econometrics be taken as synonymous with the application of mathematics to economics. Experience has shown that each of these three view-points, that of statistics, economic theory, and mathematics, is a necessary, but not by itself a sufficient, condition for a real understanding of the quantitative relations in modern economic life. It is the unification of all three that is powerful. And it is this unification that constitutes econometrics.

    Ragnar Frisch, Econometrica, (1933), 1, pp. 1-2.

This definition remains valid today, although some terms have evolved somewhat in their usage. Today, we would say that econometrics is the unified study of economic models, mathematical statistics, and economic data.

Within the field of econometrics there are sub-divisions and specializations. Econometric theory concerns the development of tools and methods, and the study of the properties of econometric methods. Applied econometrics is a term describing the development of quantitative economic models and the application of econometric methods to these models using economic data.

1.2 The Probability Approach to Econometrics

The unifying methodology of modern econometrics was articulated by Trygve Haavelmo (1911-1999) of Norway, winner of the 1989 Nobel Memorial Prize in Economic Sciences, in his seminal

paper "The probability approach in econometrics", Econometrica (1944). Haavelmo argued that quantitative economic models must necessarily be probability models (by which today we would mean stochastic). Deterministic models are blatantly inconsistent with observed economic quantities, and it is incoherent to apply deterministic models to non-deterministic data. Economic models should be explicitly designed to incorporate randomness; stochastic errors should not be simply added to deterministic models to make them random. Once we acknowledge that an economic model is a probability model, it follows naturally that an appropriate way to quantify, estimate, and conduct inferences about the economy is through the powerful theory of mathematical statistics. The appropriate method for a quantitative economic analysis follows from the probabilistic construction of the economic model.

Haavelmo's probability approach was quickly embraced by the economics profession. Today no quantitative work in economics shuns its fundamental vision.

While all economists embrace the probability approach, there has been some evolution in its implementation.

The structural approach is the closest to Haavelmo's original idea. A probabilistic economic model is specified, and the quantitative analysis performed under the assumption that the economic model is correctly specified. Researchers often describe this as "taking their model seriously." The structural approach typically leads to likelihood-based analysis, including maximum likelihood and Bayesian estimation.

A criticism of the structural approach is that it is misleading to treat an economic model as correctly specified. Rather, it is more accurate to view a model as a useful abstraction or approximation. In this case, how should we interpret structural econometric analysis? The quasi-structural approach to inference views a structural economic model as an approximation rather than the truth. This theory has led to the concepts of the pseudo-true value (the parameter value defined by the estimation problem), the quasi-likelihood function, quasi-MLE, and quasi-likelihood inference.

Closely related is the semiparametric approach. A probabilistic economic mo
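Haavelmo's point, that randomness belongs inside the model and the statistical tool then follows from the model's probabilistic construction, can be sketched with a small simulation. The sketch below (Python with NumPy) specifies a linear probability model y = b0 + b1*x + e with a genuinely stochastic error, and then applies the estimator that this construction suggests, least squares. The parameter values and sample size are illustrative assumptions, not taken from the text.

```python
# A minimal sketch of the probability approach: the economic relation is
# specified as a probability model, y_i = b0 + b1*x_i + e_i, where the
# error e_i is random by design rather than tacked onto a deterministic
# equation after the fact.  The numbers here are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
b0, b1 = 1.0, 0.5               # "true" parameters of the probability model
x = rng.normal(size=n)
e = rng.normal(size=n)          # stochastic error, built into the model
y = b0 + b1 * x + e

# The statistical tool suggested by this construction: least squares.
X = np.column_stack([np.ones(n), x])
bhat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(bhat)                     # close to (1.0, 0.5) in large samples
```

Because e is random, a different sample would give a different bhat; it is the probability model itself that justifies treating this sampling variation with the tools of mathematical statistics.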

