ECONOMETRICS
Bruce E. Hansen
© 2000, 2011¹
University of Wisconsin
www.ssc.wisc.edu/~bhansen
This Revision: January 13, 2011
Comments Welcome
¹ This manuscript may be printed and reproduced for individual or instructional use, but may not be printed for commercial purposes.
Contents
Preface

1 Introduction
1.1 What is Econometrics?
1.2 The Probability Approach to Econometrics
1.3 Econometric Terms and Notation
1.4 Observational Data
1.5 Standard Data Structures
1.6 Sources for Economic Data
1.7 Econometric Software
1.8 Reading the Manuscript

2 Moment Estimation
2.1 Introduction
2.2 Population and Sample Mean
2.3 Sample Mean is Unbiased
2.4 Variance
2.5 Convergence in Probability
2.6 Weak Law of Large Numbers
2.7 Vector-Valued Moments
2.8 Convergence in Distribution
2.9 Functions of Moments
2.10 Delta Method
2.11 Stochastic Order Symbols
2.12 Uniform Stochastic Bounds*
2.13 Semiparametric Efficiency
2.14 Expectation*
2.15 Technical Proofs*

3 Conditional Expectation and Projection
3.1 Introduction
3.2 The Distribution of Wages
3.3 Conditional Expectation
3.4 Conditional Expectation Function
3.5 Continuous Variables
3.6 Law of Iterated Expectations
3.7 Monotonicity of Conditioning
3.8 CEF Error
3.9 Best Predictor
3.10 Conditional Variance
3.11 Homoskedasticity and Heteroskedasticity
3.12 Regression Derivative
3.13 Linear CEF
3.14 Linear CEF with Nonlinear Effects
3.15 Linear CEF with Dummy Variables
3.16 Best Linear Predictor
3.17 Linear Predictor Error Variance
3.18 Regression Coefficients
3.19 Regression Sub-Vectors
3.20 Coefficient Decomposition
3.21 Omitted Variable Bias
3.22 Best Linear Approximation
3.23 Normal Regression
3.24 Regression to the Mean
3.25 Reverse Regression
3.26 Limitations of the Best Linear Predictor
3.27 Random Coefficient Model
3.28 Causal Effects
3.29 Existence and Uniqueness of the Conditional Expectation*
3.30 Technical Proofs*
Exercises

4 The Algebra of Least Squares
4.1 Introduction
4.2 Least Squares Estimator
4.3 Solving for Least Squares
4.4 Illustration
4.5 Least Squares Residuals
4.6 Model in Matrix Notation
4.7 Projection Matrix
4.8 Orthogonal Projection
4.9 Regression Components
4.10 Residual Regression
4.11 Prediction Errors
4.12 Influential Observations
4.13 Measures of Fit
4.14 Normal Regression Model
Exercises

5 Least Squares Regression
5.1 Introduction
5.2 Mean of Least-Squares Estimator
5.3 Variance of Least Squares Estimator
5.4 Gauss-Markov Theorem
5.5 Residuals
5.6 Estimation of Error Variance
5.7 Covariance Matrix Estimation Under Homoskedasticity
5.8 Covariance Matrix Estimation Under Heteroskedasticity
5.9 Standard Errors
5.10 Multicollinearity
5.11 Normal Regression Model
Exercises

6 Asymptotic Theory for Least Squares
6.1 Introduction
6.2 Consistency of Least-Squares Estimation
6.3 Consistency of Sample Variance Estimators
6.4 Asymptotic Normality
6.5 Joint Distribution
6.6 Uniformly Consistent Residuals*
6.7 Asymptotic Leverage*
6.8 Consistent Covariance Matrix Estimation
6.9 Functions of Parameters
6.10 Asymptotic Standard Errors
6.11 t statistic
6.12 Confidence Intervals
6.13 Regression Intervals
6.14 Quadratic Forms
6.15 Confidence Regions
6.16 Semiparametric Efficiency in the Projection Model
6.17 Semiparametric Efficiency in the Homoskedastic Regression Model*
6.18 Technical Proofs*
Exercises

7 Restricted Estimation
7.1 Introduction
7.2 Constrained Least Squares
7.3 Exclusion Restriction
7.4 Minimum Distance
7.5 Computation
7.6 Asymptotic Distribution
7.7 Efficient Minimum Distance Estimator
7.8 Exclusion Restriction Revisited
7.9 Variance and Standard Error Estimation
7.10 Nonlinear Constraints
7.11 Technical Proofs*
Exercises

8 Testing
8.1 t tests
8.2 t-ratios
8.3 Wald Tests
8.4 Minimum Distance Tests
8.5 F Tests
8.6 Normal Regression Model
8.7 Problems with Tests of Nonlinear Hypotheses
8.8 Monte Carlo Simulation
8.9 Estimating a Wage Equation
Exercises

9 Additional Regression Topics
9.1 Generalized Least Squares
9.2 Testing for Heteroskedasticity
9.3 Forecast Intervals
9.4 Nonlinear Least Squares
9.5 Least Absolute Deviations
9.6 Quantile Regression
9.7 Testing for Omitted Nonlinearity
9.8 Model Selection
Exercises

10 The Bootstrap
10.1 Definition of the Bootstrap
10.2 The Empirical Distribution Function
10.3 Nonparametric Bootstrap
10.4 Bootstrap Estimation of Bias and Variance
10.5 Percentile Intervals
10.6 Percentile-t Equal-Tailed Interval
10.7 Symmetric Percentile-t Intervals
10.8 Asymptotic Expansions
10.9 One-Sided Tests
10.10 Symmetric Two-Sided Tests
10.11 Percentile Confidence Intervals
10.12 Bootstrap Methods for Regression Models
Exercises

11 Generalized Method of Moments
11.1 Overidentified Linear Model
11.2 GMM Estimator
11.3 Distribution of GMM Estimator
11.4 Estimation of the Efficient Weight Matrix
11.5 GMM: The General Case
11.6 Over-Identification Test
11.7 Hypothesis Testing: The Distance Statistic
11.8 Conditional Moment Restrictions
11.9 Bootstrap GMM Inference
Exercises

12 Empirical Likelihood
12.1 Non-Parametric Likelihood
12.2 Asymptotic Distribution of EL Estimator
12.3 Overidentifying Restrictions
12.4 Testing
12.5 Numerical Computation

13 Endogeneity
13.1 Instrumental Variables
13.2 Reduced Form
13.3 Identification
13.4 Estimation
13.5 Special Cases: IV and 2SLS
13.6 Bekker Asymptotics
13.7 Identification Failure
Exercises

14 Univariate Time Series
14.1 Stationarity and Ergodicity
14.2 Autoregressions
14.3 Stationarity of AR(1) Process
14.4 Lag Operator
14.5 Stationarity of AR(k)
14.6 Estimation
14.7 Asymptotic Distribution
14.8 Bootstrap for Autoregressions
14.9 Trend Stationarity
14.10 Testing for Omitted Serial Correlation
14.11 Model Selection
14.12 Autoregressive Unit Roots

15 Multivariate Time Series
15.1 Vector Autoregressions (VARs)
15.2 Estimation
15.3 Restricted VARs
15.4 Single Equation from a VAR
15.5 Testing for Omitted Serial Correlation
15.6 Selection of Lag Length in a VAR
15.7 Granger Causality
15.8 Cointegration
15.9 Cointegrated VARs

16 Limited Dependent Variables
16.1 Binary Choice
16.2 Count Data
16.3 Censored Data
16.4 Sample Selection

17 Panel Data
17.1 Individual-Effects Model
17.2 Fixed Effects
17.3 Dynamic Panel Regression

18 Nonparametrics
18.1 Kernel Density Estimation
18.2 Asymptotic MSE for Kernel Estimates

A Matrix Algebra
A.1 Notation
A.2 Matrix Addition
A.3 Matrix Multiplication
A.4 Trace
A.5 Rank and Inverse
A.6 Determinant
A.7 Eigenvalues
A.8 Positive Definiteness
A.9 Matrix Calculus
A.10 Kronecker Products and the Vec Operator
A.11 Vector and Matrix Norms and Inequalities

B Probability
B.1 Foundations
B.2 Random Variables
B.3 Expectation
B.4 Gamma Function
B.5 Common Distributions
B.6 Multivariate Random Variables
B.7 Conditional Distributions and Expectation
B.8 Transformations
B.9 Normal and Related Distributions
B.10 Inequalities
B.11 Maximum Likelihood

C Numerical Optimization
C.1 Grid Search
C.2 Gradient Methods
C.3 Derivative-Free Methods
Preface
This book is intended to serve as the textbook for a first-year graduate course in econometrics. It can be used as a stand-alone text, or as a supplement to another text.
Students are assumed to have an understanding of multivariate calculus, probability theory, linear algebra, and mathematical statistics. A prior course in undergraduate econometrics would be helpful, but not required.
For reference, some of the basic tools of matrix algebra, probability, and statistics are reviewed in the Appendix.
For students wishing to deepen their knowledge of matrix algebra in relation to their study of econometrics, I recommend Matrix Algebra by Abadir and Magnus (2005).
An excellent introduction to probability and statistics is Statistical Inference by Casella and Berger (2002). For those wanting a deeper foundation in probability, I recommend Ash (1972) or Billingsley (1995). For more advanced statistical theory, I recommend Lehmann and Casella (1998), van der Vaart (1998), Shao (2003), and Lehmann and Romano (2005).
For further study in econometrics beyond this text, I recommend Davidson (1994) for asymptotic theory, Hamilton (1994) for time-series methods, Wooldridge (2002) for panel data and discrete response models, and Li and Racine (2007) for nonparametrics and semiparametric econometrics. Beyond these texts, the Handbook of Econometrics series provides advanced summaries of contemporary econometric methods and theory.
As this is a manuscript in progress, some parts are quite incomplete, in particular the later sections of the manuscript. Hopefully one day these sections will be fleshed out and completed in more detail.
I would like to thank Ying-Ying Lee for providing research assistance in preparing some of the empirical examples presented in the text.
Chapter 1
Introduction
1.1 What is Econometrics?
The term “econometrics” is believed to have been crafted by Ragnar Frisch (1895-1973) of Norway, one of the three principal founders of the Econometric Society, first editor of the journal Econometrica, and co-winner of the first Nobel Memorial Prize in Economic Sciences in 1969. It is therefore fitting that we turn to Frisch’s own words in the introduction to the first issue of Econometrica for an explanation of the discipline.
A word of explanation regarding the term econometrics may be in order. Its definition is implied in the statement of the scope of the [Econometric] Society, in Section I of the Constitution, which reads: “The Econometric Society is an international society for the advancement of economic theory in its relation to statistics and mathematics....
Its main object shall be to promote studies that aim at a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems....”
But there are several aspects of the quantitative approach to economics, and no single one of these aspects, taken by itself, should be confounded with econometrics. Thus, econometrics is by no means the same as economic statistics. Nor is it identical with what we call general economic theory, although a considerable portion of this theory has a definitely quantitative character. Nor should econometrics be taken as synonymous with the application of mathematics to economics. Experience has shown that each of these three view-points, that of statistics, economic theory, and mathematics, is a necessary, but not by itself a sufficient, condition for a real understanding of the quantitative relations in modern economic life. It is the unification of all three that is powerful. And it is this unification that constitutes econometrics.
Ragnar Frisch, Econometrica, (1933), 1, pp. 1-2.
This definition remains valid today, although some terms have evolved somewhat in their usage. Today, we would say that econometrics is the unified study of economic models, mathematical statistics, and economic data.
Within the field of econometrics there are sub-divisions and specializations. Econometric theory concerns the development of tools and methods, and the study of the properties of econometric methods. Applied econometrics is a term describing the development of quantitative economic models and the application of econometric methods to these models using economic data.
1.2 The Probability Approach to Econometrics
The unifying methodology of modern econometrics was articulated by Trygve Haavelmo (1911-1999) of Norway, winner of the 1989 Nobel Memorial Prize in Economic Sciences, in his seminal
paper “The probability approach in econometrics”, Econometrica (1944). Haavelmo argued that quantitative economic models must necessarily be probability models (by which today we would mean stochastic). Deterministic models are blatantly inconsistent with observed economic quantities, and it is incoherent to apply deterministic models to non-deterministic data. Economic models should be explicitly designed to incorporate randomness; stochastic errors should not be simply added to deterministic models to make them random. Once we acknowledge that an economic model is a probability model, it follows naturally that the best way to quantify, estimate, and conduct inferences about the economy is through the powerful theory of mathematical statistics. The appropriate method for a quantitative economic analysis follows from the probabilistic construction of the economic model.
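Haavelmo’s point can be illustrated with a minimal simulation (a hypothetical sketch, not from the text; the coefficients and sample are invented for illustration): observed outcomes never lie exactly on a deterministic rule, so the model itself must carry an explicit stochastic error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "economic model": log wage as a linear function of education.
# The coefficients (1.0 and 0.1) are illustrative, not estimates from real data.
n = 1000
education = rng.integers(8, 21, size=n)      # years of schooling
error = rng.normal(0.0, 0.5, size=n)         # unobserved heterogeneity
log_wage = 1.0 + 0.1 * education + error     # a probability model of wages

# A deterministic version of the same model (error omitted) cannot
# reproduce the observed data: every observation deviates from the line.
deterministic = 1.0 + 0.1 * education
max_gap = np.max(np.abs(log_wage - deterministic))
print(max_gap > 0)
```

The gap between the data and any deterministic rule is not a nuisance to be patched over; in Haavelmo’s view it is part of the model’s specification.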
Haavelmo’s probability approach was quickly embraced by the economics profession. Today no quantitative work in economics shuns its fundamental vision.
While all economists embrace the probability approach, there has been some evolution in its implementation.
The structural approach is the closest to Haavelmo’s original idea. A probabilistic economic model is specified, and the quantitative analysis is performed under the assumption that the economic model is correctly specified. Researchers often describe this as “taking their model seriously.” The structural approach typically leads to likelihood-based analysis, including maximum likelihood and Bayesian estimation.
A criticism of the structural approach is that it is misleading to treat an economic model as correctly specified. Rather, it is more accurate to view a model as a useful abstraction or approximation. In this case, how should we interpret structural econometric analysis? The quasi-structural approach to inference views a structural economic model as an approximation rather than the truth. This theory has led to the concepts of the pseudo-true value (the parameter value defined by the estimation problem), the quasi-likelihood function, the quasi-MLE, and quasi-likelihood inference.
Closely related is the semiparametric approach. A probabilistic economic model is partially specified, but some features are left unspecified. This approach typically leads to estimation methods such as least-squares and the Generalized Method of Moments. The semiparametric approach dominates contemporary econometrics, and is the main focus of this textbook.
Another branch of quantitative structural economics is the calibration approach. Similar to the quasi-structural approach, the calibration approach interprets structural models as approximations and hence inherently false. The difference is that the calibrationist literature rejects mathematical statistics as inappropriate for approximate models, and instead selects parameters by matching model and data moments using non-statistical ad hoc¹ methods.
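The moment-matching step of calibration can be sketched in a few lines (a hypothetical toy example, not a method from the text): pick the model parameter whose implied moment is closest to the corresponding data moment, with no standard errors or formal inference attached. Here the “data” are themselves simulated from an AR(1) with persistence 0.8, and the moment matched is the first-order autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar1(rho, n, rng):
    """Simulate an AR(1) process y_t = rho * y_{t-1} + e_t."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

# "Observed" data: a series whose true persistence is 0.8 (hypothetical).
data = simulate_ar1(0.8, 5000, rng)
data_moment = np.corrcoef(data[:-1], data[1:])[0, 1]  # sample autocorrelation

# Calibration: choose the parameter on a grid whose model-implied moment
# matches the data moment.  For an AR(1) the implied first-order
# autocorrelation equals rho itself, so no simulation is needed here.
grid = np.linspace(0.0, 0.95, 20)
implied = grid
best = grid[np.argmin(np.abs(implied - data_moment))]
print(round(best, 2))
```

The contrast with the statistical approaches above is that nothing here quantifies the sampling uncertainty of `best`; the calibrated value is simply reported.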
1.3 Econometric Terms and Notation
In a typical application, an econometrician has a set of repeated measurements on a set of variables. For example, in a labor application the variables could include weekly earnings, educational attainment, age, and other descriptive characteristics. We call this information the data, dataset, or sample.
We use the term observations to refer to the distinct repeated measurements on the variables. An individual observation often corresponds to a specific economic unit, such as a person, household, corporation, firm, organization, country, state, city or other geographical region. An individual observation could also be a measurement at a point in time, such as quarterly GDP or a daily interest rate.
Economists typically denote variables by the italicized roman characters y, x, and/or z. The convention in econometrics is to use the character y to denote the variable to be explained, while
¹ Ad hoc means “for this purpose”, that is, a method designed for a specific problem and not based on a generalizable principle.