Lecture Slides for Advanced Econometrics 2 (2022 version)
To navigate these slides, press c. To make the fonts bigger, press b. To make the fonts smaller, press s. To see all slides in one page, press a.
Lectures 20 to 28: The generalized method of moments (updated as of 2022-05-06)
- Implementing GMM in R using a specialized package: 2010 tutorial, updated in 2020 as momentfit (see the first sketch after this list)
- Details of the consistency proof for nonlinear GMM (PDF here)
- What happened on May 30: Discussed Quiz 8, wrapped up everything (including what happens to the J-test if you use a suboptimal weighting matrix), discussed some time series applications of IV/GMM estimation (especially extra instruments coming from correct specification and correct dynamic specification), discussed key parts of Angrist and Krueger (1991), previewed what to look forward to (testing rank conditions, weak instruments, stronger assumptions for identification in panel data settings, rethinking asymptotics in panel data settings)
- What happened on May 25: Finished the remaining ideas behind the consistency proof in the nonlinear GMM case, moved on to the asymptotic normality proof and the Hansen-Sargan J-test (end)
- What happened on May 23: Focused on the test of overidentifying restrictions for linear GMM, in particular, finished the underlying intuition along with its many forms, how to form the test statistic, what the asymptotic distribution is under the null, started the key ideas behind the consistency proof in the nonlinear GMM case, what makes the proof different from the linear structural equations case? (end at slide 47)
- What happened on May 18: How to implement different versions of efficient GMM, Quiz 9 related to the exercise on slides 30-31 (also discussed how to approach the exercise), is having more instruments a good thing?, why is it called 2SLS again? (see the two-stage sketch after this list), control function approach to estimating linear structural equations, started the intuition behind the test of overidentifying restrictions for linear GMM (end at slide 40)
- What happened on May 16: Optimal weighting matrices (in what sense?), 2SLS versus GMM estimation (effects on the estimand, consistency, and asymptotic normality), 2SLS is efficient GMM under stronger orthogonality conditions, conditional homoscedasticity, and no serial correlation, modifications to the GMM algorithm, why is 2SLS with robust standard errors used more often than efficient GMM in practice?, how do we interpret structural parameters and the related estimands? (end at slide 34)
- What happened on May 11: Simultaneous equation models, how will the identification argument change when L>K, where could the weighting matrix possibly come from? (end at slide 25)
- What happened on May 9: Feedback on Quiz 7, issues with implementing GMM on a computer, examples of linear structural equations, why are they called structural equations/response schedules (end at slide 18)
- What happened on May 7: How to set up a GMM estimation problem, how to apply the algorithm, OLS in a GMM estimation framework, linear structural equations, Quiz 7 (end at slide 14)
- What happened on Apr 27: Motivation for GMM estimation (end at slide 6)
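For a taste of what the tutorial above covers, here is a minimal sketch of GMM estimation of a linear structural equation with one endogenous regressor and two instruments, using the formula interface of the gmm package (the successor to which is momentfit). All variable names and coefficient values in the simulated design are made up for illustration:

```r
# Minimal sketch: efficient GMM for a linear model with instruments (gmm package)
library(gmm)

set.seed(42)
n  <- 500
z1 <- rnorm(n)
z2 <- rnorm(n)
u  <- rnorm(n)
x  <- z1 + z2 + 0.5 * u + rnorm(n)  # endogenous: x is correlated with u
y  <- 1 + 2 * x + u                 # structural equation of interest

# Moment conditions E[(1, z1, z2)' (y - b0 - b1 x)] = 0: overidentified (3 moments, 2 parameters)
fit <- gmm(y ~ x, ~ z1 + z2)
summary(fit)  # coefficients, robust standard errors, and the J-test
```

The newer momentfit package reworks the same functionality around explicit model objects, but the moment conditions being exploited are the same.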
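Related to the May 18 question of why the estimator is called two-stage least squares, a minimal sketch (same made-up simulated design as in the previous sketch) showing that two hand-run OLS stages reproduce the 2SLS point estimates, and that the control function approach does as well. Note that the hand-run second stage does not produce correct standard errors:

```r
# Minimal sketch: 2SLS as two OLS stages, plus the control function approach
set.seed(42)
n  <- 500
z1 <- rnorm(n)
z2 <- rnorm(n)
u  <- rnorm(n)
x  <- z1 + z2 + 0.5 * u + rnorm(n)  # endogenous regressor
y  <- 1 + 2 * x + u

# Stage 1: project the endogenous regressor on the instruments.
stage1 <- lm(x ~ z1 + z2)
# Stage 2: replace x by its stage-1 fitted values.
stage2 <- lm(y ~ fitted(stage1))
coef(stage2)  # matches the 2SLS point estimates; its reported SEs are wrong

# Control function approach: include the stage-1 residuals as a regressor;
# the coefficient on x again matches 2SLS.
coef(lm(y ~ x + resid(stage1)))
```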
Lectures 15 to 20: Linear regressions applied to time series data
- R implementation of HC and HAC standard errors: tutorial; further software development of the sandwich estimators here (see the first sketch after this list)
- What happened on Apr 27: The CLT for zero-mean ergodic stationary processes (finished all slides)
- What happened on Apr 25: Working out the details for the AR(1) model with AR(1) errors to understand the limitations of Chapter 5 and to motivate Chapters 6 to 8, wrapping up the key ideas from Chapters 3 to 6
- What happened on Apr 20: The effect of imposing MDS-type assumptions on consistency and asymptotic normality, exercises (end at slide 48)
- What happened on Apr 18: Contrasting IID, MDS, and white noise, examining Assumption 5.5 from a theoretical and practical point of view (end at slide 46)
- What happened on Apr 13: Conceptual understanding of stationarity and ergodicity, why showing stationarity and ergodicity is hard, which of the examples are stationary and ergodic, martingales and martingale difference sequences (MDS) (end at slide 43)
- What happened on Apr 11: The need for stochastic processes, examples and what concepts may be needed for asymptotics to work, code for visualizing examples of stochastic processes (end at slide 38)
- Pre-recorded videos at SPOC: How do we modify the theory to justify linear regression for time series data, IID as a special case, in what cases OLS will and will not work, Monte Carlo simulation to evaluate the performance of OLS in AR(1) models (see the second sketch after this list), intuitive idea behind ergodicity and stationarity, the simplest ergodic theorem, IID vs martingale difference sequences (MDS), the form of the asymptotic variance in the time series case, the structure of the proofs for consistency and asymptotic normality, why it is important to consistently estimate the asymptotic covariance matrix and how to actually do it (slides 1 to 35)
- Comparison of assumptions across Chapters 3 to 6 (PDF)
- Writing down the proofs for consistency, asymptotic normality, and consistent covariance matrix estimation (PDF)
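As a companion to the HC/HAC tutorial linked above, a minimal sketch of how the sandwich and lmtest packages are typically combined. The simulated heteroskedastic design is made up for illustration; for HAC standard errors one would feed actual time series data rather than this IID draw:

```r
# Minimal sketch: classical, HC, and HAC standard errors for an OLS fit
library(sandwich)
library(lmtest)

set.seed(42)
n <- 200
x <- rnorm(n)
y <- 1 + 2 * x + (1 + abs(x)) * rnorm(n)  # heteroskedastic errors

fit <- lm(y ~ x)
coeftest(fit)                                    # classical (iid) SEs from lm()
coeftest(fit, vcov = vcovHC(fit, type = "HC1"))  # heteroskedasticity-consistent
coeftest(fit, vcov = NeweyWest(fit))             # HAC (Newey-West)
```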
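A minimal sketch of the kind of Monte Carlo used in the pre-recorded videos to evaluate OLS in AR(1) models; the AR coefficient, sample sizes, and number of replications below are made up, but they illustrate a known fact: OLS applied to an AR(1) is consistent yet biased downward in short samples, with the bias shrinking as the sample grows:

```r
# Minimal sketch: finite-sample bias of OLS in an AR(1) model
set.seed(42)
rho  <- 0.8   # illustrative value; the videos may use a different design
nrep <- 1000

ols_ar1 <- function(n) {
  y <- as.numeric(arima.sim(model = list(ar = rho), n = n))
  coef(lm(y[-1] ~ y[-n] - 1))  # OLS of y_t on y_(t-1), no intercept
}

est50  <- replicate(nrep, ols_ar1(50))
est500 <- replicate(nrep, ols_ar1(500))
c(bias_n50 = mean(est50) - rho, bias_n500 = mean(est500) - rho)
```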
Lectures 8 to 15: Least squares algebra and finite-sample theory
- What happened on Apr 11: Wrapping up Practice Exercise Item 8, big picture of Chapters 2 to 6 (comparing assumptions, what can be learned from OLS)
- After doing Exercises 3.8-3.9, you can now read more on multicollinearity.
- What happened on Apr 2 and Apr 6: Discussing Quiz 5 (how to write a complete solution), exam strategies, more practice exercises, connections to large-sample properties
- What happened on Mar 30: Q&A, clarifying misunderstandings, more exercises, Quiz 5
- What happened on Mar 28: Quiz 4, Q&A, connections to large-sample properties, practiced some exercises
- Pre-recorded videos at SPOC: Finite-sample properties of the OLS estimator, simplified and sandwich covariance matrices, orthogonal transformations and where the degrees-of-freedom adjustment comes from, reference results for the multivariate normal distribution (PDF) (slides 30 to 51)
- What happened on Mar 23 (lots of network difficulties here, some delays): Quiz 3, measures of fit, FWL, multicollinearity, restricted least squares algebra (end at slide 29)
- What happened on Mar 21: Quiz 2, simplification of the asymptotic variance of the slope under correct specification and constant conditional variance, repeat: estimand vs estimate vs estimator (picture from @WomenInStat), setup and some details about MRW (1992), why we need econometrics for MRW (1992), MRW (1992) satisfies Assumptions 3.1 and 3.2, Assumption 3.3 is only needed for uniqueness of the OLS estimator, the most crucial least squares algebra (skipped slides 7 to 13, but available as a video at SPOC; start again at slide 2, end at slide 18)
- What happened on Mar 16: Preview of material, gains from correct specification (end at slide 2, parts of slides 14 and 15)
Lectures 6 to 8: Conditional prediction
- What happened on Mar 16: Example of a correctly specified linear regression model, some exercises on recognizing linear regression models and the assumptions behind them, extension to the general case, the meaning of the words "linear" and "regression", short and long regressions (finished slides, regression anatomy slides discussed in uploaded video at SPOC)
- What happened on Mar 14: Law of total variance, loss from prediction and its sources, distinction between CEF and best linear predictor and its implications (purpose, extrapolation, interpretation, estimation), correct specification and its role in unbiasedness, Quiz 1 (end at slide 24)
- What happened on Mar 9: Finished the remaining points left over from Mar 7 (why the need for a joint asymptotic distribution, Q-Q plots), motivation for conditional expectations (flexibility, determining whether the least squares slope is unbiased), alternative interpretation of conditional expectation as best predictor (along with the proof), understanding and applying the law of iterated expectations (end at slide 7, some parts of slides 23 and 24)
Lectures 1 to 5: Linear prediction and least squares fit
- What happened on Mar 7: Finished establishing asymptotic normality (applying asymptotic tools correctly), the need to consistently estimate asymptotic variances, the sandwich/robust form of the asymptotic variance, Monte Carlo simulation of I SEE THE MOUSE (what is happening to the standard errors produced by lm() in R? see the sketch after this list) (finished slides)
- What happened on Mar 2: Review of asymptotic ideas (related to Chapter 4 of main textbook), establishing consistency, what does the least squares fit recover in large samples, start to establish asymptotic normality (end at slide 37)
- What happened on Feb 28: Exercises, interpretation of coefficients of best linear predictor, least squares fit, Monte Carlo simulation of I SEE THE MOUSE (end at slide 34)
- What happened on Feb 23: Consequences of using expected values for prediction, I SEE THE MOUSE, linear prediction rules, common structure of optimization problems in prediction, the linear regression model from a different starting point (end at slide 24)
- What happened on Feb 21: Course setup and details, review of expected values, alternative interpretations of expected value (end at slide 14)
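A minimal stand-in sketch of the kind of Monte Carlo run on Mar 7. The heteroskedastic design below is made up and is not the actual I SEE THE MOUSE data-generating process, but it shows the phenomenon in question: the default standard errors from lm() assume homoskedasticity and can miss the true sampling variability of the slope:

```r
# Minimal sketch: comparing lm() standard errors with the Monte Carlo truth
set.seed(42)
nrep <- 2000
n    <- 100

one_draw <- function() {
  x <- rnorm(n)
  y <- 1 + 2 * x + (1 + abs(x)) * rnorm(n)  # error variance depends on x
  fit <- lm(y ~ x)
  c(slope = unname(coef(fit)[2]),
    se    = summary(fit)$coefficients[2, 2])
}

sims <- replicate(nrep, one_draw())
sd(sims["slope", ])  # Monte Carlo sd of the slope estimates: the target
mean(sims["se", ])   # average SE reported by lm(): off under heteroskedasticity
```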
Empirical applications: Papers and datasets
Lecture Slides for Advanced Econometrics 2 (2022 version) by Andrew Adrian Pua is licensed under Attribution-ShareAlike 4.0 International