Subject
Causal Inference for the Social Sciences
General details of the subject
- Mode
- Face-to-face degree course
- Language
- English
Description and contextualization of the subject
Causal Inference for the Social Sciences covers methods for establishing causal relationships between a treatment, policy, or intervention and an outcome variable, using both experimental and observational data. A particularly important application of causal inference is the evaluation of public programs and policies. The methods covered in this course are also known as econometric policy evaluation, program evaluation, or counterfactual impact evaluation. They allow the researcher to determine, in a quantitatively sound manner, whether a policy or program has its intended effect.
Teaching staff
Name | Institution | Category | Doctor | Teaching profile | Area | Email |
---|---|---|---|---|---|---|
GARDEAZABAL MATIAS, FRANCISCO JAVIER | University of the Basque Country | Full Professor (Catedrático de Universidad) | Doctor | Not bilingual | Fundamentals of Economic Analysis | javier.gardeazabal@ehu.eus |
Competencies
Name | Weight |
---|---|
Understand the role that randomized and natural experiments play within the scientific method | 20.0 % |
Understand and know how to use the different techniques for establishing cause-effect relationships in natural or randomized experiments | 40.0 % |
Know how to use causal inference methods for the evaluation of public programs and policies | 40.0 % |
Study types
Type | Face-to-face hours | Non face-to-face hours | Total hours |
---|---|---|---|
Lecture-based | 24 | 36 | 60 |
Applied computer-based groups | 16 | 24 | 40 |
Training activities
Name | Hours | Percentage of classroom teaching |
---|---|---|
Exercises | 8.0 | 100 % |
Expositive classes | 16.0 | 100 % |
Reading and practical analysis | 60.0 | 0 % |
Tutorials | 16.0 | 100 % |
Assessment systems
Name | Minimum weighting | Maximum weighting |
---|---|---|
Practical tasks | 20.0 % | 40.0 % |
Written examination | 60.0 % | 80.0 % |
Ordinary call: orientations and renunciation
The final grade of the course will be a weighted average of the final exam and the homework assignments. Should it be unfeasible to hold the final exam at the school, an alternative online assessment procedure will be implemented.
Extraordinary call: orientations and renunciation
The final grade of the course will be a weighted average of the final exam and the homework assignments. Should it be unfeasible to hold the final exam at the school, an alternative online assessment procedure will be implemented.
Syllabus
1. The scientific method:
An outline of the scientific method. Sampling methods. External and internal validity. Construct validity. Reliability. Levels of measurement. Research design. Types of experiments.
2. Randomized experiments:
Subjects. Treatments. Outcomes. Potential Outcomes. Treatment effects. Random assignment. Estimation. Testing. Regression interpretation. Examples.
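As an illustration of the estimation step in this topic, a minimal simulated sketch (hypothetical data; the effect size and sample size are assumptions chosen for illustration, not course materials):

```python
# Simulated randomized experiment: with random assignment, the
# difference-in-means estimator is unbiased for the average treatment effect.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
d = rng.integers(0, 2, n)        # random assignment to treatment
y0 = rng.normal(0.0, 1.0, n)     # potential outcome under control
y1 = y0 + 2.0                    # potential outcome under treatment (true ATE = 2)
y = np.where(d == 1, y1, y0)     # only one potential outcome is observed per subject

ate_hat = y[d == 1].mean() - y[d == 0].mean()
```

Because assignment is independent of the potential outcomes, the treated and control groups are comparable and no further adjustment is needed.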
3. Regression methods:
Non-random assignment. Selection bias. Conditional Independence. Regression formulation. Propensity score. Estimation and testing. Examples.
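A minimal simulated sketch of regression adjustment under conditional independence (hypothetical data; the coefficients are assumptions chosen for illustration):

```python
# Simulated selection on observables: the raw comparison of means is biased
# because the confounder x drives both treatment take-up and the outcome;
# OLS that controls for x recovers the effect (conditional independence).
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
x = rng.normal(0.0, 1.0, n)
d = rng.binomial(1, 1.0 / (1.0 + np.exp(-2.0 * x)))  # treatment more likely when x is high
y = 1.0 * d + 2.0 * x + rng.normal(0.0, 1.0, n)      # true effect = 1

naive = y[d == 1].mean() - y[d == 0].mean()          # biased upward by selection

# OLS of y on [1, d, x]: the coefficient on d is the regression-adjusted effect
X = np.column_stack([np.ones(n), d, x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
adjusted = beta[1]
```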
4. Matching methods:
Matching at the cell level. Common support. Matching on the score. Nearest neighbor matching. Combining matching and regression. Examples.
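A minimal simulated sketch of one-to-one nearest-neighbor matching on a single covariate (hypothetical data; a homogeneous effect is assumed for illustration):

```python
# Nearest-neighbor matching: each treated unit is paired with the control
# whose covariate x is closest, and the effect on the treated is the mean
# of treated-minus-matched-control outcome differences.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
x = rng.normal(0.0, 1.0, n)
d = rng.binomial(1, 1.0 / (1.0 + np.exp(-x)))        # selection on x
y = 1.5 * d + 0.5 * x + rng.normal(0.0, 1.0, n)      # true effect = 1.5

x_t, y_t = x[d == 1], y[d == 1]
x_c, y_c = x[d == 0], y[d == 0]

# Sort controls by x and, for each treated unit, pick the closer of the two
# controls bracketing its x value
order = np.argsort(x_c)
idx = np.clip(np.searchsorted(x_c[order], x_t), 1, len(order) - 1)
nearest = np.where(
    np.abs(x_c[order][idx] - x_t) < np.abs(x_c[order][idx - 1] - x_t),
    idx, idx - 1,
)
att_hat = np.mean(y_t - y_c[order][nearest])
```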
5. Inverse Probability Weighting:
Missing data analog. Treatment effects as weighted means. Estimation. Combining inverse probability weighting and regression. Examples.
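A minimal simulated sketch of the treatment effect as a difference of inverse-probability-weighted means (hypothetical data; the true propensity score is used in place of an estimated one to keep the sketch short):

```python
# Inverse probability weighting: reweighting observed outcomes by the
# propensity score undoes non-random selection into treatment.
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
x = rng.normal(0.0, 1.0, n)
p = 1.0 / (1.0 + np.exp(-x))                    # propensity score P(D=1 | x)
d = rng.binomial(1, p)
y = 1.0 * d + 0.5 * x + rng.normal(0.0, 1.0, n) # true ATE = 1

naive = y[d == 1].mean() - y[d == 0].mean()     # biased: x drives both d and y

# ATE as a difference of weighted means (Horvitz-Thompson form)
ipw = np.mean(d * y / p) - np.mean((1 - d) * y / (1 - p))
```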
6. Regression discontinuity design:
Treatment under discontinuity. Treatment effect at the margin. Local regression. Sharp and fuzzy regression discontinuity designs. Estimation. Examples.
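A minimal simulated sketch of a sharp design (hypothetical data; the cutoff, bandwidth, and jump size are assumptions chosen for illustration):

```python
# Sharp regression discontinuity: treatment switches on when the running
# variable crosses the cutoff; the effect at the cutoff is the gap between
# local linear fits estimated on each side.
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
run = rng.uniform(-1.0, 1.0, n)                       # running variable, cutoff at 0
d = (run >= 0).astype(float)                          # sharp assignment rule
y = 0.5 * run + 3.0 * d + rng.normal(0.0, 1.0, n)     # true jump at cutoff = 3

h = 0.2                                               # bandwidth around the cutoff
left = (run < 0) & (run > -h)
right = (run >= 0) & (run < h)

# Fit a line on each side and compare the intercepts at the cutoff
b_left = np.polyfit(run[left], y[left], 1)
b_right = np.polyfit(run[right], y[right], 1)
rd_hat = np.polyval(b_right, 0.0) - np.polyval(b_left, 0.0)
```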
7. Instrumental Variables:
Endogenous treatment status. Instrumental variables: relevance and exclusion restrictions. IV estimation. Binary instruments. Local average treatment effects. Examples.
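A minimal simulated sketch of the binary-instrument (Wald) estimator (hypothetical data; a homogeneous effect is assumed, so the local average treatment effect coincides with the common effect):

```python
# Binary instrument: the Wald estimator is the reduced-form effect of the
# instrument on the outcome divided by its first-stage effect on take-up.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
z = rng.integers(0, 2, n)                    # randomly assigned binary instrument
u = rng.normal(0.0, 1.0, n)                  # unobserved confounder
# Endogenous take-up: more likely when encouraged (z=1) and when u is high
d = (1.0 * z + 0.3 * u + rng.normal(0.0, 1.0, n) > 0.5).astype(float)
y = 2.0 * d + u + rng.normal(0.0, 1.0, n)    # true effect = 2

reduced_form = y[z == 1].mean() - y[z == 0].mean()
first_stage = d[z == 1].mean() - d[z == 0].mean()
wald = reduced_form / first_stage
```

The instrument must be relevant (a nonzero first stage) and excluded (it affects y only through d); both hold by construction here.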
8. Difference-in-differences:
Regression interpretation. Pre-versus post-treatment differences. Treatment versus control differences. Difference-in-differences. Parallel trends. Examples.
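A minimal simulated two-period, two-group sketch (hypothetical data; group means and the common trend are assumptions chosen for illustration):

```python
# Difference-in-differences: under parallel trends, the treated group's
# before-after change minus the control group's before-after change
# recovers the treatment effect, even though group levels differ.
import numpy as np

rng = np.random.default_rng(6)
n = 5_000
# Controls trend from 10 to 12; treated share the +2 trend and add an effect of 2
ctrl_pre = 10.0 + rng.normal(0.0, 1.0, n)
ctrl_post = 12.0 + rng.normal(0.0, 1.0, n)
trt_pre = 14.0 + rng.normal(0.0, 1.0, n)
trt_post = 16.0 + 2.0 + rng.normal(0.0, 1.0, n)

did = (trt_post.mean() - trt_pre.mean()) - (ctrl_post.mean() - ctrl_pre.mean())
```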
9. Panel data methods:
Fixed effects. First differences. Difference-in-differences interpretation. Treatment histories. Propensity score weighting. Dynamic treatment effects. Examples.
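A minimal simulated two-period panel sketch (hypothetical data; treatment occurring only in the second period is an assumption for illustration):

```python
# First differencing: a time-invariant unit fixed effect a_i drops out of
# the differenced data, so the effect is identified even though a_i is
# correlated with selection into treatment.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
a = rng.normal(0.0, 1.0, n)                          # unit fixed effect
d = (a + rng.normal(0.0, 1.0, n) > 0).astype(float)  # high-a units select into treatment
y1 = a + rng.normal(0.0, 1.0, n)                     # period 1: nobody treated
y2 = a + 1.0 + 1.5 * d + rng.normal(0.0, 1.0, n)     # period 2: common time effect +1, treatment effect 1.5

dy = y2 - y1                                          # a differences out
fe_hat = dy[d == 1].mean() - dy[d == 0].mean()
```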
10. Comparative case studies:
Case studies and comparative case studies. The synthetic control method. Placebo analysis and inference. Examples.
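A toy sketch of the synthetic control idea with one treated unit and two controls (hypothetical data; real applications use many donor units and an optimizer rather than a grid):

```python
# Synthetic control: choose simplex weights on the donor units to match
# the treated unit's pre-treatment path; the post-treatment gap between
# the treated unit and its synthetic counterpart estimates the effect.
import numpy as np

rng = np.random.default_rng(8)
t_pre, t_post = 20, 5
trend = np.linspace(0.0, 1.0, t_pre + t_post)
ctrl_a = 1.0 + 2.0 * trend + rng.normal(0.0, 0.05, t_pre + t_post)
ctrl_b = 3.0 - 1.0 * trend + rng.normal(0.0, 0.05, t_pre + t_post)
# Treated unit is a 60/40 mix of the controls, plus an effect of 1.0 after t_pre
treated = 0.6 * ctrl_a + 0.4 * ctrl_b
treated[t_pre:] += 1.0

# Grid search over simplex weights (w, 1 - w) on the pre-period fit
grid = np.linspace(0.0, 1.0, 1001)
mse = [np.mean((treated[:t_pre] - (w * ctrl_a[:t_pre] + (1 - w) * ctrl_b[:t_pre])) ** 2)
       for w in grid]
w_star = grid[int(np.argmin(mse))]
synth = w_star * ctrl_a + (1 - w_star) * ctrl_b
effect_hat = np.mean(treated[t_pre:] - synth[t_pre:])
```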
Bibliography
Compulsory materials
- Angrist, J. D. and J. S. Pischke, 2009. Mostly Harmless Econometrics: An Empiricist's Companion. Princeton University Press.
- Wooldridge, J. M. 2010. Econometric Analysis of Cross Section and Panel Data. Cambridge, MA: MIT Press. Chapter 18.
Basic bibliography
Surveys on methods:
• Imbens, Guido W. and Jeffrey M. Wooldridge, (2009). “Recent developments in the econometrics of program evaluation”. Journal of Economic Literature 47(1), 5-86.
• Abadie, A and M. Cattaneo, (2018). “Econometric Methods for Program Evaluation” Working Paper, March 2018.
References
Articles (most of these are available from the university premises or using VPN)
• Abadie, Alberto and Guido Imbens (2011) “Bias-Corrected Matching Estimators for Average Treatment Effects” Journal of Business and Economic Statistics, 29(1).
• Abadie et al. (2004) “Implementing matching estimators for average treatment effects in Stata” The Stata Journal 4(3), 290–311.
• Abadie, Alberto, (2005). “Semiparametric Difference-in-Differences Estimators.” The Review of Economic Studies 72(1), 1–19.
• Abadie, Alberto, (forthcoming) “Using Synthetic Controls: Feasibility, Data Requirements, and Methodological Aspects” Journal of Economic Literature.
• Abadie, Alberto & Diamond, Alexis & Hainmueller, Jens, (2010). “Synthetic Control Methods for Comparative Case Studies: Estimating the Effect of California’s Tobacco Control Program,” Journal of the American Statistical Association 105(490), 493-505.
• Abadie, A., A. Diamond and J. Hainmueller, (2015). “Comparative Politics and the Synthetic Control Method” American Journal of Political Science 59(2), 495–510.
• Abadie, A. and J. Gardeazabal, (2003). “The Economic Costs of Conflict: A Case Study of the Basque Country,” American Economic Review 93, 113-132.
• Angrist, J. (1990). “Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records,” American Economic Review 80, 313-336.
• Angrist, J. and A. Krueger (1991) “Does Compulsory School Attendance Affect Schooling and Earnings?” Quarterly Journal of Economics 106-4 979–1014.
• Battistin, E., A. Brugiavini, E. Rettore and G. Weber, (2009), “The Retirement Consumption Puzzle: Evidence from a Regression Discontinuity Approach,” American Economic Review 99, 2209-26.
• Brand, J. E. and Halaby, C. N., (2006). “Regression and matching estimates of the effects of elite college attendance on educational and career achievement,” Journal Social Science Research 35, 749-770.
• Card, D. and A. B. Krueger (1994). “Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania,” American Economic Review 84, 772-793.
• Dehejia, R. and S. Wahba (1999) “Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs,” Journal of the American Statistical Association, 1053–1062.
• Lalonde, R. (1986), “Evaluating the Econometric Evaluations of Training Programs,” American Economic Review 76, 604-620.
• Krueger, A. B. (1999). “Experimental estimates of education production functions,” Quarterly Journal of Economics 114, 497–532.
• Lee, D. (2008) “Randomized Experiments from Non-Random Selection in U.S. House Elections,” Journal of Econometrics 142, 675–697.
• Ludwig, J. and D. L. Miller, (2007) “Does Head Start Improve Children’s Life Chances? Evidence from a Regression Discontinuity Design” The Quarterly Journal of Economics 122(1), 159-208.
• Lalive, R. (2008). “How do extended benefits affect unemployment duration? A regression discontinuity approach,” Journal of Econometrics 142, 785-806.
• Robins, James, Hernán, Miguel Ángel, Brumback, Babette (2000). “Marginal Structural Models and Causal Inference in Epidemiology,” Epidemiology 11(5), 550-560.
Links
Professor William M.K. Trochim, Cornell University: http://www.socialresearchmethods.net/kb/index.php
Joshua Angrist and Victor Chernozhukov, Massachusetts Institute of Technology: http://ocw.mit.edu/courses/economics/14-387-applied-econometrics-mostly-harmless-big-data-fall-2014/index.htm