Instrumental variable detection with graph learning: an application to high-dimensional GIS-census data for house pricing

Endogeneity bias and instrumental variable validation have long been important topics in statistics and econometrics. In the era of big data, these issues are typically compounded by high dimensionality and therefore demand even more attention. In this paper, we merge two well-known tools from machine learning and biostatistics, variable selection algorithms and probabilistic graphs, to estimate house prices and the corresponding causal structure using 2010 data on Sydney. The estimation draws on a 200-gigabyte, ultrahigh-dimensional database consisting of local school data, GIS information, census data, house characteristics and other socio-economic records. Using these "big data", we show that data-driven instrument selection can be performed efficiently and that invalid instruments can be purged. Our approach improves the sparsity, stability and robustness of variable selection in the presence of high dimensionality, complicated causal structures and the resulting multicollinearity, and it recovers a sparse and intuitive causal structure. It also proves efficient and effective at endogeneity detection, instrument validation, weak-instrument pruning and the selection of valid instruments. From a machine learning perspective, the estimation results align with and confirm the facts of the Sydney housing market, classical economic theories and previous findings from simultaneous equations modeling. Moreover, the results are consistent with and supported by classical econometric tools such as two-stage least squares regression and various instrument tests. All the code may be found at \url{https://github.com/isaac2math/solar_graph_learning}.
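To make the pipeline concrete, below is a minimal, self-contained sketch of the three ingredients the abstract describes: first-stage variable selection, sparse graph recovery, and a two-stage least squares (2SLS) check. It uses scikit-learn's LassoLarsCV and GraphicalLassoCV on synthetic data; the variable names and the data-generating process are hypothetical illustrations of the general technique and are not taken from the paper's repository.

import numpy as np
from sklearn.linear_model import LassoLarsCV, LinearRegression
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
n, p = 500, 50

# Synthetic stand-in for the GIS/census covariates: one endogenous
# regressor (e.g. a school-quality proxy) and a house-price outcome.
Z = rng.normal(size=(n, p))        # candidate instruments / controls
u = rng.normal(size=n)             # unobserved confounder causing endogeneity
x_endog = Z[:, 0] + 0.5 * Z[:, 1] + u + rng.normal(size=n)
price = 2.0 * x_endog + u + rng.normal(size=n)

# Step 1: data-driven screening of candidate instruments for the
# endogenous regressor (first-stage variable selection with a LARS-type lasso).
first_stage = LassoLarsCV(cv=5).fit(Z, x_endog)
selected = np.flatnonzero(first_stage.coef_)
print("selected candidate instruments:", selected)

# Step 2: recover a sparse dependence structure among the selected
# variables, the endogenous regressor and the outcome via the graphical
# lasso; an edge linking a candidate instrument directly to the outcome
# would flag that instrument as potentially invalid.
block = np.column_stack([Z[:, selected], x_endog, price])
graph = GraphicalLassoCV().fit(block)
print("sparse precision matrix:\n", np.round(graph.precision_, 2))

# Step 3: two-stage least squares with the surviving instruments.
stage1 = LinearRegression().fit(Z[:, selected], x_endog)
x_hat = stage1.predict(Z[:, selected]).reshape(-1, 1)
stage2 = LinearRegression().fit(x_hat, price)
print("2SLS estimate of the price effect:", stage2.coef_[0])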
