[1] I. Goodfellow, et al. Generative Adversarial Nets, 2014, NIPS.
[2] M. Sebag, et al. Towards AutoML in the presence of Drift: first results, 2018, IJCAI.
[3] S. Chaudhury, et al. Automatically Optimized Gradient Boosting Trees for Classifying Large Volume High Cardinality Data Streams Under Concept Drift, 2019.
[4] X. Nie and S. Wager. Quasi-oracle estimation of heterogeneous treatment effects, 2017, Biometrika.
[5] F. Pedregosa, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[6] J. Friedman. Greedy function approximation: A gradient boosting machine, 2001, Ann. Statist.
[7] L. Breiman, et al. CART: Classification and Regression Trees, 1984.
[8] P. R. Rosenbaum and D. B. Rubin. The central role of the propensity score in observational studies for causal effects, 1983, Biometrika.
[9] T. Chen and C. Guestrin. XGBoost: A Scalable Tree Boosting System, 2016, KDD.
[10] P. Austin. An Introduction to Propensity Score Methods for Reducing the Effects of Confounding in Observational Studies, 2011, Multivariate Behavioral Research.
[11] L. Breiman. Random Forests, 2001, Machine Learning.
[12] G. Ke, et al. LightGBM: A Highly Efficient Gradient Boosting Decision Tree, 2017, NIPS.
[13] S. R. Künzel, et al. Metalearners for estimating heterogeneous treatment effects using machine learning, 2017, Proc. Natl. Acad. Sci.
[14] J. Robins, et al. Semiparametric Efficiency in Multivariate Regression Models with Missing Data, 1995.
[15] U. Shalit, et al. Estimating individual treatment effect: generalization bounds and algorithms, 2016, ICML.