“Influence sketching”: Finding influential samples in large-scale regressions
Xuan Zhao | Brian Wallace | Michael Wojnowicz | Matt Wolff | Jay Luan | Ben Cruz | Caleb Crable