Local sensitivity approximations for selectivity bias

Observational data analysis is often based on tacit assumptions of ignorability or randomness. The paper develops a general approach to local sensitivity analysis for selectivity bias, which studies how sensitive inference is to small departures from such assumptions. If M is a model assuming ignorability, we surround M by a small neighbourhood N, defined in terms of Kullback–Leibler divergence, and then compare the inference for models in N with that for M. Interpretable bounds for such differences are developed. Applications to missing data and to observational comparisons are discussed. Local approximations to sensitivity analysis are model robust and can be applied to a wide range of statistical problems.
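A minimal notational sketch of the setup described above; the symbols ε, f_M and Δ(ε) are illustrative choices, not necessarily the paper's own notation:

\[
N_\varepsilon \;=\; \bigl\{\, f : \mathrm{KL}\!\left(f \,\|\, f_M\right) \le \tfrac{1}{2}\varepsilon^2 \,\bigr\},
\qquad
\Delta(\varepsilon) \;=\; \sup_{f \in N_\varepsilon} \bigl|\hat\theta_f - \hat\theta_M\bigr|,
\]

where \(f_M\) is the density under the ignorable model M, \(\hat\theta_f\) is the estimate obtained when the true data-generating model is \(f\), and \(\Delta(\varepsilon)\) bounds the worst-case difference in inference over the neighbourhood. Local sensitivity analysis studies the behaviour of such bounds as \(\varepsilon \to 0\), which is what makes the approximations model robust.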
