The utility of the PND statistic: a reply to Allison and Gorman.

Quantitative synthesis ("meta-analysis") of single-subject research has rarely been conducted, partly because of a lack of agreement on how study outcomes should be quantified. This article responds to Allison and Gorman (Behaviour Research and Therapy, 31, 621-631, 1993), who listed several problematic characteristics of the PND (percent of non-overlapping data) statistic as a measure of single-subject study outcomes and recommended a regression-based approach to computing effect sizes from single-subject research reports. Although Allison and Gorman are generally accurate in pointing out some limitations of the PND statistic, they have been less thorough in identifying its relative strengths. Among these strengths is the fact that the PND statistic and its variations (a) have been shown to be strongly related to qualitative, "expert" ratings, (b) have been successfully employed in at least seven separate integrative reviews, and (c) have produced results that complement more qualitative reviews of the same literature. In contrast, Allison and Gorman did not report results of applying their procedure, and although their procedure has apparent theoretical support, it may be less useful for synthesizing the existing single-subject literature.
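For readers unfamiliar with how the statistic is obtained, the following is a minimal Python sketch of the conventional PND computation: the percentage of treatment-phase data points that exceed the highest baseline point (or fall below the lowest baseline point, when the goal is to decrease the behavior). The function name, parameter names, and example data are illustrative only; they are not taken from the article or from Allison and Gorman's regression procedure.

    # Minimal sketch of PND: the percentage of treatment-phase points that do not
    # overlap with the most extreme baseline point. Names and data are hypothetical.
    def percent_nonoverlapping_data(baseline, treatment, goal="increase"):
        if not baseline or not treatment:
            raise ValueError("Both phases need at least one data point.")
        if goal == "increase":
            threshold = max(baseline)      # highest baseline point
            nonoverlap = sum(1 for x in treatment if x > threshold)
        else:                              # goal is to reduce the behavior
            threshold = min(baseline)      # lowest baseline point
            nonoverlap = sum(1 for x in treatment if x < threshold)
        return 100.0 * nonoverlap / len(treatment)

    # Example: baseline sessions range from 3 to 6 responses; five of the six
    # treatment sessions exceed the baseline maximum, so PND is about 83.3%.
    baseline_phase = [4, 5, 3, 6, 5]
    treatment_phase = [7, 8, 6, 9, 10, 8]
    print(percent_nonoverlapping_data(baseline_phase, treatment_phase))

Study-level PND scores are then typically summarized across studies (often as medians), although the exact aggregation rule varies from review to review.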

[1]  Helena C. Kraemer, et al.  A nonparametric technique for meta-analysis effect size calculation, 1982.

[2]  J. R. Scotti, et al.  A meta-analysis of intervention research with problem behavior: Treatment validity and standards of practice. American Journal of Mental Retardation, 1991.

[3]  Thomas E. Scruggs, et al.  Early Intervention for Developmental Functioning: A Quantitative Synthesis of Single-Subject Research, 1988.

[4]  A. L. Edwards, et al.  Analysis of nonorthogonal designs: The 2^k factorial experiment, 1982.

[5]  Thomas E. Scruggs, et al.  Reply to Owen White, 1987.

[6]  D. M. Baer, et al.  An implicit technology of generalization. Journal of Applied Behavior Analysis, 1977.

[7]  Ronald C. Serlin, et al.  Meta-analysis for single-case research, 1992.

[8]  M. Mastropieri, et al.  The Efficacy of Early Intervention Programs: A Meta-Analysis. Exceptional Children, 1986.

[9]  Donald M. Baer, et al.  Meta-analysis for Single-Subject Research, 1987.

[10]  R. R. Jones, et al.  Effects of serial dependency on the agreement between visual and statistical inference. Journal of Applied Behavior Analysis, 1978.

[11]  A. Kazdin, et al.  Methodological and interpretive problems of single-case experimental designs. Journal of Consulting and Clinical Psychology, 1978.

[12]  Thomas E. Scruggs, et al.  The Quantitative Synthesis of Single-Subject Research, 1987.

[13]  Ann Casey, et al.  Nonaversive Procedures in the Treatment of Classroom Behavior Problems, 1985.

[14]  O. White, et al.  Some Comments Concerning “The Quantitative Synthesis of Single-Subject Research”, 1987.

[15]  Thomas E. Scruggs, et al.  Early Intervention for Socially Withdrawn Children, 1985.

[16]  Kenneth A. Kavale, et al.  Early Language Intervention, 1988.

[17]  Ann Casey, et al.  A Methodology for the Quantitative Synthesis of Intra-Subject Design Research, 1985.

[18]  B. Gorman, et al.  Calculating effect sizes for meta-analysis: The case of the single case. Behaviour Research and Therapy, 1993.

[19]  T. Scruggs.  Advances in Learning and Behavioral Disabilities, 1992.

[20]  W. Yeaton.  A critique of the effectiveness of applied behavior analysis research, 1982.

[21]  Thomas E. Scruggs, et al.  Early Intervention for Children with Conduct Disorders: A Quantitative Synthesis of Single-Subject Research, 1986.

[22]  P. Strain, et al.  Peer Social Initiations: Effective Intervention for Social Skills Development of Exceptional Children. Exceptional Children, 1986.

[23]  Barbara J. Smith, et al.  A Counter-Interpretation of Early Intervention Effects: A Response to Casto and Mastropieri, 1986.