A toolkit for evaluating public R&D investment: models, methods, and findings from ATP's first decade

Evaluation is an essential component of publicly funded R&D programs, in support of both program management and public policy. Over its first decade, the Advanced Technology Program (ATP) has emerged as a leader in evaluation, engaging nationally prominent evaluators to apply new and existing methods in building an analytical and empirical basis for ATP's operations and performance. This report draws on a body of 45 studies commissioned by ATP between 1990 and 2000, analyzing the methods and techniques used and examining their findings. These studies have increased understanding not only of ATP but also of the dynamics of innovation systems and the relationships between public- and private-sector funding of R&D. The findings examined are organized around five major themes: firm/industry effects, collaboration effects, spillover effects, interfaces and comparisons with other programs, and measures of overall program performance. The extensive toolkit of evaluation methods presented in the report illustrates how those methods can be used to answer a variety of stakeholder questions. Methods include the survey, the descriptive and economic case study, bibliometrics, historical tracing, econometrics, expert judgment, social network analysis, the cost index, and a composite performance rating system constructed from indicator metrics. The use of analytical and conceptual modeling to explore a program's underlying relationships and process dynamics is also considered. The political economy of ATP is discussed, and an evaluation framework and an overview of evaluation best practices are provided. By integrating and condensing a large body of related research, the report provides ATP with a convenient reference work, toolkit, and planning guide. For administrators of other programs, public policy makers, and evaluators, it also serves as an evaluation toolkit, providing a logical framework for planning and conducting evaluations.
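
To illustrate the general idea of a composite performance rating built from indicator metrics, the Python sketch below aggregates normalized indicators into a single weighted score. This is a minimal sketch under assumed conventions: the indicator names, weights, value ranges, and min-max normalization are hypothetical illustrations, not the rating system used in the ATP studies.

```python
# Illustrative sketch: composite performance rating from indicator metrics.
# Indicator names, weights, ranges, and the min-max normalization are
# assumptions for illustration, not ATP's actual rating system.

# raw value and the (worst, best) range used for min-max normalization
INDICATORS = {
    "projects_completed": (42, (0, 60)),
    "patents_granted": (15, (0, 30)),
    "publications": (80, (0, 120)),
    "collaborations_formed": (25, (0, 40)),
}

# hypothetical weights; they sum to 1.0 so the rating stays on a 0-100 scale
WEIGHTS = {
    "projects_completed": 0.3,
    "patents_granted": 0.3,
    "publications": 0.2,
    "collaborations_formed": 0.2,
}


def normalize(value: float, bounds: tuple[float, float]) -> float:
    """Scale a raw indicator value to [0, 1] via min-max normalization, clamped."""
    lo, hi = bounds
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))


def composite_rating(indicators: dict, weights: dict) -> float:
    """Weighted sum of normalized indicators, reported on a 0-100 scale."""
    score = sum(
        weights[name] * normalize(value, bounds)
        for name, (value, bounds) in indicators.items()
    )
    return 100.0 * score


if __name__ == "__main__":
    print(f"Composite performance rating: {composite_rating(INDICATORS, WEIGHTS):.1f}")
```

In practice, such a rating is only as defensible as its indicator selection, normalization ranges, and weights, which is why the report pairs composite metrics with the other evaluation methods listed above.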
