Canadian Traditions and Directions in Information Systems Research

As a field of study matures, interest in its evolution and future increases. Individuals begin to study the choice of research topics and methods. Frameworks for research emerge, authors call for overarching theory to guide research, and articles appear analyzing the work done to date. In the information systems (IS) field, this interest surfaced most clearly in 1980, at the First International Conference on Information Systems (ICIS), where leaders in the field presented papers and led discussions on the evolution of IS research and the directions it should take in the future.

Since the 1980 ICIS, conferences, colloquia, and journals have concentrated on research methodology and paradigms. In 1984, the Harvard Business School sponsored a Research Colloquium on Information Systems to consider research history and needs in the field. Nineteen eighty-five brought the publication of Research Methods in Information Systems (Mumford, 1985). More recently, ICIS 1989 and 1990 included paper tracks on research methodology and direction; the IFIP WG 8.2 1990 Working Conference chose "The Information Systems Research Arena of the 90s" as its theme; the 1992 Hawaii International Conference on System Sciences (HICSS-25) included sessions on methodology and measurement issues; and ICIS 1993 included a track on relevant IS research methods.

The development of thought in a particular field can be traced in two ways. First, surveying and analyzing published research reveals the field's historical topics of interest, paradigms, and issues. There are numerous examples of such reviews in IS, including work by Alavi et al. (1989), Culnan (1986), Elam et al. (1986), and Swanson and Ramiller (1993). These studies examine the field's publications and categorize its historical evolution. Such work tells us where we have come from and where we are now.

One shortcoming of studying publications is the inevitable lag between a research project and the publication of its results.
From January 1990 through December 1993, for example, the time from submission of an IS-related paper to its publication in Management Science ranged from 15 to 44 months, with a mean lag of 27.2 months. In Information Systems Research, time to acceptance during the same period ranged from 8 to 34 months, with a mean of 22.4 months. Conference proceedings fare better, but are rarely included in publication studies. Thus, studies of published papers risk describing the field as it existed two or more years earlier. Such studies also often draw on only a fragment of the published research, such as papers appearing in top journals or written by the best-known researchers. As a result, they report on only part of the field and may overlook significant research by newcomers.

A second shortcoming is the publication bias (Light & Pillemer, 1984), or "file drawer problem" (Rosenthal, 1978), that arises in studies of publications. Some research fails to demonstrate significant effects, and thus may not be considered worth publishing. In other cases, the research may not explore a "new" or "exciting" issue, and thus be less apt to find an outlet. As a result, counts or analyses of publications can underestimate the relative amount of research in a particular area of IS.

There are alternatives to publication analyses. One is to track topics being studied that have not yet led to published papers. Notable sources for such research are doctoral student research and works in progress by established researchers. By studying the topics investigated in these works, we can anticipate where future research is heading and what the favored topics, issues, and paradigms may become. This approach looks forward from the present.
Teng and Galletta (1991) surveyed 397 MIS researchers to assay the quality of historical work and research in progress. They identified three vital issues that would help demonstrate IS's progress as a discipline, or its movement out of a preparadigmatic state. …