Assessing Readiness of IS Majors to Enter the Job Market: An IS Competency Exam Based on the Model Curriculum

An information systems exit assessment exam, based on the IS Model Curriculum, was given to students nationwide this past spring. The test is being used to evaluate student performance against a nationally recognized standard and to evaluate and improve information systems curricula at participating institutions. This paper aims to persuade IS institutions to support these ongoing efforts to improve IS education through the exit assessment approach presented.

Recently, a joint effort has been undertaken between the IS Model Curriculum Task Force and the Institute for the Certification of Computing Professionals (ICCP) that follows a national trend towards accountability in IS education. The goal of the effort is to provide a means of objectively assessing whether or not IS majors are prepared to enter the workforce, and thereby to assess whether or not four-year degree programs in IS are properly educating their students. This goal is pursued through the development and use of an IS competency exam based on the IS Model Curriculum (Gorgone et al. 2002). The approach is to develop a test that not only reflects what IS students need to know, but does so at a level that reflects culminating skills: those at the maturity level reached at graduation and required for the entry-level job market. The purpose of this paper is to persuade the IS community that we have a viable approach for assessing the extent to which institutions with four-year degree programs in IS are producing students who are capable of effective entry into the IS job market. We also hope to persuade IS institutions to support these efforts by participating in the curriculum assessment and improvement process. These efforts have implications for IS accreditation.

Are My Students Ready?

Institutions with four-year degree programs in information systems all face the fundamental challenge of providing a useful, lifelong education in the field of information systems while simultaneously providing a practical, professional preparation. Fundamental questions include: Are my IS majors ready for the job market? From a curriculum standpoint, why or why not? How can we improve our IS curriculum to do a better job?

IS Exit Assessment as an Approach

An IS exit assessment exam will be an objective way for an institution to assess its students and its curriculum against a recognized standard. Currently, IS institutions have no such exam to use in their portfolios of assessment mechanisms. They have to rely on course-by-course assessments, a very qualitatively oriented accreditation self-study process, and reports from advisory boards and recruiters of their students.

We propose an approach that uses a new competency exam as a means of exit assessment and curriculum improvement. The approach used to develop the IS competency exam has the following characteristics:

• use of learning units (educational objectives) in the model curriculum
• incorporation of the “exit skills” related to practical readiness
• development of context-specific, high-level questions (Bloom 1956)

[Figure 1. IS Exit Objective and Test Item Design Process: course-level objectives (learning units), exit-level skills, exit-level objectives, local course materials, exit-level test items]

Learning units are representations of educational goals and objectives for the IS model courses. LUs are organized by course and can be related to the body of knowledge.
LUs reflect what students should know as a result of completing their coursework. Skills are more practical and job-related. The skills in the model curriculum subsume the LUs and are organized around the fundamental things that an IS worker should be able to do in an entry-level job. The skills contain links to a vocabulary of job-ad terms developed in a study that was part of a recent revision to the model curriculum (Landry et al. 2000). Job ads were obtained for four categories of entry-level jobs: programmer, IS analyst, database analyst, and network administrator.

In a multiple-choice format, it is easy to write poor questions. In particular, a test item could focus on an issue that is trivial or that merely involves understanding jargon. Usually, such an item tests knowledge at a low (Bloom) level, such as recognition or differentiation. To create a true exit exam, there need to be questions aimed at a higher Bloom level, dealing with problems at the job level that combine knowledge with skills.

The approach we used was to link the learning units with their related skills, and then write what we termed an exit objective for each pair. We then used these exit objectives as the basis for writing test items. A process model of our approach is shown in Figure 1. A three-step summary of this process was provided to test item writers at a workshop held earlier this year:

1. For each skill/LU combination, create an exit-level learning objective that will achieve the desired outcome in the context of that particular skill.
2. After creating the exit-level learning objective, create a multiple-choice question (including stem, correct answer, and, as time permits, appropriate distracters) that will assess that objective at the appropriate cognitive (Bloom) level.
3. Repeat steps 1 and 2 for each skill listed before going to the next LU.

As an example, the following exit objective and test item were written for the workshop for IS model course IS.7, LU #72 (IS Analysis and Design Tasks) and the related sub-skill 3.1.4 (Information Systems Analysis and Design):

Exit-Level Learning Objective: Given an implementation scenario, such as an airline reservation system, develop an appropriate conversion strategy.

Question Stem: For a company that is implementing a new web-based e-commerce application to replace the phone-in catalog order system, the BEST conversion strategy would be: (correct answer marked with *)

a. Remove the old system and install the new system
*b. Run the old system and new system in parallel
c. Install the new system one module at a time
d. Implement the whole system one department at a time

Recent Activities: A Report

Recent efforts by the IS Model Curriculum Task Force, in conjunction with the ICCP, have enlisted the assistance of volunteer faculty members from across the U.S. and Canada. A workshop held in early February, along with post-workshop efforts of participants, resulted in the development of exit objectives and test items for an IS Competency Exam. A total of 230 exit objectives and 303 test questions have been written to date. Subsequent to the workshop, an extensive revision process was undertaken to refine the test items into a homogeneous group of exit-level items covering each individual LU and sub-skill.
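To make the coverage requirement concrete, the following minimal sketch shows one way an item bank could be tagged with course, LU, and sub-skill so that gaps in LU/sub-skill coverage can be detected during revision. This is an illustration only; the paper does not describe the task force's actual item-banking tools, and every name, field, and target pair below is hypothetical.

# Minimal sketch (hypothetical, not the task force's actual item bank): tag each
# test item with its course, LU, and sub-skill so coverage can be checked.
from dataclasses import dataclass

@dataclass
class TestItem:
    item_id: str
    course: str         # e.g., "IS.7"
    lu: int             # learning unit number, e.g., 72
    subskill: str       # e.g., "3.1.4"
    exit_objective: str
    stem: str
    options: dict       # option letter -> option text
    answer: str         # correct option letter

def uncovered_pairs(required_pairs, items):
    """Return the (LU, sub-skill) pairs that no item in the bank covers yet."""
    covered = {(item.lu, item.subskill) for item in items}
    return sorted(set(required_pairs) - covered)

# Example: the conversion-strategy item from this paper, tagged with its LU/sub-skill.
bank = [
    TestItem(
        item_id="IS7-72-3.1.4-01",
        course="IS.7", lu=72, subskill="3.1.4",
        exit_objective="Given an implementation scenario, develop an appropriate conversion strategy.",
        stem="For a company replacing a phone-in catalog order system with a web-based "
             "e-commerce application, the BEST conversion strategy would be:",
        options={"a": "Remove the old system and install the new system",
                 "b": "Run the old system and new system in parallel",
                 "c": "Install the new system one module at a time",
                 "d": "Implement the whole system one department at a time"},
        answer="b",
    ),
]

required = [(72, "3.1.4"), (73, "3.1.5")]   # hypothetical target LU/sub-skill pairs
print(uncovered_pairs(required, bank))      # -> [(73, '3.1.5')], i.e., one pair still needs an item

A structure like this also supports the score aggregation and item analysis described next, since each response can be rolled up by LU, by sub-skill, or by course.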
Although items are written for exit objectives, item scores can be aggregated to assess performance by learning unit or by skill. Exams were given to students in April and May. Results were analyzed and provided to participating schools at a June workshop. Because the model curriculum is used both in test development and in curriculum analysis, results can be tied back to specific courses in the curriculum. Test item statistics, such as the point-biserial correlation coefficient, are calculated and used to assess and improve test items. All test items were pilot tested at a student conference, and all future items will either be pilot tested in the same way or included as a small group of “no-score” items on the prior year’s official test.
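For readers unfamiliar with the point-biserial statistic, the sketch below gives the standard formula and a minimal computation for a single item. The scoring software actually used for the exam is not described in this paper, so the function and the toy response data are illustrative assumptions only.

# Minimal sketch of the point-biserial item discrimination statistic:
#   r_pb = ((M1 - M0) / s) * sqrt(p * q)
# where M1 = mean total score of examinees who answered the item correctly,
#       M0 = mean total score of those who answered it incorrectly,
#       s  = standard deviation of all total scores,
#       p  = proportion answering correctly, q = 1 - p.
import math

def point_biserial(item_correct, total_scores):
    n = len(total_scores)
    right = [s for c, s in zip(item_correct, total_scores) if c]
    wrong = [s for c, s in zip(item_correct, total_scores) if not c]
    if not right or not wrong:
        return 0.0  # everyone right or everyone wrong: the item gives no discrimination information
    p = len(right) / n
    q = 1.0 - p
    m1 = sum(right) / len(right)
    m0 = sum(wrong) / len(wrong)
    mean = sum(total_scores) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in total_scores) / n)
    return (m1 - m0) / s * math.sqrt(p * q)

# Hypothetical data: 1 = examinee answered this item correctly, 0 = incorrectly,
# paired with each examinee's total exam score.
correct = [1, 1, 0, 1, 0, 0, 1, 1]
totals  = [88, 92, 61, 75, 58, 64, 81, 90]
print(round(point_biserial(correct, totals), 3))  # higher values suggest the item separates
                                                  # strong examinees from weak ones

Items with low or negative point-biserial values are the natural candidates for the revision or pilot-testing steps described above.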