Some Assessment Tools for Evaluating Curricular Innovations Outcomes

One of the most critical aspects of the new ABET Engineering Criteria 2000 (EC-2000) is the requirement of an outcomes assessment plan for program evaluation and continuous improvement. Outcomes assessment requires the development of assessment tools or instruments to gather data that document whether a program's stated goals and objectives are being met and whether students have acquired the identified skills. In 1994, a partnership of universities called the Manufacturing Engineering Education Partnership (MEEP) [1] initiated the design and implementation of a novel undergraduate manufacturing program, better known as the Learning Factory [2]. This paper describes how MEEP designed its assessment strategy to evaluate the outcomes of this curricular innovation project and presents some of the assessment instruments/tools designed. The tools, some developed in collaboration with industrial partners, were used to assess overall and specific qualitative aspects of the program as well as student performance (e.g., teamwork skills and oral presentation/written communication skills). A total of nine assessment instruments are presented. We believe that the Learning Factory, as well as the project's assessment strategy and tools, complies with EC-2000.

Introduction

The creation and adoption of ABET's new accreditation standards is a historic move to promote innovation and continuous improvement in engineering education [3]. The core of EC-2000 is an outcomes assessment component that requires engineering programs to have in place a continuous process of evaluation and feedback to ensure improvement of the program's effectiveness. Numerous resources are available for the development and implementation of outcomes assessment plans. For example, Rogers and Sando have prepared a user-friendly, step-by-step booklet that presents eight steps in developing an assessment plan [4]. Regardless of how the assessment plan is developed, however, an effective plan must start with the identification of specific goals and objectives and the definition of performance criteria, followed by the selection of data collection methods and tools and, finally, the elaboration of feedback mechanisms. Data collection requires the development of assessment instruments focused on the appropriate audiences. Whether prompted by EC-2000 or by the desire to improve quality standards, engineering programs have started to gather data for use in appraisal and improvement efforts in their institutional programs. For example, the College of Engineering at Auburn University has developed a plan to assess the quality of its instructional programs, designing various assessment tools for that purpose [5].
The Manufacturing Engineering Education Partnership (MEEP) is a coalition of institutions that, in response to industry needs, has developed an innovative manufacturing engineering curriculum and physical facilities for product realization (see Figure 1). The program offers a new paradigm for engineering education, providing a balance between theory and practice and emphasizing the development of basic skills in the student. The desired skills include communication, teamwork, business concerns, and project management. Detailed information about the program can be found on the project website, http://lfserver.lf.psu.edu/LF/col_home.html, and a CD-ROM with curricular materials and publications can be requested [6].

This paper describes 1) how MEEP designed the assessment strategy to evaluate the outcomes of this curricular innovation, and 2) some of the assessment instruments used. The tools developed, some in collaboration with industrial partners, were utilized to assess overall and specific qualitative aspects of the program, as well as student performance.

Assessment Strategy

Developing MEEP's assessment strategy proceeded rather smoothly because the project's goals and objectives had been clearly defined in the project's Strategic Plan [7]. An assessment team was formed, and the strategy was discussed and shared with all the constituents (faculty, students, and industrial partners). It was agreed that, in order to obtain comprehensive and valid results, the assessment plan should have the following elements:

• Internal (self-assessments)
• External (outside the partnership)
• Multiple criteria (variety of modes and viewpoints)
• Holistic (integrated)
• Qualitative and quantitative components

[Figure 1. MEEP Curriculum Model: a hands-on, real-life curriculum set in a business environment and built on partnering with industry. Curricular components: Product Dissection, Graphics & Design, Manufacturing Processes, Entrepreneurship, Concurrent Engineering, and an Interdisciplinary Design Project, spanning the freshman year through professional engineering skills.]

Notes and References

1. Penn State University, the University of Washington, and the University of Puerto Rico at Mayagüez, in collaboration with Sandia National Laboratories. Project sponsored by the Technology Reinvestment Project (TRP Project #3018, NSF Award #DMI-9413880).
2. John S. Lamancusa, Jens E. Jorgensen, and José L. Zayas, "The Learning Factory – A New Approach to Integrating Design and Manufacturing into Engineering Curricula," ASEE Journal of Engineering Education, Vol. 86, No. 2, April 1997.
3. George D. Peterson, "Engineering Criteria 2000: A Bold New Change Agent," ASEE PRISM, September 1997.
4. Gloria M. Rogers and Jean K. Sando, Stepping Ahead: An Assessment Plan Development Guide, Foundation Coalition, 1996.
5. Larry D. Benefield, Landa L. Trentham, Karen Khodadadi, and William F. Walker, "Quality Improvement in a College of Engineering Instructional Program," Journal of Engineering Education, January 1997.
6. To request a CD-ROM, contact John S. Lamancusa, Mechanical Engineering, Penn State University, e-mail: jsl3@psu.edu.
7. Strategic Plan for the Manufacturing Engineering Education Partnership, September 1994.