As the ABET process moves toward outcomes assessment, each program must develop and implement an outcomes assessment plan. Assessment plans document how programs will gather data, interpret the findings, and use the results to improve programs, curricula, and resources. The ABET team at North Carolina State University (faculty from each engineering discipline along with assessment-knowledgeable personnel) has developed a model that academic programs within the College of Engineering use to determine what data to gather, where to obtain the data, and what criteria may be most appropriate when interpreting the data. This paper presents the model and the processes by which our programs collect data, examine processes already in place at the institution, and identify redundancies as well as omissions in those processes, methods, and data. It also describes the North Carolina State University College of Engineering's Electronic Assessment Database website (www.engr.ncsu.edu/assessment), which illustrates the ongoing process of assessing and improving student learning using this model. The website filters the expanse of all possible data so that faculty can access it conveniently and effectively, and it links programmatic outcomes to curriculum assessment methods, including classroom assignments, course portfolios, and capstone design projects.

Regional and Programmatic Accreditation

North Carolina State University falls under the accreditation auspices of the Commission on Colleges of the Southern Association of Colleges and Schools (SACS). SACS has been a leader in the field of assessment and institutional effectiveness, and the Commission on Colleges is currently reviewing the way it evaluates institutions for accreditation.
[Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, Copyright © 2002, American Society for Engineering Education]

The Commission on Colleges is working on a new set of criteria and procedures that will be even more heavily based on assessment processes and on providing evidence that the institution “maintains clearly specified educational objectives that are consistent with its mission and appropriate to the degrees it offers.” 1 As part of the new accreditation process, each institution must meet the Core Requirements and the Comprehensive Standards on institutional effectiveness, as in the examples given below:

Core Requirements: The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission. See Core Requirement 5, pg. 8. 1

Comprehensive Standards: The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results. See Comprehensive Standard on Institutional Effectiveness, pg. 11.

Like SACS, the Accreditation Board for Engineering and Technology (ABET) has also moved toward an outcomes assessment process, yet the concept of outcomes assessment remains new for many faculty within engineering programs. All general comprehensive reviews beginning in 2001-02 have been conducted under the new outcomes-based criteria (Engineering Criteria 2000, or EC2000). EC2000 states that each program must develop program educational objectives and program outcomes, and must develop methods to assess each.
Authorities in the field of assessment have asserted that regional or programmatic accreditation cannot be treated as something “different” or something that is completed by a few faculty members. Data collection should not be done in the sterile environment of accreditation; rather, it must be integrated into the entire institutional framework of assessment to be effective. “The biggest challenge for assessment with respect to serving dual purposes is to generate information based on locally developed methods that can be reported to external audiences in meaningful ways.” Knowing that NC State’s engineering programs would come under both an ABET and a SACS review during the spring and fall of 2004, the engineering programs began to develop plans and outcomes assessment processes soon after the last accreditation visit in the fall of 1998. As part of the assessment development, the programs decided that the processes must not only meet regional and programmatic accreditation needs but also be of value to the faculty. A search of the engineering education literature showed that few engineering programs had a comprehensive system for measuring program results in terms of student learning outcomes. As a first step, the College of Engineering developed more specific guidelines for writing program educational objectives. As efforts continued, it became clear that the knowledge and
References

[1] Philip Doepker et al., “The Development and Implementation of an Assessment Plan for Engineering Programs: A Model for Continuous Improvement,” 1999.
[2] J. Prus et al., “A Critical Review of Student Assessment Options,” 1994.
[3] Trudy W. Banta et al., Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education, Higher and Adult Education Series, 1999.
[4] L. Braskamp, “Purposes, Issues, and Principles of Assessment,” 1991.
[5] Lion F. Gardiner, “Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education (review),” 2002.
[6] Patrick J. Lynch, Web Style Guide, 1999.
[7] Rebecca Brent et al., “EC2000 Criterion 2: A Procedure for Creating, Assessing, and Documenting Program Educational Objectives,” 2001.
[8] J. McGourty et al., “Developing a Comprehensive Assessment Program for Engineering Education,” 1998.