Obstacles in assessing academic conditions can include generating interest in assessment efforts to achieve high response rates, transcending communication barriers, preserving confidentiality, minimizing biases from numerous sources, and conducting meaningful statistical analyses. A graduate environmental engineering program needed to overcome these obstacles to create a valid assessment tool. Previous program surveys had not adequately addressed specific student concerns: their questions and answer formats were poorly designed, their distribution relied on students to retrieve and return the surveys themselves, and their data analysis consisted only of computing mean values and compiling comments. As a result, those surveys suffered from low response rates, biases, and demographic underrepresentation. A graduate-student committee designed a new survey with these problems in mind. “The improvement of research quality” was the overall survey theme, and four subtopics (research resources, research preparation, research views and attitudes, and research-group support) were created to generate specific question ideas from the student population at large. Questions were included in the survey based on their importance, the actionable nature of the knowledge they would yield, and other criteria. Background and control questions were included to categorize respondents, and the sensitive nature of some questions was addressed to reduce biases. The format of the survey was tailored to make respondents comfortable and interested in participating, and question quality was examined through a pilot study and reviews by professionals. Answer formats were mainly closed-ended, with open-ended questions mostly providing supplemental information. Hand distribution and hand collection were intended to make the survey tangible, appreciable, and accessible for respondents.
Univariate analysis produced meaningful findings regarding individual variables, while bivariate/multivariate analysis determined correlations among multiple variables. A sensitivity analysis was also conducted to uncover potential biases in the answering behavior of students who were both involved in survey design and responded to the survey. We submit that our survey effort was successful overall because of its high response rate, accurate demographic representation, positive student feedback, reduced biases, and significant findings. Fifty students (more than 75% of the population) responded to the survey. Salient findings indicate deficiencies in communication and statistics education, deficiencies in overall research preparation for first-year and master’s students, the failure of a laboratory course to provide research-skill education, and a lack of guidance from research-group members for some students. The improved survey has led students and faculty members in the program to appreciate internal assessment and to encourage the student committee to continue its efforts. The committee is beginning to address the problems discovered in this study and will continue to use this method in further assessment. We believe our method is applicable to other engineering programs that must deal with common obstacles in creating a sound assessment tool.
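To make the distinction between the univariate and bivariate analyses concrete, the following is a minimal illustrative sketch in Python. The data are hypothetical Likert-scale ratings invented for this example (the actual survey responses are not reproduced here), and the variable names (`prep`, `support`) are placeholders, not identifiers from the study.

```python
from statistics import mean, stdev

# Hypothetical 1-5 Likert ratings from two survey questions (illustrative only).
prep = [4, 3, 5, 2, 4, 3, 5, 4]      # e.g., "research preparation" ratings
support = [4, 2, 5, 2, 3, 3, 5, 4]   # e.g., "research-group support" ratings

# Univariate analysis: summary statistics for a single variable.
prep_mean, prep_sd = mean(prep), stdev(prep)

# Bivariate analysis: Pearson correlation between two variables.
def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((len(x) - 1) * stdev(x) * stdev(y))

r = pearson(prep, support)
print(f"mean={prep_mean:.2f} sd={prep_sd:.2f} r={r:.2f}")
```

A positive `r` here would suggest that students who rate their research-group support higher also tend to rate their preparation higher; in practice such a correlation would be checked for statistical significance before being reported as a finding.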