Factors Considered in the Assessment of Computer Science Engineering Capstone Projects and Their Influence on Discrepancies Between Assessors

A capstone project is an extensive learning experience traditionally developed during a student's final academic year. Assessing such a complex assignment involves several challenges and is usually based on the evaluations of at least two different people: the capstone project advisor and one or more other assessors. Previous quantitative studies have compared different assessors' grades, and qualitative studies have investigated the origin of possible discrepancies; in both cases, contradictory conclusions were reached. The objective of this study is to analyze the factors that assessors of engineering capstone projects take into consideration and the influence of these factors on the discrepancies between different assessors' opinions of the same project. The study quantitatively examined 162 computer science engineering capstone projects, each developed by a single student and supervised by a single advisor. Each project was assessed by the project advisor and a committee, and for each project both were asked to complete an additional questionnaire on product characteristics, student competences, and project supervision. The competences demonstrated by the student were found to be the most relevant element when a capstone project was evaluated by the advisor and the committee; product characteristics were second in influence. Furthermore, advisors attached only minor importance to the advisor-involvement component. Discrepancies between grades appear to be associated with aspects to which one assessor has access but the other does not, such as the skills a student demonstrates during project development or their performance in the oral defense. Both the advisor's and the committee's perspectives are important in the assessment of this complex task, and they complement one another.
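The core comparison behind such a study, pairing each project's advisor grade with its committee grade and testing whether the two diverge systematically, can be illustrated with a minimal sketch. The grade scale, the synthetic data, and the choice of a paired t-test and Pearson correlation below are assumptions made for illustration only; they are not taken from the paper.

```python
# Illustrative sketch (not the study's actual data or analysis pipeline):
# quantify advisor-committee grade discrepancies across a set of capstone projects.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical grades on a 0-10 scale for 162 projects; in the study each
# project receives one grade from the advisor and one from the committee.
advisor = np.clip(rng.normal(7.5, 1.0, 162), 0, 10)
committee = np.clip(advisor + rng.normal(-0.3, 0.8, 162), 0, 10)

discrepancy = advisor - committee

# Paired comparison: does one assessor grade systematically higher than the other?
t_stat, t_p = stats.ttest_rel(advisor, committee)

# Agreement between the two assessors across projects.
r, r_p = stats.pearsonr(advisor, committee)

print(f"mean discrepancy (advisor - committee): {discrepancy.mean():.2f}")
print(f"paired t-test: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"Pearson correlation between assessors: r = {r:.2f} (p = {r_p:.3f})")
```

A systematic mean discrepancy alongside a high correlation would indicate that one assessor grades consistently more leniently even when both rank projects similarly, which is the kind of pattern the factor-level questionnaire data are meant to explain.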
