demonstrated student performance

Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students vs. the top-performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom’s Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
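
To make the shape of the model concrete, the sketch below (in Python) shows one plausible way to represent the mappings and the expected-vs-demonstrated comparison described above. All names here (Mapping, ExamItem, demonstrated_fraction) and the aggregation rule are illustrative assumptions of this sketch, not the interface of the paper's web-based implementation.

# A minimal sketch of the conceptual model, under the assumptions stated above.
from dataclasses import dataclass, field

# Bloom's Taxonomy used as the mastery scale (indices 0-5 below).
BLOOM_LEVELS = ["Remember", "Understand", "Apply", "Analyse", "Evaluate", "Create"]

@dataclass
class Mapping:
    topic: str               # e.g. a CS2013 knowledge unit
    outcome: str              # e.g. a CS2013 learning outcome
    minimal_level: int        # Bloom level expected of a bare-passing student
    aspirational_level: int   # Bloom level expected of a top-performing student
    confidence: float         # rater confidence in this mapping, 0.0-1.0

@dataclass
class ExamItem:
    identifier: str
    max_marks: float
    mappings: list[Mapping] = field(default_factory=list)

def demonstrated_fraction(marks_awarded: dict[str, float],
                          items: list[ExamItem]) -> dict[str, float]:
    """Aggregate itemised grades per topic as a fraction of available marks.

    Simplification: an item's marks count fully toward every topic it maps to.
    """
    earned: dict[str, float] = {}
    available: dict[str, float] = {}
    for item in items:
        awarded = marks_awarded.get(item.identifier, 0.0)
        for m in item.mappings:
            earned[m.topic] = earned.get(m.topic, 0.0) + awarded
            available[m.topic] = available.get(m.topic, 0.0) + item.max_marks
    return {t: earned[t] / available[t] for t in available if available[t] > 0}

Given one student's itemised marks keyed by exam item, demonstrated_fraction yields a per-topic achievement fraction that can then be set against the minimal (bare-pass) and aspirational Bloom levels recorded in each mapping, which is the style of comparison the model is intended to support.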
