FLOP, a free laboratory of programming

The Test-Driven Development (TDD) methodology [4, 23, 8] is currently a common approach to teaching programming and software engineering. On-line judges are widely used in everyday teaching, and their use in programming contests is especially well known. Good tools and collections of programming problems are available both for exams and for contests. We have developed a simple, lightweight, and practical open laboratory. The term open is used here in two senses: the laboratory is free for students to use, and free to download and distribute under the GPL license. It hosts programming problems, lets the instructor easily add new ones, and automatically assesses the solutions submitted by students. In addition to the system, we have developed a collection of programming problems for CS1/2, designed from a pedagogical point of view and covering several levels of difficulty.
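The core of test-based automatic assessment can be illustrated with a minimal sketch: the grader runs a student's solution against instructor-defined test cases and reports the fraction passed. This is a hypothetical illustration of the general technique, not FLOP's actual implementation; all names (`grade`, `student_solution`) are assumptions.

```python
# Hypothetical sketch of test-based grading in the spirit of systems like
# FLOP; the function names and structure are assumptions, not FLOP's API.

def grade(student_fn, test_cases):
    """Run a student's solution against (arguments, expected) pairs
    and return the fraction of tests passed."""
    passed = 0
    for args, expected in test_cases:
        try:
            if student_fn(*args) == expected:
                passed += 1
        except Exception:
            pass  # a crashing solution simply fails that test case
    return passed / len(test_cases)

# Example: grading a CS1-style exercise (sum of the first n integers).
def student_solution(n):
    return n * (n + 1) // 2

tests = [((0,), 0), ((1,), 1), ((10,), 55)]
print(grade(student_solution, tests))  # → 1.0
```

A production judge would additionally sandbox the submission and enforce time and memory limits (cf. [21] on contest-system security), but the pass-ratio loop above is the essential grading step.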

[1] D. Krathwohl, A Taxonomy for Learning, Teaching and Assessing, 2008.

[2] Benjamin S. Bloom, et al. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, 2000.

[3] Cristóbal Pareja-Flores, et al. EXercita: automatic web publishing of programming exercises, 2001.

[4] Susan H. Rodger, et al. How to develop and grade an exam for 20,000 students (or maybe just 200 or 20), 2002, SIGCSE '02.

[5] Christopher Douce, et al. Automatic test-based assessment of programming: A review, 2005, JERC.

[6] Petri Ihantola, et al. Review of recent systems for automatic assessment of programming assignments, 2010, Koli Calling.

[7] D. M. Ha, et al. A gentle introduction, 2006.

[8] Andrew Lim, et al. Online Judge, 2001, Comput. Educ.

[9] J. Ángel Velázquez-Iturbide, et al. EXercita: A System for Archiving and Publishing Programming Exercises, 2001, Computers and Education: Towards an Interconnected Society.

[10] Tarek Hegazy, et al. The CourseMarker CBA System: Improvements over Ceilidh, 2004, Education and Information Technologies.

[11] B. Bloom, et al. Taxonomy of Educational Objectives. Handbook I: Cognitive Domain, 1966.

[12] Nathan Griffiths, et al. The boss online submission and assessment system, 2005, JERC.

[13] Isidoro Hernán-Losada, et al. Testing-Based Automatic Grading: A Proposal from Bloom's Taxonomy, 2008, Eighth IEEE International Conference on Advanced Learning Technologies.

[14] Kirsti Ala-Mutka, et al. A Survey of Automated Assessment Approaches for Programming Assignments, 2005, Comput. Sci. Educ.

[15] Stephen H. Edwards, Rethinking computer science education from a test-first perspective, 2003, OOPSLA '03.

[16] Benjamin S. Bloom, et al. Taxonomy of Educational Objectives: The Classification of Educational Goals, 1957.

[17] John Leaney, et al. First Year Programming: Let All the Flowers Bloom, 2003, ACE.

[18] José Paulo Leal, et al. Mooshak: a Web-based multi-site programming contest system, 2003, Softw. Pract. Exp.

[19] John Waldron, et al. Introductory programming, problem solving and computer assisted assessment, 2002.

[20] Diane Kelly, et al. More testing should be taught, 2001, CACM.

[21] Michal Forišek, Security of Programming Contest Systems, 2007.

[22] Donald C. Wells, Extreme Programming: A gentle introduction, 2003.

[23] Melissa Davidson, et al. The Taxonomy of Learning, 2008, International Anesthesiology Clinics.

[24] Beth Simon, et al. My program is ok – am I? Computing freshmen's experiences of doing programming assignments, 2012, Comput. Sci. Educ.

[25] Surendra Gupta, Automatic Assessment of Programming Assignments, 2012, ICIT 2012.