Incorporating Computing Professionals’ Know-how

It is important for both computer science academics and students to clearly understand how academic and professional perspectives differ when assessing a deliverable. It is especially interesting to determine whether the aspects a computing professional considers important to evaluate are the same as those valued by academics and students. Such discrepancies point to the unexpected challenges students may encounter once they graduate and begin working. In this article, we propose a learning activity in which computer science students make a video about their future profession after listening to an expert in the field discuss the characteristics and difficulties of his or her work. Academics, professional experts, and students assessed the videos by means of a questionnaire. This article reports a quantitative study of the results of this experience, which was conducted over three academic years. The study involved 63 students, 6 academics, and 4 computing professionals with extensive experience, and 14 videos were evaluated. The professional experts proved to be the most demanding assessors, followed by the academics; the students were the least demanding. These differences become more salient when more substantive issues are examined: the experts focused on aspects of content, whereas the students concentrated on format, and the academics' focus fell between these two extremes. Understanding how experts value knowledge can guide educators in their search for effective learning environments in computing education.
