Draft Repeatability Evaluation Proposal: Hybrid Systems Computation & Control

We propose to allow authors of accepted papers at HSCC to submit a repeatability package (whose contents are outlined below) around the same time as the final version of the paper. That package will be reviewed by members of a repeatability evaluation committee according to the criteria described in this document. Those papers corresponding to packages which pass the criteria will be identified as "repeatable" at HSCC and possibly in the ACM DL.
