Data dredging, salami-slicing, and other successful strategies to ensure rejection: twelve tips on how to not get your paper published

Everyone talks about "star systems": professional sports, movies, even airline piloting (and, more recently, the financial industry), where a few at the very top make scads of money and many at the bottom just scrape by. It seems to me that we in the academy are also players in a variant of a star system, where many try very hard but relatively few succeed. The system is university research, and the currency is publications, not money.

Consider the fate of a research idea. Far too commonly, the idea arises in response to a practical question, where someone reflects on her educational roles and identifies a question that she apparently has no answer for. "Are expert tutors better than nonexperts?" "What is the optimal size for a tutorial group?" "Do multiple-choice and short-answer questions give different information about a student?" "How can I accommodate students with different learning styles?" "Should I use periodic quizzes to reinforce learning?" Far too often this then leads to a study, in which she uses her skills as a clinical researcher to design research to address the question. And far too often it emerges that the study, while methodologically acceptable by clinical trial standards, is missing some critical elements of good educational research. Or, just as commonly, it addresses a question that has already been answered.

Along the same lines, far too often a faculty member with acknowledged educational skill but minimal interest in research will sit down with his department chair for an annual review and be advised to "Write up that course you're doing and get it published." A futile exercise; very few journals in education will publish full articles that are little more than descriptions of bright ideas.

Finally, there is the research requirement that residency programs and specialty boards demand of residents. Far too often residents are required to conduct a research project, all by themselves, with minimal supervision, in their spare time. Often this is some kind of educational research, typically a survey of other residents, since it looks easier: no patients, no ethical issues. This is a recipe for mediocrity, and it likely does more harm than good in turning residents on to the value of educational research.
