An Overview of the Dynamics of Relative Research Performance in Central-Eastern Europe Using a Ranking-Based Analysis Derived from SCImago Data

Rankings play an increasingly important role in the lives of individual researchers and academics and of their institutions. Individual and institutional rankings used for promotion and for research or academic funding increasingly embody the “publish or perish” mantra, sometimes relying almost exclusively on publications and their citations. Eastern Europe entered this world after a period of isolation whose length and severity varied across the countries of the region. The present study uses SCImago data to carry out a regional analysis of individual and aggregated subject domains, for individual countries and for the region as a whole, based on a novel “adjusted citation index”. The aim is to measure research performance and, through correlations with additional data, to identify the mechanisms that can increase the research performance of a country. In a nutshell, the results indicate that national research policies drive performance: adaptive policies stimulate real performance, whereas more restrictive ones are more likely to encourage unethical behaviors such as self-citation or citation stacking, especially when used to assess individual researchers. The importance of the findings lies in the possibility of replicating the methodology and adapting it to different spatial scales.
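The abstract does not give the formula of the “adjusted citation index”, so the following is only a minimal sketch of how such a ranking-based, correlation-driven analysis could be replicated from a SCImago-style country/domain export. The file names, column names, and the placeholder index (non-self citations per document, averaged over domains) are illustrative assumptions, not the authors’ actual method.

```python
# Hypothetical sketch: rank-based country performance from a SCImago-style export,
# correlated with an external indicator. Index formula and column names are assumed.

import pandas as pd
from scipy.stats import spearmanr

# Assumed columns: country, domain, documents, citations, self_citations
df = pd.read_csv("scimago_country_domain.csv")

# Placeholder "adjusted citation index": external (non-self) citations per document.
df["adjusted_citation_index"] = (df["citations"] - df["self_citations"]) / df["documents"]

# Aggregate over domains to one value per country, then rank (1 = best).
per_country = df.groupby("country")["adjusted_citation_index"].mean().rank(ascending=False)

# Correlate the ranking with an external indicator, e.g. GDP per capita,
# read from a second (hypothetical) file with columns: country, gdp_per_capita.
gdp = pd.read_csv("gdp_per_capita.csv").set_index("country")["gdp_per_capita"]
common = per_country.index.intersection(gdp.index)
rho, p_value = spearmanr(per_country.loc[common], gdp.loc[common])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

Because the analysis is rank-based, a Spearman correlation is the natural choice; swapping the input file for a domain-level or regional aggregation is how the same sketch would adapt to different spatial scales.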
