National Lists of Scholarly Publication Channels: An Overview and Recommendations for Their Construction and Maintenance
Emanuel Kulczycki | Tim C. E. Engels | Janne Pölönen | Raf Guns | Gunnar Sivertsen