Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today's Academic World

Nowadays, the importance of bibliographic databases (DBs) has increased enormously: they are the main providers of publication metadata and of the bibliometric indicators universally used both in research assessment and in researchers' everyday tasks. Because the reliability of these tasks depends first of all on the data source, every DB user should be able to choose the database best suited to their needs. Web of Science (WoS) and Scopus are the two main bibliographic DBs. Comprehensively evaluating their coverage is practically impossible without extensive bibliometric analyses or literature reviews, yet most DB users lack bibliometric expertise and/or are unwilling to invest extra time in such evaluations. In addition, the convenience of a DB's interface, its performance, the impact indicators it provides, and its additional tools may also influence users' choices. The main goal of this work is to offer all potential users an all-inclusive description of the two main bibliographic DBs by gathering in one place the findings reported in the most recent literature and the information provided by the DBs' owners. This overview should help all stakeholders who rely on publication and citation data to select the most suitable DB.
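
To make concrete what even a small-scale coverage comparison involves, the following is a minimal Python sketch, assuming each database's search results have been exported as CSV files with a DOI column (the file names wos_export.csv and scopus_export.csv and the column label are hypothetical, not part of either database's tooling). It measures the overlap of indexed records by matching DOIs, one of the simplest coverage-agreement checks used in comparative studies of this kind.

```python
import csv

def load_dois(path: str, column: str = "DOI") -> set[str]:
    """Read a database export and return its set of normalized DOIs."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            row[column].strip().lower()   # DOIs are case-insensitive
            for row in csv.DictReader(f)
            if row.get(column)            # skip records without a DOI
        }

# Hypothetical export files, one per database.
wos = load_dois("wos_export.csv")
scopus = load_dois("scopus_export.csv")

overlap = wos & scopus
union = wos | scopus
print(f"WoS only:    {len(wos - scopus)}")
print(f"Scopus only: {len(scopus - wos)}")
print(f"Overlap:     {len(overlap)}")
if union:
    # Jaccard similarity: a crude single-number coverage-agreement measure.
    print(f"Jaccard:     {len(overlap) / len(union):.3f}")
```

Note that DOI matching gives only a lower bound on the true overlap, since records lacking DOIs (common for older publications and some document types) cannot be matched this way.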
