Strategies and metrics commonly used to assess publication and research quality and impact.
Assessing Journal Credibility
Whether you are looking for credible research or deciding where to publish, these key indicators will help you assess a journal's reliability. Reputable journals, editors, and publishers in the health sciences should generally be expected to adhere to the statements of ethics put forth by COPE and/or the ICMJE. The Directory of Open Access Journals (DOAJ) offers a review of open access journal criteria and quality.
The credibility of an individual journal may be assessed by examining several key factors:
- Where is it indexed?
- Is the journal included or indexed in the major bibliographic databases for the field (e.g., PubMed, Web of Science, or Scopus)?
- Are its articles (not just one or two) discoverable where the journal claims?
- Note: Google Scholar is considered a search engine, not a reviewed database, for quality purposes, as it has no selection process or review criteria.
- What is its publishing history?
- How long has the journal been available?
- For new journals, is the journal's mission clearly stated? Who are the members of the editorial board?
- Is it peer-reviewed?
- How long does the peer review process take? Is this a reasonable time frame for a quality assessment?
Additional criteria and checklists for reviewing publisher credibility before selecting a journal to publish in can be found at Think-Check-Submit.
Finding Citation Counts
Web of Science provides information about how many times an article has been cited and by which publications. Citation counts can also be retrieved from the Scopus database, Dimensions database, or Google Scholar.
The Times Cited count can be used to follow the scholarly conversation provoked by a particular article. Citation counts are also used to determine the impact of an article or articles produced by an author.
- Search for the desired article or author in the respective database(s).
- The "Times Cited" or "Cited by" count will be listed for each article title in the result set.
- When searching for citation count summaries across multiple articles, use the "Create Citation Report" or "Analyze Results" options to the right of search results in Web of Science, or above the search results in Scopus.
Journal Impact Metrics
The Journal Impact Factor is a proprietary, statistical measure used to compare journals within a given field, based on citation data provided by Web of Science. Its calculation reflects the average number of times articles published in a journal over the previous two years were cited in the reported year. Impact Factors are published by subscription each summer in the Journal Citation Reports. Journal Impact Factors must be compared only within their specified fields; category quartile rankings or the Journal Citation Indicator, first released in 2021, should be used when comparing metrics across fields or disciplines.
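The two-year Impact Factor arithmetic described above can be sketched as follows. This is a minimal illustration with hypothetical numbers, not the proprietary Clarivate calculation, which also applies editorial rules about which items count as "citable."

```python
def two_year_impact_factor(citations_in_report_year, citable_items_prev_two_years):
    """Sketch of the two-year Journal Impact Factor:
    citations received in the report year to items published in the
    previous two years, divided by the number of citable items
    published in those two years."""
    return citations_in_report_year / citable_items_prev_two_years

# Hypothetical journal: 400 citations received in 2023 to articles
# published in 2021-2022, which together contained 200 citable items.
print(two_year_impact_factor(400, 200))  # 2.0
```

As the example shows, the metric is an average citations-per-article figure, which is why it can only be meaningfully compared among journals in the same field.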
Alternatively, the SCImago Journal Rank (SJR) is a freely available, normalized journal impact measure based on Scopus-provided citation data. Weighted field citations over a three-year period are normalized to 1.0, allowing for comparisons across fields and disciplines. A journal with an SJR of 1.0 is performing at the average for its field; those with an SJR above or below 1.0 are performing that much above or below the expected average for a given year. SJR metrics are released annually each summer.
Author Impact Metrics
The H-Index is a popular tool for determining the relative impact of an author's work by quantifying an author's cited publications.
The H-index is defined as the number of papers (H) that have each received at least H citations.
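The definition above can be computed directly from a list of citation counts. This is a simple sketch for illustration; the databases below calculate the H-index for you from their own (differing) citation data.

```python
def h_index(citations):
    """Compute the h-index: the largest h such that the author has
    h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times.
# Four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note that because Web of Science, Scopus, and Google Scholar index different sets of publications, the same author will typically have a different H-index in each source.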
To find an author's H-Index in Web of Science:
- Search for the author's name (e.g., Smith JB).
- When results appear, link to the "Create Citation Report" on the right side of the screen.
- The H-index will appear with the Citation Report, on the right side of the screen.
To find an author’s H-index in Scopus:
- From a basic document search, switch to the Authors search tab
- Enter the author’s first and last name
- Select the appropriate profile(s) to view the calculated h-index
The Relative Citation Ratio (RCR) is an article-level metric provided by the NIH Office of Portfolio Analysis’s iCite. A full description of the calculation is available from iCite. Since it is a normalized metric, summaries of an author’s work can reasonably be used to provide author-level comparisons.
Note: articles must be indexed in PubMed, have a 12-month publication history, and/or have at least 5 citations to receive a non-provisional RCR calculation.
Additional Metrics and Support
Additional metrics, including their definitions, use, limitations, and providers can be explored at the Metrics Toolkit: https://www.metrics-toolkit.org/
Our Research Impact and Publication Analysis Service can offer customized reporting, training, or consultation regarding the selection and application of appropriate metrics for your research project or group. This includes publication and citation statistical summaries, research profile support, co-authorship and network visualizations, and benchmarking. For metrics support, training, or customized reporting please contact us at Ask A Librarian or reach out to our Research Impact Informationist.
Ulrich's Global Serials Directory features information on over 300,000 periodicals.
Find information such as: Is the journal refereed (peer-reviewed)? Where is it indexed? Is it open access?