These metrics are essentially 'counting' metrics as indicators of publication productivity and impact. They form the basis for most of the other metrics used in InCites.
Web of Science Documents - The basic productivity measure. The total number of Web of Science Core Collection items for a given entity (researcher, organization, region, funder, etc.) published in the given time period, regardless of document type.
Times Cited - The basic impact measure. The total number of times at least one item in a set of Web of Science Documents has been cited.
Percent Cited - The percentage of documents in the set that have received at least one citation.
Citation Impact - The ratio of Times Cited to Web of Science Documents. This gives the average number of citations per item. This measure is more 'stable' for large sets of documents; for small sets, small changes in the number of documents can result in large shifts in citation impact.
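The counting metrics above can be sketched in a few lines. This is an illustrative example, not InCites code, using a hypothetical list of per-document citation counts:

```python
# Hypothetical citation counts for six Web of Science Documents
citations = [0, 3, 12, 0, 7, 1]

wos_documents = len(citations)                       # productivity measure
times_cited = sum(citations)                         # impact measure
percent_cited = 100 * sum(1 for c in citations if c > 0) / wos_documents
citation_impact = times_cited / wos_documents        # average citations per item

print(wos_documents, times_cited, percent_cited, citation_impact)
```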
h-index (or Hirsch index) - A measure designed to combine productivity and impact into a single number. The h-index for a set of documents is the highest number h for which the following is true: "There are h documents cited at least h times". This design limits the influence that a few highly cited papers, or a large number of rarely cited papers, can have on a single measure.
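The h-index definition above translates directly into code. This is a sketch, not InCites' implementation:

```python
def h_index(citations):
    """Largest h such that h documents are each cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cited in enumerate(counts, start=1):
        if cited >= rank:
            h = rank      # at least `rank` documents have >= `rank` citations
        else:
            break
    return h

# Five papers cited [10, 8, 5, 4, 3] times: four papers have at least
# four citations each, but not five with at least five, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```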
These metrics should only be used to directly rank or compare researchers or publications in the same research area and time period, since patterns of publication and citation practices can vary widely across disciplines.
Percentile and Normalized metrics make comparisons across journals, research areas, and organizations easier since they take into account research subject areas, publication year, and document type.
In percentile metrics, the percentile is the percent of items cited less often than the item of interest, and therefore higher percentile indicates better relative performance. (This change was made in June 2021)
Related percentile metrics are based on the papers reaching certain percentile benchmarks in the 22 Essential Science Indicators (ESI) categories, rather than the narrower Web of Science Research Areas. Arts & Humanities journals are not included in the ESI categories.
All metrics dealing with percentiles should take into consideration that analysis of small groups of records (such as a single researcher) may not have statistical significance; these metrics are more appropriately used for medium and large data sets.
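Under the post-June-2021 convention described above, a percentile can be sketched as the share of comparable items cited less often than the item of interest. The baseline list here is hypothetical; in InCites the comparison set shares the same subject area, publication year, and document type:

```python
def citation_percentile(item_citations, baseline_citations):
    """Percent of baseline items cited less often than the item of interest."""
    below = sum(1 for c in baseline_citations if c < item_citations)
    return 100 * below / len(baseline_citations)

baseline = [0, 1, 2, 2, 5, 9, 14, 30]    # hypothetical peer documents
print(citation_percentile(9, baseline))  # 5 of 8 peers cited less often -> 62.5
```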
Normalized metrics for citation counts compare the times cited for an item to the expected (average) number of citations received by other items of the same publication type, year of publication, and subject area or journal. If the normalized citation impact is a ratio less than 1, the set of items is performing poorer than expected; if it is greater than 1, the set of items is performing better than its peers.
All metrics of citation impact should take into consideration that citation counts for recent publications can be low and variable, so citation impacts may be influenced by the time period under analysis.
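A normalized citation impact for a set of documents can be sketched as follows, under the assumption that each document's expected value is the average citation count of items sharing its publication type, year, and subject area, and that the set-level figure averages the per-document ratios:

```python
def normalized_citation_impact(citations, expected):
    """Average of per-document ratios of actual to expected citations.

    A result above 1 means the set is performing better than its peers;
    below 1 means poorer than expected.
    """
    ratios = [c / e for c, e in zip(citations, expected)]
    return sum(ratios) / len(ratios)

# Two papers: one cited 12 times vs. an expected 8, one cited 4 vs. 8.
print(normalized_citation_impact([12, 4], [8, 8]))  # (1.5 + 0.5) / 2 = 1.0
```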
When analyzing collaborations, InCites provides metrics that indicate what types of collaboration exist, as evidenced by co-authored publications.
Journals are grouped into quartiles according to what percentage of journals rank lower than them when arranged by Journal Impact Factor, a measure of the average number of citations received in a two-year period. A Quartile 1 (Q1) journal ranks higher than 75% of the journals in its category, a Q2 journal ranks higher than 50% of journals, but lower than Q1 journals, etc.
Q1: 75%–100%
Q2: 50%–75%
Q3: 25%–50%
Q4: 0%–25%
InCites has several metrics that use this ranking.
Note: InCites uses the best quartile for journals that appear in multiple Web of Science Research Areas. However, when a research area is specified in an analysis, the quartile for that particular journal and research area is used.
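The quartile assignment described above can be sketched as follows, assuming journals are ranked within a category by Journal Impact Factor (JIF); the category list here is hypothetical:

```python
def journal_quartile(jif, category_jifs):
    """Quartile based on the percent of journals in the category ranking lower."""
    pct_lower = 100 * sum(1 for j in category_jifs if j < jif) / len(category_jifs)
    if pct_lower >= 75:
        return "Q1"
    if pct_lower >= 50:
        return "Q2"
    if pct_lower >= 25:
        return "Q3"
    return "Q4"

category = [0.8, 1.1, 1.9, 2.4, 3.0, 4.2, 5.5, 7.1]  # hypothetical JIFs
print(journal_quartile(5.5, category))  # ranks above 6 of 8 (75%) -> Q1
```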
Please keep in mind that since this is a ranking based on citedness of journals, not individual articles, it is not a direct measure of an individual article's or researcher's citation performance. It is only an indirect measure of journal quality or prestige and should not be used as the sole basis for evaluating a researcher or an individual publication.
When looking at an analysis result, if you click on the number of Web of Science Documents shown, you will see a report listing all of the documents that contribute to the summary statistics. The report's column headers are fixed and cannot be changed.