
Study Skills

Research skills

Online course for Cambridge researchers about publishing, managing data, finding and disseminating research.

Research metrics

Welcome to this module about the ways in which research can be measured.

Data, Metrics, Key Performance Indicators... these terms are everywhere these days, as we increasingly seek hard data to monitor and improve the quality of many of the things we do. Research metrics can be very useful, but they also come with important caveats, so we need to be responsible in how we use these tools.

In this module, you will learn:

  • The meaning of common metrics such as the Journal Impact Factor and h-index
  • The main limitations of metrics
  • A better, more responsible approach to using metrics

Types of metrics

Let’s start with the most commonly used research metrics. This video by Clair Castle at the Chemistry Library explains what they are and how they can be used.


To put this into practice, choose one article relevant to your work and follow these steps. Answers to the questions can be found at the bottom of this page. If you would like a breakdown of the steps for this activity, here is a technical guidance document.

  1. Open the article page on the publisher’s website and look for the number of citations for the article. Then search for the article title on Google Scholar and check the number of citations listed on the search results page. Are they different? Why might this be?
  2. If the journal has an altmetrics report for the article, click on it. (If not, you may consider installing a plugin such as the Altmetric bookmarklet.) What can altmetrics tell you that was not captured by citations alone?
  3. Find the home page for the journal where the article was published. Does the journal advertise its Impact Factor? What does that number tell you?
  4. Search for the name of one author of the article on Scopus. (Note that, while very comprehensive, Scopus is not perfect in all disciplines. The course module on searching the literature suggests other discipline-specific databases you may want to try). What is their h-index? What does this number tell you?



There are many more types of research metrics beyond the common ones discussed so far. If you are interested in knowing more, Taylor and Francis have produced this quick and informative one-page guide.

If you want to dig deeper into metrics, the Metrics Toolkit has pages about a variety of metrics, with detailed explanations of how they are calculated and the appropriate ways of using them.

Uses and limitations of metrics

Metrics can be used to make comparisons at different levels.


In her blog post clarifying conversations about metrics, Lizzie Gadd identifies six possible uses for metrics.

Measure to understand. “Science of science” activities that study publication patterns and trends for the sole purpose of understanding them better.

Measure to show off. “Pick me!” activities. The use of metrics to market an individual, group or university on promotional materials or grant applications.

Measure to monitor. Plotting progress against an objective, whether internally or externally set. This may include some comparison activity, as outlined below.

Measure to compare. The use of indicators to compare one entity with another. University rankings are an example of this.

Measure to incentivise. The use of indicators to incentivise certain behaviours (recognising that once you start to measure anything it can act as an incentive by default). The measurement of open access content submitted to REF is an example of this.

Measure to reward. Any activity that results in some kind of reward for the entity being measured, be this a job, promotion, grant, prize or award of any description.

However, metrics have significant limitations. This video by Thomson Reuters highlights some of the ways in which citation numbers can be misleading.


The risks associated with using metrics irresponsibly increase as one moves from large to small scale and from understanding to incentivising or rewarding. For instance, it is probably acceptable to use a simple metric to investigate how patterns of research vary between countries, but not acceptable to use a single metric to determine the hiring of individual researchers.

So what is a better alternative?

Responsible metrics

Using just one, or a few, measures to assess research quality would be unfair and would create skewed incentives that could perversely decrease the quality of the research. However, metrics still have value when understood and used appropriately. This video by Claire Sewell at the Moore Library will give you an overview of what we mean by the term ‘responsible metrics’. If you prefer, you can find the same information in this handy guide.

If you are interested in knowing more about the potential impact of irresponsible metrics and the recommendations made by the Independent Review of the Role of Metrics in Research Assessment and Management, you can read the Metric Tide report.

In 2019 the University of Cambridge signed the San Francisco Declaration On Research Assessment (DORA), committing to a responsible use of metrics in assessing research. Find out more about what this means through the slides below and on the DORA website.  

The Declaration On Research Assessment aims to improve how research is assessed. It promotes the value of all scholarly outputs and focuses on the merits of the work. Its central principle: do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions. Its recommendations include:

  • Research institutes and funders: state that the scientific content of a paper, not the JIF of the journal where it was published, is what matters; consider the value of all outputs and outcomes generated by research.
  • Researchers: focus on content; cite primary literature when you can; use a range of metrics to show the impact of your work; change the culture!
  • Publishers: cease to promote journals by Impact Factor and instead provide an array of metrics; focus on article-level metrics; identify different author contributions; open the bibliographic citation data; encourage primary literature citations.
  • Metrics providers: be transparent; provide access to metrics data; discourage data manipulation; provide different metrics for primary literature and reviews.


Types of metrics

You researched metrics for an article, journal and author. Here are the answers to the questions.

  1. Google Scholar often lists a higher number of citations for an article because it covers a broader range of publications, including, for instance, theses and grey literature, which are not included in the journal’s citation counts.
  2. Using altmetrics you can see how much attention the article received in the news, policies, social media, Mendeley, and more. You can see geographical and temporal distributions and click on each source to see the original text that mentioned the article.
  3. The Journal Impact Factor is essentially the average number of citations received in a given year by the articles a journal published over the preceding two years. This metric is proprietary, having been developed for Web of Science. It can give a rough idea of how journals compare to each other, but it has many limitations, examined in the third section of this module.
  4. A researcher’s h-index is the largest number h for which the researcher has h papers, each cited at least h times. So an author with an h-index of 35 has 35 publications that have each been cited at least 35 times (they are likely to have more than 35 publications overall, but the less-cited ones do not contribute to the h-index). It measures both productivity and citation rates (a proxy for academic impact). This too has limitations, examined in the following section.
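If you would like to see how these two numbers are computed, here is a small illustrative Python sketch. The function names and the sample citation counts are invented for this example; it is not an official tool, just the definitions above turned into code:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # remaining papers have too few citations
    return h

def impact_factor(citations_this_year, items_published_prev_two_years):
    """JIF-style ratio: citations in one year to items from the previous two years."""
    return citations_this_year / items_published_prev_two_years

# Hypothetical author with five papers:
print(h_index([10, 8, 5, 4, 3]))   # 4 papers have at least 4 citations each
# Hypothetical journal: 600 citations this year to 240 recent items:
print(impact_factor(600, 240))     # 2.5
```

Note how a single highly cited paper cannot raise the h-index on its own: the metric only grows when many papers are each well cited, which is why it is read as combining productivity with citation impact.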

© Cambridge University Libraries | Accessibility | Privacy Policy