
Research skills

Online course for Cambridge researchers about publishing, managing data, finding and disseminating research.

Research metrics

Welcome to this module about the ways in which research can be measured.

Data, Metrics, Key Performance Indicators... these terms are everywhere these days, as we increasingly seek hard data to monitor and improve the quality of many of the things we do. Research metrics can be very useful, but they also come with important caveats, so we need to be responsible in how we use these tools.

In this module, you will learn:

  • The meaning of common metrics such as the Journal Impact Factor and h-index
  • The main limitations of metrics
  • A better, responsible approach to using metrics
To complete this section, you will need:


  • Approximately 60 minutes
  • Access to the internet
  • Access to a journal article of your choice
  • Access to Scopus (all staff and students of the University of Cambridge have access to Scopus via their Raven login)
  • Some equipment for jotting down your thoughts: a pen and paper will do, or your phone or another electronic device

Types of metric

Let’s start with the most commonly used research metrics. This video by Clair Castle at the Chemistry Library explains what they are and how they can be used.
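To make the two metrics named in this module concrete, here is a minimal sketch of how they are calculated. The h-index is the largest number h such that h of a researcher's papers have each been cited at least h times; the Journal Impact Factor divides the citations a journal's last two years of articles received this year by the number of citable items it published in those two years. The citation figures below are invented for illustration only.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has enough citations for its rank
        else:
            break
    return h

def journal_impact_factor(citations_this_year, citable_items_prev_two_years):
    """Citations this year to the previous two years' articles,
    divided by the number of citable items published in those years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical researcher with five papers cited [10, 8, 5, 4, 1] times:
print(h_index([10, 8, 5, 4, 1]))  # 4 (four papers have at least 4 citations)

# Hypothetical journal: 200 citable items in the last two years,
# 500 citations to them this year:
print(journal_impact_factor(500, 200))  # 2.5
```

Even this simple sketch hints at the limitations discussed below: both numbers depend entirely on which citations the data provider happens to count.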


Activity: Let's take a look at some of the most common metrics in action.

Choose one article relevant to your work, perhaps something you have read during your studies. Follow the steps outlined in this metrics activity to take a look at some of the key metrics. Note down your answers to the questions and then select Next to reveal the answers. Don't worry - you are not being marked! 

If you would like a breakdown of the steps for this activity, you can consult the Technical Guidance Document linked below.

Uses and limitations of metrics

Metrics can be used to make comparisons at different levels. The diagram below shows some of these uses:

Diagram: elements that can be compared using metrics, from single publications through individual researchers, research groups and institutions to countries.

In her blog post clarifying conversations about metrics, Lizzie Gadd further explores some possible uses for metrics:

  • Measure to understand: "science of science" activities that study publication patterns and trends for the sole purpose of understanding them better.
  • Measure to show off: "pick me!" activities. The use of metrics to market an individual, group or university on promotional materials or grant applications.
  • Measure to monitor: plotting progress against an objective whether internally or externally set. This may include some comparison activity.
  • Measure to compare: the use of indicators to compare one entity with another. University rankings are an example of this.
  • Measure to incentivise: the use of indicators to incentivise certain behaviours. The measurement of open access content submitted to REF is an example of this.
  • Measure to reward: any activity that results in some kind of reward for the entity being measured, be this a job, promotion, grant, prize or reward of any description.

(Taken from The Blind and the Elephant: Bringing Clarity to our Conversations about Responsible Metrics by Lizzie Gadd)

However, metrics have been shown to have significant limitations:

  • Focus on numbers: metrics are numerical measures which don't always take the wider impact of the research into account.
  • Lack of consistency: each metric is calculated in a different way and often uses proprietary sources of information. This means that researchers are often given different metrics for the same piece of work.
  • Potential bias: there is a lack of consistency across disciplines and career stages which may unfairly disadvantage some researchers. For example, a metric which calculates the number of publications will reflect better on a more established researcher than a junior colleague.
  • Potential for gaming: practices such as self-citation or gift citation make it easy to manipulate the numbers.
  • Fit for purpose?: many of the most popular metrics in use today were developed for other purposes, such as helping librarians to select stock. Is it really fair that they are now used to make career decisions?

The risks associated with using metrics irresponsibly increase as one moves from large to small scale and from understanding to incentivising or rewarding. For instance, it is probably acceptable to use a simple metric to investigate how patterns of research vary between countries, but not acceptable to use a single metric to determine the hiring of individual researchers.

So what is a better alternative?

Responsible metrics

Using a single measure, or just a few, to assess research quality would be unfair and would create skewed incentives that could perversely decrease the quality of the research. However, metrics still have value when understood and used appropriately. This video by Claire Sewell at the Moore Library will give you an overview of what we mean by the term ‘responsible metrics’. If you prefer, you can find the same information in this handy guide.

If you are interested in knowing more about the potential impact of irresponsible metrics and the recommendations made by the Independent Review of the Role of Metrics in Research Assessment and Management, you can read the Metric Tide report.

In 2019 the University of Cambridge signed the San Francisco Declaration On Research Assessment (DORA), committing to a responsible use of metrics in assessing research. Find out more about what this means through the slides below and on the DORA website.  

DORA slides

  • Declaration On Research Assessment: improving how research is assessed. Promotes the value of all scholarly outputs and focuses on the merits of the work.
  • General recommendation: do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.
  • Recommendations for research institutes and funders: state that the scientific content of a paper, not the JIF of the journal where it was published, is what matters; consider value from all outputs and outcomes generated by research.
  • Recommendations for researchers: focus on content; cite primary literature when you can; use a range of metrics to show the impact of your work; change the culture!
  • Recommendations for publishers: cease to promote journals by Impact Factor and provide an array of metrics; focus on article-level metrics; identify different author contributions; open the bibliographic citation data; encourage primary literature citations.
  • Recommendations for metrics providers: be transparent; provide access to metrics data; discourage data manipulation; provide different metrics for primary literature and reviews.

Further resources

There is much to explore in the wonderful world of metrics. We have collected some of the most informative links below if you want to read further on any aspect of bibliometrics, altmetrics or responsible metrics.

The Metrics Toolkit contains lots of useful information on how traditional bibliometrics are calculated and how they can be used appropriately. 

Learn more about Altmetrics (alternative metrics) including how they are calculated and how to see the results on the Altmetric website. You can also see the metrics for your own research using the University of Cambridge login option.

The University of Cambridge has committed to the principles of DORA - the San Francisco Declaration on Research Assessment. You can read the document for yourself on the DORA website.

Did you know?

You can self-report that you have completed this module to have it added to your training record. Simply visit the booking page and register.

How did you find this Research Skills module?

We'd love to hear about your experience of this module to keep improving our offer. Please take the feedback survey; it should only take around 2 minutes to complete.

Image Credits: Image by Thomas Wolter from Pixabay

© Cambridge University Libraries