
Study Skills

Wolfson College Academic Skills: Measuring the impact of your research

Help with finding, managing and using information from the Wolfson Library Team.

What are metrics?

Research impact can be measured in many ways. Qualitative methods vary; the most important is peer review in its various forms. Quantitative measures include publication counts; research income; number of PhD students; size of research group; number of PI projects; views and downloads of online outputs; number of patents and licences obtained; and metrics, which measure patterns of authorship, publication, and the use of the literature.

These are used by researchers to decide what to read and where to publish, and also to measure how their own work has been cited. Universities might use them to judge the quality of research and to decide on strategic priorities. They tend to be more heavily used by researchers in STEM subjects, because of the citation culture in those fields and because the key data suppliers are databases rich in journal articles.

There is a bewildering array of metrics, all of which have their limitations. These include:

  • Research output metrics - to find out how often an output, or a group of outputs, has been cited by others.
  • Author metrics - to explore the impact of an author, or a group of authors, based on the citation rates of their outputs.
  • Journal metrics - to assess the impact of a journal and to compare journals in the same field.
  • Altmetrics - measures based on social media data rather than traditional publication citation counts. 
Read on to find out more about the different types of metrics, and look at these guides produced by publishers:

Research in 3 minutes: metrics

Watch this video for a quick overview of terms and concepts.


Bibliometrics

Bibliometrics is the statistical analysis of written publications. You can analyse journal titles using the well-known Impact Factor or alternatives such as Eigenfactor or SCImago. Alternatively, you may wish to analyse an individual paper or researcher (including yourself).

Journal Impact Factor (JIF)

This measure relates to journal titles and is mainly used by researchers in Science and some Social Science disciplines. It is the average number of times that articles the journal published in the past two years have been cited in the Journal Citation Reports (JCR) year, so it is a measure of recent citation activity. Clarivate, who produce JIFs, also provide data over a five-year period for fields where it takes longer to build up citations. The JIF can be an indication of how widely read and regarded a title is, but remember that citations may be made for negative reasons.


A journal that is cited once for every article that it publishes has an impact factor of 1. InCites Journal Citation Reports indexes selected journals (generally the most cited) in some fields. Niche and specialist journals are therefore not represented, regardless of the quality that an academic community assigns to them.
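The two-year calculation described above can be sketched in a few lines. The figures below are made up for illustration; real JIFs are calculated by Clarivate from JCR data:

```python
def impact_factor(citations_in_jcr_year, citable_items_prev_two_years):
    """Two-year JIF: citations received in the JCR year to items the
    journal published in the previous two years, divided by the number
    of citable items it published in those two years."""
    return citations_in_jcr_year / citable_items_prev_two_years

# Made-up example: a journal published 120 citable items over the previous
# two years, and those items were cited 300 times during the JCR year.
jif = impact_factor(300, 120)
print(jif)  # 2.5
```

Note how the "cited once per article" case falls out directly: `impact_factor(120, 120)` gives exactly 1.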

To find out more, click on the journal title in an article record in Web of Science, or connect to Journal Citation Reports.

As the JIF draws on data from certain journals indexed in the Journal Citation Reports database, it is essential to consider alternative ways of measuring the impact of a journal title.

Scimago - this ranking uses data from SCOPUS for journal and country rankings. Citation data is drawn from over 21,500 titles from more than 5,000 international publishers and from 239 countries worldwide. The data is displayed in graphs covering the last 20 years. It is possible to limit rankings to Open Access journals only, which is useful for authors looking to make their work available to as broad an audience as possible.

The Scimago Journal Ranking (SJR) indexes journals by their 'average prestige per article'. It is based on the idea that all citations are not created equal. SJR accounts for both the number of citations received by a journal and the importance or 'prestige' of the journals where such citations come from.
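The "prestige" idea can be illustrated with a toy PageRank-style iteration: each journal shares its prestige among the journals it cites, so a citation from a prestigious journal is worth more. The journal names, citation counts, and simplified algorithm below are assumptions for illustration only, not the actual SJR calculation:

```python
def prestige(citations, iterations=50):
    """Iteratively spread each journal's prestige across the journals it
    cites, in proportion to how often it cites them. Citations from
    high-prestige journals therefore count for more (a simplified,
    PageRank-style take on the idea behind SJR)."""
    journals = list(citations)
    p = {j: 1.0 / len(journals) for j in journals}
    for _ in range(iterations):
        new = {j: 0.0 for j in journals}
        for src, targets in citations.items():
            out = sum(targets.values())  # total citations src gives out
            for dst, n in targets.items():
                new[dst] += p[src] * n / out
        p = new
    return p

# Made-up citation network: cites[src][dst] = citations from src to dst
cites = {"A": {"B": 3, "C": 1}, "B": {"A": 2, "C": 2}, "C": {"A": 4}}
p = prestige(cites)
print(p)  # A ends up most prestigious: everything C publishes cites A
```

The key contrast with a raw citation count: journal A is not only cited often, it is cited by journals that are themselves well cited, so its score rises further.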


Eigenfactor - This uses the same JCR data as the Impact Factor. The differences are:

- Counts citations to journals in both the sciences and social sciences.

- Eliminates self-citations. Every reference from one article in a journal to another article from the same journal is discounted.

- Weights each reference according to a measure of the amount of time researchers spend reading the journal.

Eigenfactor scores are scaled so that the sum of the Eigenfactor scores of all journals listed in the JCR is 100. The top thousand journals, as ranked by this score, all score above 0.01. A mapping tool is available to see the flow of citations between disciplines, based on the JCR data.

Eigenfactor map
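The scaling described above is a simple normalisation, sketched below with made-up raw scores (the real raw Eigenfactor values come from an eigenvector calculation over JCR citation data):

```python
def scale_to_100(raw_scores):
    """Rescale raw journal scores so that, summed over every journal
    in the list, they add up to exactly 100 -- the convention used
    for published Eigenfactor scores."""
    total = sum(raw_scores.values())
    return {journal: 100 * s / total for journal, s in raw_scores.items()}

# Made-up raw scores for three hypothetical journals
raw = {"Journal A": 0.004, "Journal B": 0.001, "Journal C": 0.003}
scaled = scale_to_100(raw)
print(scaled)  # Journal A gets 50.0, i.e. half of all the prestige here
```

In the published data the divisor covers thousands of journals, which is why even a top journal's share is a small number like 0.01 rather than 50.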

Google Scholar Metrics provide a quick and relatively uncomplicated way of measuring the impact of journals. They are based on the h-index (see below, under author metrics, for details), and the specifics are laid out on Google Scholar's help pages.

At the document level, you can count the number of citations that a paper or book receives. This simple measurement is listed on Google Scholar and in databases such as SCOPUS and Web of Science. In general, there are more citations to a document in Google Scholar because it is a larger database, with a broader range of documents included in its index (e.g. reports, theses, less highly cited journals).
Use citation counts to make a judgement about the impact an article or author has made (although remember to compare like with like) and to build links between research that has something in common.
Find out more about using them to search for new material on our Doing a literature search tab.

The h-index

This is an author-level metric which measures the cumulative impact of an author. It is the number of published articles by an author (or department/institution) that have received at least h citations each. For example, an author with an h-index of 10 has had 10 articles published which have at least 10 citations each; they may have published many more, but the rest have fewer than 10 citations.

It is a cumulative measure and therefore disproportionately benefits researchers who have been active for longer.

It is possible to find an author's h-index on Web of Science, SCOPUS and Google Scholar.

Select an author's name in Web of Science or SCOPUS to see their h-index (in WoS you need to click on 'Create Citation Report'). On Google Scholar, only academics who have created an author profile have their citation data listed. Here you can see the profile of Wolfson fellow Anna Bagnoli, with her h-index highlighted in red.

Other author-level metrics include the g-index (which assigns more weight to highly cited articles) and the i-10 index, used only in Google Scholar (the number of publications with at least 10 citations).
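All three author-level indices can be computed from a simple list of per-paper citation counts. The citation counts below are made up for illustration:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
    return h

def g_index(citations):
    """Largest g such that the top g papers together have >= g**2 citations.
    Unlike the h-index, a few very highly cited papers keep raising it."""
    total, g = 0, 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= i * i:
            g = i
    return g

def i10_index(citations):
    """Number of publications with at least 10 citations (Google Scholar)."""
    return sum(1 for c in citations if c >= 10)

# Made-up citation counts for one author's eight papers
papers = [25, 18, 12, 10, 7, 5, 3, 1]
print(h_index(papers), g_index(papers), i10_index(papers))  # 5 8 4
```

Note how the same publication record gives different answers: the h-index (5) is capped by the mid-ranking papers, while the g-index (8) is pulled up by the two highly cited ones.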

If you want to produce a comprehensive report on your own research, explore the freely available Publish or Perish software, which uses Google Scholar data to calculate:
  • Total number of papers and citations
  • Average number of citations per paper, per author, per year
  • Average number of papers per author
  • h-Indexes
  • g-Indexes
  • The age-weighted citation rate
  • An analysis of the number of authors per paper

Problems with bibliometrics

Bibliometrics remain highly controversial as proxy indicators of the impact or quality of published research. There are many caveats to be aware of:

  • citation cultures vary enormously between disciplines; what is low-ranking in one subject may be considered high-ranking in another
  • seminal papers can skew results
  • review papers may be cited more often than original research
  • self-citations can inflate an author's counts
  • poor-quality research may be highly cited for the wrong reasons (see the now-retracted Andrew Wakefield et al. article making a link between the MMR vaccine and colitis and autism spectrum disorders)

This video demonstrates the importance of noting the possible pitfalls when using bibliometrics.


Altmetrics

Alternative metrics track the online attention that research receives. They are a more immediate reflection of the impact of work than traditional metrics, which take many years to generate meaningful data. When judging the impact of research online it is helpful to consider:

  • Context - Raw data or metrics can be interesting, but they are less useful or meaningful without the story behind them.
  • Active engagement - Lightweight engagement (e.g. likes, shares, follows) might be a valuable indicator, but it might be difficult to assert what it actually proves.
  • Evidence of engagement or impact - This means having access to screen captures or URLs, rather than descriptions.
  • Currency - It is easy for this sort of data to be deleted or buried, so record it as it happens; don't try to collect it retrospectively.

They are intended to complement traditional metrics, not replace them.


You can use built-in features such as Twitter Analytics or the statistics on your blog to see the impact you are making. Some sites list the number of times a paper has been viewed. Alternatively, there are tools which aggregate impact, such as:

Altmetric

Altmetric tracks public posts on Twitter, Facebook, Google+, Sina Weibo; mentions on 1,300 news outlets; reference managers such as Mendeley; YouTube; Wikipedia; and policy documents. It uses doughnut infographics to communicate this use. You may see it on publisher websites, such as Routledge, but you can find out the score for any article with a DOI by downloading the Altmetric Bookmarklet.

The Altmetric Attention Score is a weighted count of the attention that a scholarly article has received. It is derived from three main factors:
- Volume (the score for an article rises as more people mention it)
- Sources (each category of mention contributes a different base amount to the final score)
- Authors (how often the author of each mention talks about scholarly articles influences the contribution of the mention)
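The "weighted count" behind the first two factors can be sketched as below. The per-source weights here are invented for illustration; Altmetric's real weights differ, are not all public, and are further adjusted by who authored each mention:

```python
# Hypothetical weights per mention source -- NOT Altmetric's actual values.
WEIGHTS = {"news": 8, "blog": 5, "twitter": 1, "facebook": 0.25}

def attention_score(mentions):
    """Weighted count of mentions: each category of mention contributes
    a different base amount, so two news stories outweigh many tweets."""
    return sum(WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# Made-up attention for one article
score = attention_score({"news": 2, "twitter": 10, "facebook": 4})
print(score)  # 27.0
```

This is why two articles with the same number of total mentions can have very different scores: the mix of sources matters as much as the volume.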

More information and access is available from the Research Information web pages. Your Altmetric score will appear on the item page for anything you deposit in the university's repository, Apollo. You can also display your Altmetric Attention Score and doughnuts on your profile page.


Plum X

Plum Analytics uses alternative metrics too. In addition to mentions on social media, it also measures how many times an abstract has been viewed, link-outs, and views of the full-text article. It groups metrics into five categories: Usage, Captures, Mentions, Social Media, and Citations. You can find out more about the metrics at Plum Analytics.
The distinctive 'splat' infographic appears in results in SCOPUS. Search for an article and click on the graphic to find out about the impact that the article has made.


You can also set up automated tools to track your impact on social media, such as Impact Story. For a list, along with their pros and cons, see the handout of a session delivered by Jenni Lecky-Thompson (Philosophy), Matthias Ammon (MML), Katie Hughes (OSC), Helen Murphy (English) in November 2017. Please note that you can no longer register with Storify.

Problems with altmetrics

Just as bibliometrics have their flaws, so do altmetrics. Mentions on Twitter and in the news do not necessarily correspond to research gaining a high profile for good reasons. It may be because the research was flawed, has been disproved, contained a funny word, or was involved in some tangential way in a newsworthy story. For example, when Stephen Hawking's PhD thesis was recently made Open Access, the scale of traffic to the server caused major problems. It was this, rather than the research itself, that was reported in the press and elsewhere online. However, it isn't possible to tell this from the raw data:

Altmetric score for Stephen Hawking's PhD - Properties of expanding universes (1966)

Managing your citations

Sign up for an ORCID identifier: the Open Researcher and Contributor ID is an increasingly recognised persistent digital identifier. See the tab on Publishing your Research for more information.

Get a ResearcherID with Web of Science:  A ResearcherID can be linked to your ORCID number and facilitates citation metrics and publication tracking using Web of Science tools.  With a ResearcherID, you will be included in the Web of Knowledge author index allowing other researchers to learn more about your work and affiliations.  Sign up here.

Create a Google Scholar Citations Profile:  Google Scholar citations allows authors to track citations to their scholarly works and to calculate numerous citation metrics based on Google Scholar citation data.  By setting up a profile, you will be able to disambiguate yourself from authors with the same or similar names.

Increasing your citations

If you are interested in increasing the number of citations your work receives, you may be interested in the 33 ways mentioned in the paper Effective Strategies for Increasing Citation Frequency.

One way is to make your research more visible by making it Open Access. See the tab on Publishing your Research for more information.

Many publishers also offer ways of increasing the attention your articles receive, such as the Kudos service from Wiley.

Using social media is a great way to increase the coverage your work receives. If you aren't sure what you can share with followers and peers, check out How can I Share It?

Further resources

Metrics Toolkit - a resource for researchers and evaluators that provides guidance for demonstrating and evaluating claims of research impact. With the Toolkit you can quickly understand what a metric means, how it is calculated, and whether it's a good match for your impact question.

Snowball Metrics - a bottom-up initiative owned by research-intensive universities (including Cambridge) around the globe, to ensure that its outputs are of practical use to them, and are not imposed by funders, agencies, or suppliers of research information.

Collecting Research Impact Evidence - a guide for best practice.

© Cambridge University Libraries