
Research Impact

A guide to maximising your research influence and making sense of your metrics.

Metrics

Research can be an isolating, grueling process, and once a paper is published it is often difficult to know whether it has made an impact on the world. Different metrics can be used to report research impact, including page views, downloads, and (most commonly) citations in other publications. The traditional thinking goes that if a paper has been influential enough to be cited in another publication, the ideas it contains are of greater value.

This logic is sound, but only to an extent. Many databases and repositories offer metrics that report on a paper's impact, but these must be used with caution. Tools like OpenAlex, SciVal, and InCites (all described below) offer insight and metrics, but remember:

  • Metrics are often proprietary, limited to a single dataset owned by a company like Elsevier or Clarivate, which means some publications are excluded;
  • Recent publications will have lower citation counts, simply because of the time it takes to write and publish a paper that responds to or builds on an idea;
  • Certain publications may have higher citation rates because of their form (reviews, front matter) or specific content, as well as unethical citation practices such as citation clubs;
  • Certain metrics may be inflated, or may be difficult to compare between databases, and so should be reported with caution.

Publication-level metrics should be approached critically, but they remain an important factor in competition for employment, advancement, and awards, so you will be expected to know how your research is engaging the community.

In growing numbers, scholars are moving their everyday work to the web. Online reference managers Zotero and Mendeley each claim to store over 40 million articles (making them substantially larger than PubMed); as many as a third of scholars are on Twitter, and a growing number tend scholarly blogs. These new forms reflect and transmit scholarly impact. (Priem et al., 2)

What are Altmetrics?

Altmetrics (literally, alternative metrics) are designed to reflect the different ways scholarly output can influence the world, beyond simple citation in academic articles. Altmetrics do not replace traditional bibliometrics, but rather complement them by mining areas of influence that were previously underused.

This new and burgeoning field offers a great deal of opportunity, along with considerable areas for caution.

Altmetrics take many forms, and can grow and expand with innovations in communication tools and social media. The philosophy behind Altmetrics holds that mentions on social media, on television, and in public policy are equally valid indicators of research impact, and because API tools harvest these references soon after they are made, Altmetrics are responsive and up to date.
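As a rough illustration of that API-based approach, the sketch below queries Altmetric.com's free, rate-limited public endpoint for a single DOI. The endpoint path, the example DOI, and the response field names are assumptions drawn from Altmetric's public documentation rather than a tested recipe, so check the provider's current documentation before relying on any of them.

```python
# A minimal sketch of pulling attention data for one paper from a public
# altmetrics API. Assumption: the free Altmetric.com endpoint
# https://api.altmetric.com/v1/doi/<doi> and the field names printed below.
import requests

DOI = "10.1038/nature12373"  # example DOI, used here only for illustration

response = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=30)

if response.status_code == 404:
    # The API answers 404 when it has recorded no attention for a DOI.
    print("No altmetric attention recorded for this DOI.")
else:
    response.raise_for_status()
    data = response.json()
    # Field names are assumptions; .get() keeps the script from failing
    # if the provider renames or omits any of them.
    print("Attention score:", data.get("score"))
    print("News stories:", data.get("cited_by_msm_count"))
    print("Policy mentions:", data.get("cited_by_policies_count"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
```

The same pattern applies to other providers (OpenAlex, Crossref Event Data, publisher dashboards): request a machine-readable record for an identifier, then count the mentions it reports.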


Types of Altmetrics

Credit: "Altmetrics" from the University Library System, University of Pittsburgh is licensed under a Creative Commons Attribution 4.0 International License

Articles

  • Citations: Scopus, Web of Science, PubMed Central, and Google Scholar citations; citations in policy documents
  • Bookmarks: scholarly bookmarks on Mendeley & CiteULike; bookmarks by the public on Delicious & Pinboard; Twitter favorites
  • Discussion: peer reviews on F1000, Publons, and other post-publication peer review websites; Twitter mentions and Facebook wall posts; newspaper articles, videos, and podcasts; mentions on scholarly blog networks like ResearchBlogging
  • Shares: Twitter mentions, Facebook shares
  • Views: Pageview & download statistics from the journal website or repository where you've archived your paper

Books and book chapters

  • Citations: Web of Science and Scopus citations; Google Book citations
  • WorldCat holdings: the number of libraries worldwide that hold a copy of your book
  • Views: Pageview & download statistics from your publisher's website or the repository where you've archived your book/chapter.
  • Ratings: Amazon.com and Goodreads ratings
  • Discussion: see "Articles" above
  • Bookmarks: see "Articles" above

Data

  • Citations: Data Citation Index and Google Scholar citations
  • Views: views and downloads from Figshare, Zenodo, Dryad, ICPSR, or other subject or institutional repositories
  • Reuse: GitHub forks
  • Discussion: Figshare comments; also see "Articles" above
  • Bookmarks: see "Articles" above

Software

  • Citations: Google Scholar citations
  • Downloads: download statistics from GitHub, Bitbucket, Sourceforge, or other institutional or subject repository
  • Adaptations: GitHub forks, Bitbucket clones
  • Collaborators: GitHub collaborators
  • Discussion: GitHub gists, mentions on Twitter, Figshare comments
  • Bookmarks, Shares: see "Articles" above

Posters

  • Views: views and downloads on Figshare, Zenodo, or other institutional or subject repository
  • Discussion: Figshare comments; see also "Articles" above
  • Bookmarks, Shares: see "Articles" above

Slides

  • Views: views and downloads on Slideshare, Speakerdeck, and Figshare
  • Discussion: Slideshare and Figshare comments; see also "Articles" above
  • Shares: Slideshare embeds on other websites; mentions on Twitter, Facebook shares, LinkedIn shares
  • Likes: Slideshare and Speakerdeck likes
  • Bookmarks: see "Articles" above

Videos

  • Views: YouTube, Vimeo, and Figshare views
  • Likes/Dislikes: YouTube likes and dislikes; Vimeo likes
  • Discussion: YouTube, Vimeo, and Figshare comments; see also "Articles" above
  • Shares, Bookmarks: see "Articles" above

Areas of caution

Altmetrics do expand the scope of what counts as research impact, but that expansion means results must be analysed carefully and with caution. Remember:

  • Altmetrics focus on frequency of engagement, such as repeated sharing of a meme attached to a research piece; counting numbers alone does not take the context of that engagement into account.
  • High Altmetric numbers on social media could be attributed to negativity rather than an endorsement of the content.
  • Altmetric numbers can be skewed, or gamed by individuals who artificially inflate their numbers with bots or other forms of exaggeration.
  • Popular topics may gain more traction on social media, which disadvantages those working in more technical fields; we should be careful not to equate popularity with quality.
  • Paywalled articles may have a harder time being shared and gaining traction on social media (although this is a strong argument for Open Access).
  • There is general suspicion about the quality of social media engagement, particularly as moderation has broken down on some of the major sites, so these metrics may not be as reliable as others.

Works Cited

Priem, Jason, Dario Taraborelli, Paul Groth, and Cameron Neylon. "altmetrics: a manifesto" (2011). https://zenodo.org/records/12684249

Sgouras Jenkins, Aimee. "Altmetrics" from the University Library System, University of Pittsburgh. Licensed under a Creative Commons Attribution 4.0 International License

Impact Tools

OpenAlex, a Canadian-founded article database named for the legendary Library of Alexandria, was launched in 2022. It offers access to hundreds of millions of scholarly works (a large proportion of which are Open Access), and is most notable for striving to provide access to materials not available in paywalled rivals like Scopus and Web of Science.

As OpenAlex notes:

OpenAlex offers an open replacement for industry-standard scientific knowledge bases like Elsevier's Scopus and Clarivate's Web of Science. Compared to these paywalled services, OpenAlex offers significant advantages in terms of inclusivity, affordability, and availability.

Importantly, OpenAlex aims to capture citation patterns and publication data from a wide variety of journals, including those not covered in other indexes, such as journals in law and the humanities. For reporting purposes, OpenAlex offers insight into Sustainable Development Goals, citation data, and author-level information. It is a particularly useful database for searches involving smaller journals and Canadian content.
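This reporting data is also available through OpenAlex's free REST API. The sketch below is a minimal, non-authoritative illustration: the DOI and the email address are placeholders, and the field names follow the public documentation at https://docs.openalex.org, which should be checked before use.

```python
# A minimal sketch of pulling citation and SDG data for one work from the
# OpenAlex REST API (free, no key required). DOI and email are placeholders.
import requests

DOI = "https://doi.org/10.7717/peerj.4375"  # example DOI
EMAIL = "you@example.edu"                   # identifies you for the "polite pool"

url = f"https://api.openalex.org/works/{DOI}"
work = requests.get(url, params={"mailto": EMAIL}, timeout=30).json()

# Field names are taken from the OpenAlex documentation and may change;
# .get() avoids crashes if a field is missing.
print("Title:", work.get("display_name"))
print("Citations:", work.get("cited_by_count"))
print("Open access:", work.get("open_access", {}).get("is_oa"))
for sdg in work.get("sustainable_development_goals", []):
    print("SDG:", sdg.get("display_name"), "(score:", sdg.get("score"), ")")
```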


Click the link below to explore OpenAlex!

SciVal is a benchmarking and analytics tool, owned by Elsevier, which draws from the Scopus database (1996-onwards). Access is provided by TMU Libraries for members of the community to explore, or librarian support can be requested. Metrics are updated weekly from Scopus, and coverage includes data on editorial boards, quality, discipline coverage, publication type, retractions and additions, and metric scope.

SciVal offers perspectives on the following selected indicators:

  • Field-Weighted Citation Impact (FWCI) (see the short illustration after this list)
  • Author/total articles
  • Outputs in top percentiles
  • International collaborations
  • Sustainable Development Goals
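FWCI is commonly defined as the number of citations an output has actually received divided by the number of citations expected for outputs of the same field, publication type, and publication year, so a value of 1.0 means the work is cited exactly at the world average for comparable publications. The sketch below is purely illustrative, with invented numbers; it is not Elsevier's implementation.

```python
# A worked illustration of Field-Weighted Citation Impact (FWCI), assuming
# the standard definition: actual citations divided by the average citations
# expected for outputs of the same field, type, and publication year.
def fwci(actual_citations: int, expected_citations: float) -> float:
    """FWCI above 1.0 means the output is cited more than the world average
    for comparable publications; exactly 1.0 means cited as expected."""
    return actual_citations / expected_citations

# e.g. a 2021 article cited 18 times, where comparable articles average
# 12 citations over the same window, has an FWCI of 1.5 (invented numbers).
print(fwci(18, 12.0))  # -> 1.5
```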

For access to SciVal, click the link below.

InCites is a benchmarking and analytics tool, owned by Clarivate Analytics, which draws from the wide-ranging Web of Science database (1980-onwards). Access is provided by TMU Libraries for members of the community to explore, or librarian support can be requested. Metrics are updated monthly from Web of Science, and coverage includes data on editorial boards, quality, discipline coverage, publication type, retractions and additions, and metric scope.

InCites offers perspectives on the following selected indicators:

  • Category Normalised Citation Impact (CNCI)
  • Percentage of documents in top 1% or 10%
  • Times cited
  • h-index (see the short example after this list)
  • Sustainable Development Goals
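The h-index is the largest number h such that a researcher has h outputs cited at least h times each, while the Category Normalised Citation Impact, like the FWCI above, is a ratio of actual to expected citations, normalised by Web of Science subject category, document type, and year. The short sketch below computes an h-index from a list of invented citation counts, purely for illustration.

```python
# A short illustration of the h-index: the largest h such that the author
# has h outputs each cited at least h times. Citation counts are invented.
def h_index(citation_counts: list[int]) -> int:
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break     # counts are sorted descending, so no later paper qualifies
    return h

# Seven papers cited [25, 17, 9, 6, 4, 2, 1] times give an h-index of 4:
# four papers have at least four citations each.
print(h_index([25, 17, 9, 6, 4, 2, 1]))  # -> 4
```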

For access to InCites, click the link below.

Creative Commons License

This guide has been created by the Toronto Metropolitan University Library and is licensed under a Creative Commons Attribution 4.0 International License unless otherwise marked.
