
Research Impact Metrics

Introduction to Altmetrics

Altmetrics = alternative metrics

“Altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship.”

Altmetrics are measures of the impact of a scholarly research product based on online activity, using information beyond scholarly citations alone. They are designed to capture research impact more quickly and to recognize more types of impact. The Altmetrics Manifesto describes the purpose of Altmetrics in detail.

Altmetrics can include data on a wide range of online activities. For example, some frequently tracked Altmetrics include the number of views or downloads an article receives, mentions in social media, bookmarks in Mendeley or Zotero, and citations in Wikipedia articles and news reports.

Altmetrics are designed to account for both scholarly and social visibility. While some tools and metrics distinguish between uses within and outside academia, others combine them.

There is no single tool, institution, or company that "owns" Altmetrics. Different tools include different measures and different approaches to interpreting them.

Most Altmetrics track the impact of a single publication. Unlike scholarly citations, Altmetrics are well suited to capturing the impact of a wider variety of research products than journal articles alone, such as datasets, blog posts, videos, software, and conference presentations.

Types of Altmetrics



Different Altmetric tools track different measures of impact. The following categorization system is used by ImpactStory.



This scheme is representative of many Altmetric categorizations. It approaches the question of research impact by examining how published research is being used, and then finding measures of online activity that indicate different types of use.

Other tools use similar, but not identical, sources of Altmetric data. For example, PLOS tracks extensive metrics on articles in its journals. The suite of PLOS article-level metrics does not include Delicious bookmarks, but adds Reddit activity and use of graphics on figshare. Altmetric adds additional measures.

The following tabs discuss Altmetric categories in more detail and provide examples of how these types of usage are tracked by different tools.

Note: Some tools aggregate multiple Altmetric measures into a single total or score. Others view this as an inappropriate use of Altmetrics and instead provide separate information for each measure.
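The two reporting styles can be contrasted in a short sketch. All counts and weights below are invented for illustration; no real tool's weighting scheme is implied:

```python
# Hypothetical per-source counts for a single article.
metrics = {"tweets": 42, "mendeley_saves": 18, "news_mentions": 3}

# Approach 1: report each measure separately, leaving
# interpretation to the reader.
for source, count in sorted(metrics.items()):
    print(f"{source}: {count}")

# Approach 2: collapse everything into one composite score using
# per-source weights (these weights are made up, not any tool's).
weights = {"tweets": 0.25, "mendeley_saves": 1.0, "news_mentions": 3.0}
composite = sum(weights[s] * n for s, n in metrics.items())
print(f"composite score: {composite}")  # 42*0.25 + 18*1.0 + 3*3.0 = 37.5
```

The composite is convenient for ranking, but it buries the choice of weights; that trade-off is exactly why some providers refuse to publish a single score.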

Article views are the simplest measures of use and are most frequently tracked using HTML page views, PDF downloads, or both. Some tools also include abstract views or downloads of figures in this category.

Views are sometimes described as evidence of "potential impact." They can indicate that a publication has attracted readers' attention, but by themselves, do not necessarily indicate the meaning of that attention. Some sources call these metrics readership counts.

If an article is much more highly viewed than it is cited, it may indicate that the article has attracted attention from a diverse online audience. Students or members of the general public, for example, may benefit greatly from reading an article, but are unlikely to provide evidence in the form of formal scholarly citations.
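That comparison can be sketched in a few lines of Python. All of the counts below are invented for illustration; real view and citation data would come from the individual tools:

```python
# Invented counts for two hypothetical articles.
articles = {
    "article_a": {"views": 12000, "citations": 8},
    "article_b": {"views": 900, "citations": 30},
}

for name, counts in articles.items():
    ratio = counts["views"] / counts["citations"]
    print(name, ratio)
# article_a draws 1500 views per citation, suggesting a broad,
# largely non-citing audience; article_b, at 30 views per citation,
# looks like it is read mostly by scholars who go on to cite it.
```

The ratio is only a rough signal, not a formal metric; by itself it cannot say who those extra readers were or why they came.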


ImpactStory uses HTML page views to count views by the general public and PDF downloads to count views by scholars. Example.

PLOS reports HTML page views and PDF and XML downloads of its articles at the PLOS sites, and HTML views and PDF downloads at the PubMed Central site. Example.

Altmetric does not track page views; however, some of the publishers that incorporate it report that information separately.

Many Altmetric sources track citations as well as newer metrics of online use. In addition to citations in scholarly sources, some Altmetrics tools track citations in popular sources like Wikipedia or news articles to measure the impact of a publication with the general public.

Note: When citations are reported from more than one source (e.g., both Web of Science and Scopus), it is not accurate to simply add up those sources to create a citation total, as the same citation is likely to be duplicated in multiple tools. Ideally, the full lists of citations should be obtained and compared, and duplicates removed, to create a correct total.
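The deduplication described above amounts to a set union over the citing articles. A minimal sketch, using invented DOIs and treating each source's report as a set of citing-article identifiers:

```python
# Hypothetical DOIs of citing articles, as reported by two
# citation sources for the same publication.
scopus_citing = {"10.1000/a1", "10.1000/a2", "10.1000/a3"}
wos_citing = {"10.1000/a2", "10.1000/a3", "10.1000/a4"}

# Naive total double-counts the citations indexed by both services.
naive_total = len(scopus_citing) + len(wos_citing)
print(naive_total)  # 6

# Correct total: merge the full lists and drop duplicates (set union).
deduplicated_total = len(scopus_citing | wos_citing)
print(deduplicated_total)  # 4
```

In practice the full citing-article lists are not always available, which is why simply adding the headline numbers from different tools overstates the true count.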


ImpactStory reports popular citations in the form of Wikipedia articles as well as scholarly citations from Scopus. Example.

PLOS uses multiple sources of scholarly citations, including Scopus, Web of Science, CrossRef, and PubMed Central. Its articles' metrics pages also link to Google Scholar. Example.

Altmetric includes Scopus citations, Wikipedia citations, and mentions in news articles. Example (article metrics at the right; click the "donut" to view the full Altmetric report).

"Saved" metrics, also known as "captures," refer to measurements of the number of times an article is saved in some online tool for later use. They are tracked using bookmarking sites or reference management tools such as Mendeley.

If an article is saved, it can indicate a number of potential future uses: that the reader plans to cite it later, share it with other scholars, assign it in class, or set it aside for a future project. It can also simply mean that a user is filing it to read later.


ImpactStory reports popular saves using Delicious bookmarks and scholarly saves using CiteULike and Mendeley. Example.

PLOS uses CiteULike and Mendeley. Example.

Altmetric reports Mendeley bookmarks, but does not include them in the total Altmetric score. Example (article metrics at the right; click the "donut" to view the full Altmetric report).

"Discussed" metrics, also known as "mentions," are measures of the amount of online discussion and attention an article attracts. These metrics are often among the quickest to appear after an article's initial publication.

Metrics in this category include citations in blog posts, comments on the article itself (if the publisher's website allows them), and most measures of social media activity (e.g., Twitter mentions).


ImpactStory reports popular discussion using blogs, Twitter, and Facebook. Scholarly discussion is tracked using selected science blogs and journal comments. Example.

PLOS uses Twitter, Facebook, Reddit, Wikipedia, PLOS comments, ResearchBlogging, ScienceSeeker, and Nature blogs to track activity. Note: While ImpactStory calls Wikipedia activity an example of a popular citation, PLOS puts it in the "discussed" category. This illustrates the multiple ways in which the same emerging metrics can be interpreted. Example.

Altmetric includes a range of social media and multimedia platforms, such as Facebook, Twitter, LinkedIn, Pinterest, Google+, YouTube, and Reddit. Example (article metrics at the right; click the "donut" to view the full Altmetric report).

Metrics in this category are drawn from sites that provide expert recommendations or "post-publication peer review." The most commonly used source is Faculty of 1000 (F1000).

Because there are few data sources in this category, publications are the least likely to accumulate "recommended" metrics.


ImpactStory reports popular recommendations by tracking press articles, and scholarly recommendations via F1000. Example.

PLOS uses F1000 Prime. Example.

Altmetric does not use this category.

Science Librarian

Brent Tweedy
Bizzell Memorial Library 235
(405) 325-5287