Research Impact Challenge at Bradley University

This guide contains 10 activities for researchers to better understand and manage their online scholarly presence, as well as the impact and reach of their research.

Day 9: Alternative Metrics

Welcome to Day 9 of the U-M Library Research Impact Challenge!

Yesterday we looked at the h-index, a calculation of author productivity and impact. Today we’ll look beyond citation-based metrics, considering the ways that alternative metrics (or “altmetrics”) can add to the picture of what we know about the impact of scholarship.

Let’s get started!

Background

The term “altmetrics,” coined in 2010 in “altmetrics: a manifesto,” is defined there as “the creation and study of new metrics based on the social web for analyzing, and informing scholarship” (Priem, Taraborelli, Groth, & Neylon, 2010).

It’s easy to assume that altmetrics are all about social media (people tend to think of Twitter in particular), but that is only part of what they offer. By tracking links from all kinds of websites back to scholarly research, altmetrics can reveal references to and engagement with scholarship in the news, in policy documents, in syllabi, on scholarly blogs, and beyond.

Today’s challenge introduces you to a tool called Altmetric Explorer for Institutions. It’s important to note that Altmetric Explorer—a proprietary tool from a company called Digital Science—is by no means the only source for altmetric data. However, folks affiliated with U-M have access to the Altmetric Explorer, and it has some interesting features that make it easy for you to track and share information about how the web (and the world) is interacting with the research that is important to you.

About Altmetric (with a capital 'A')

First, some background about what Altmetric does and how it works:

Altmetric searches the web for "mentions" of research outputs, such as journal articles or book chapters, to show how readers are engaging with scholarly publications online. Mentions can appear in social media, scholarly blogs, news outlets, Wikipedia, citation managers like Mendeley, and more (read more about which sources Altmetric is tracking).
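Mention data like this can also be retrieved programmatically. As a minimal sketch, the snippet below uses Altmetric's free public REST API (the v1 DOI endpoint, which needs no key for basic lookups, though rate limits apply); the field names pulled out in `summarize` are taken from that API's JSON response, but treat the exact keys as an assumption and check the API documentation before relying on them.

```python
import json
import urllib.request

# Base URL of Altmetric's free public REST API (v1).
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the public Altmetric API URL for a given DOI."""
    return API_BASE + doi

def summarize(record: dict) -> dict:
    """Pull a few engagement counts out of an Altmetric API response.

    Key names follow the v1 API's JSON response (an assumption worth
    verifying against the docs); missing counts default to 0.
    """
    return {
        "title": record.get("title"),
        "score": record.get("score"),  # the Altmetric Attention Score
        "news": record.get("cited_by_msm_count", 0),
        "blogs": record.get("cited_by_feeds_count", 0),
        "tweets": record.get("cited_by_tweeters_count", 0),
    }

def fetch_summary(doi: str) -> dict:
    """Fetch and summarize Altmetric data for one DOI (requires network)."""
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return summarize(json.load(resp))
```

Calling `fetch_summary()` with a DOI that Altmetric tracks would return a small dictionary of engagement counts for that output; if Altmetric has never seen the DOI, the API returns a 404 instead.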

You may have seen Altmetric donuts, badges, and scores on journal websites, perhaps even attached to your own research. Each stripe of color on the donut represents a different type of engagement. For example, light blue indicates Twitter, red indicates news, and yellow indicates blogs. If you hover over the donut, you'll see an abbreviated summary of engagement with the work.

[Screenshot: hover menu breaking down mentions by category]

You can click on the donut to view the Altmetric details page for that item and learn more about every individual mention.

The Altmetric Attention Score (the number inside the donut) is a proprietary figure generated by counting the mentions of a work and weighting them by type. Altmetric describes the Attention Score as an "indicator of engagement"; it says nothing about the quality of the work. To learn more, see Altmetric's support page, "How is the Altmetric Attention Score calculated?"
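To make the "counting and weighting" idea concrete, here is a toy calculation. The weights below are purely illustrative, not Altmetric's actual defaults, and the real score also involves normalizations (for audience, source reliability, and so on) documented on Altmetric's support page; this is a sketch of the principle, not the real formula.

```python
# Purely illustrative weights: a news story "counts" for more than a tweet.
# These are NOT Altmetric's real defaults; see their support page for those.
ILLUSTRATIVE_WEIGHTS = {"news": 8, "blog": 5, "tweet": 1}

def toy_attention_score(mentions: dict) -> int:
    """Weighted count of mentions by type: the principle, not the real formula."""
    return sum(ILLUSTRATIVE_WEIGHTS.get(kind, 1) * count
               for kind, count in mentions.items())

# 2 news stories, 1 blog post, and 10 tweets:
# 2*8 + 1*5 + 10*1 = 31
```

The takeaway is that two works with the same number of mentions can have very different scores depending on where those mentions appear.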


What next?

Learn more: 

  • Where it all began: Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. 26 October 2010.
  • The state of the art: Erdt, M., Nagarajan, A., Sin, SC.J. et al. (2016). Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media. Scientometrics 109: 1117.
  • From 2013 to 2020, Altmetric published an annual list of the 100 research outputs from the previous year that received the most online attention. The list, and the many responses to it, offer an interesting perspective on what research captures people’s attention and how this information can be analyzed creatively to draw new inferences about the scholarly conversation, along with plenty of healthy skepticism about altmetrics altogether. For context, I also recommend the commentary published in 2018 about what we can learn from this list.
  • As of 2021, Altmetric no longer produces an annual "Top 100" list. Instead, visit their series of how-to blog posts with tips and tricks for maximizing Altmetric scores.

Preparing for the next challenge: 

Congratulations! You’ve completed Day 9 of the Library Research Impact Challenge, and we’re almost to the finish line—just one more day to go! Tomorrow, we’ll aim to prepare you to take your new knowledge and skills back out into the world by introducing frameworks for the responsible and ethical application of research impact metrics.