
April 23, 2015 Volume 36, No. 28

Social media metrics might help researchers measure scholarly impact

Use of alternative metrics becoming more common in academia


Illustration courtesy of MU Libraries.

YouTube tracks how many views a video receives and offers viewer feedback through comments sections and thumbs-up and thumbs-down prompts.

What if something similar were available to track online readership of scholarly works? A University of Missouri professor could know instantly how widely his or her research is reaching the general public, and perhaps even other scholars.

Does this sound like a crude method of judging scientific impact? Strong views are held on both sides. Even so, many administrators, scientists and librarians at institutions across the country are curious about its potential.

As online social media explodes, many of its tools are being examined for measuring scholarly impact. In 2010, the term “altmetrics” (a contraction of “alternative metrics”) came into use to describe tallying activity on online social media platforms to measure the impact of professors’ research papers and articles on the general public.

Altmetrics quantify a scholar’s online research reach by tracking mentions of a work in news outlets and science blogs, the number of times it is downloaded, and shares on Twitter, Facebook and other social media platforms.

What altmetrics don’t do well is tally scholarly citations, which is what science organizations are mostly interested in. “It tracks your research outside the scholarly world,” Gwen Gray, an MU Libraries social sciences librarian, said during an April 10 altmetrics workshop at Ellis Library.

MU Libraries devotes an entire web section to altmetrics, including listing providers and digital repositories, such as the University of Missouri’s MOspace, that track usage.

During the workshop, librarians showed a social media score for a research paper written by a University of Missouri professor, generated by the aptly named provider Altmetric. After they fed in the professor’s name and subject matter, Altmetric showed that the work had been picked up by 36 news outlets, mentioned in 44 Facebook posts, linked from six science blogs and tweeted about 125 times.
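For readers curious about how such a lookup works behind the scenes, here is a minimal Python sketch that queries Altmetric’s free public API for a single paper by its DOI. The DOI shown is a placeholder, and the endpoint and field names are assumptions drawn from the service’s public documentation; they may differ from what the librarians demonstrated in the workshop.

```python
import json
import urllib.request

# Hedged sketch: look up one paper in Altmetric's public v1 API by DOI.
# The endpoint and JSON field names are assumptions based on the service's
# public documentation; check current Altmetric docs before relying on them.
DOI = "10.1371/journal.pone.0000000"  # hypothetical DOI, for illustration only

url = f"https://api.altmetric.com/v1/doi/{DOI}"
with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))

# Pull out the same kinds of counts the librarians showed:
# news outlets, Facebook posts, science blogs and tweets.
print("News outlets:", data.get("cited_by_msm_count", 0))
print("Facebook posts:", data.get("cited_by_fbwalls_count", 0))
print("Blogs:", data.get("cited_by_feeds_count", 0))
print("Tweets:", data.get("cited_by_tweeters_count", 0))
print("Altmetric score:", data.get("score", 0))
```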

Tracking social media hits has its advantages for researchers, the librarians said. But the results can be ambiguous. Demographic information is not available about the people sharing on social media or downloading papers. “It may not measure scholarly impact — what scholars are saying about your work,” said Janice Dysart, an MU Libraries science librarian.

Also, altmetrics don’t dig past quantification. Tweets about a work might be disparaging or positive. Downloads can be counted, but the providers say nothing about how the downloaded papers are used. The media score can also be manipulated, for example by having friends and colleagues share a tweet over and over.

Even so, services in this nascent field are likely to get better at measuring impact, and clear utility for university administrators, researchers and librarians may soon be at hand.

“I don’t see how it could hurt to flesh out this area,” said Anne Barker, MU Libraries head of research services. 

Learn more about altmetrics from MU Libraries. Or email MU librarians:

Anne Barker, barkera@missouri.edu

Janice Dysart, dysartj@missouri.edu

Gwen Gray, grayg@missouri.edu