Evaluation with Altmetrics
Altmetrics arise from the need, and the possibility, of developing and adopting metrics that capture how articles are used and not only how they are cited. These alternative metrics enable the analysis of network behavior (coauthorship networks, knowledge networks, institutional networks, project networks, etc.). Altmetrics should not be viewed as alternatives but rather as complementary metrics for evaluating the dissemination of science and its impact based on the use of publications. Thus, the planning of the dissemination of scientific production can rely on robust platforms, accelerating the availability of and access to information and allowing knowledge to reach diverse audiences.
Acronym | Bibliometric indicators
P | Number of publications in WoS-covered journals of a specific entity in a given time period
C | Number of citations, without self-citations
CPP | Average number of citations per publication, without self-citations
Pnc | Percentage of publications not cited
JCS | Journal Citation Score: average journal impact of the journals used by the specific entity, without self-citations
FCS | Field Citation Score: average field-based impact, used as an international reference, without self-citations
CPP/JCS | Comparison of the impact actually received by the specific entity with the worldwide average
Source: Van Raan (2012, p. 458)
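To make these indicators concrete, the short sketch below computes P, C, CPP, Pnc, JCS, and the CPP/JCS ratio for a small, invented set of publication records; the data and field names are illustrative only and are not drawn from Van Raan (2012).

```python
# Illustrative sketch: computing the bibliometric indicators above for a
# hypothetical entity. The publication records are invented for demonstration.
publications = [
    # citations received (excluding self-citations) and the average impact
    # of the journal in which each publication appeared
    {"citations": 12, "journal_citation_score": 4.0},
    {"citations": 0,  "journal_citation_score": 2.5},
    {"citations": 7,  "journal_citation_score": 3.1},
    {"citations": 3,  "journal_citation_score": 3.1},
]

P = len(publications)                                              # number of publications
C = sum(p["citations"] for p in publications)                      # citations without self-citations
CPP = C / P                                                        # average citations per publication
Pnc = 100 * sum(p["citations"] == 0 for p in publications) / P     # share of uncited publications (%)
JCS = sum(p["journal_citation_score"] for p in publications) / P   # average journal impact

print(f"P={P}  C={C}  CPP={CPP:.2f}  Pnc={Pnc:.0f}%  JCS={JCS:.2f}")
print(f"CPP/JCS={CPP / JCS:.2f}")  # >1: the entity's impact exceeds its journals' average
```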
Table 5.7 presents some tools for data collection, aggregation, and compilation. These tools are organized according to five categories of sources: (1) coverage (articles, books, etc.), (2) usage (access and downloads), (3) citations (in several databases), (4) captures, and (5) social media.
Thus, it is possible to obtain statistical information about scientific production and impact at the institutional level that goes beyond citation studies. Impact can also be observed in the use of publications on various platforms, specifically through information on access to publications and on the number of downloads; it is also possible to know the geographic distribution of users. Citations can be measured not only in the traditional way but also through other dissemination channels, such as videos, posts, and blogs, among others. The individual academic profile of a researcher can likewise be enriched with information about other products of his or her activity. Altmetrics therefore yield an immediate view of the visibility of publications, which allows knowledge managers to gauge some of the social impacts of science. The potential of these metrics is still limited by familiar problems: the lack of standardization of the data and the absence of a clear definition of what each indicator represents.
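As a simple illustration of the usage statistics mentioned above, the sketch below aggregates download events by article and by country; the event records and their field names are assumptions made for demonstration and do not reflect the export format of any specific platform.

```python
from collections import Counter

# Hypothetical usage events, in a shape a platform might export (fields assumed).
download_events = [
    {"doi": "10.1000/xyz123", "country": "BR"},
    {"doi": "10.1000/xyz123", "country": "BR"},
    {"doi": "10.1000/xyz123", "country": "PT"},
    {"doi": "10.1000/abc456", "country": "US"},
]

downloads_per_article = Counter(event["doi"] for event in download_events)
downloads_per_country = Counter(event["country"] for event in download_events)

print(downloads_per_article.most_common())  # most downloaded items
print(downloads_per_country.most_common())  # geographic distribution of users
```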
Due to technological developments, new metrics based on Internet resources have emerged.
Table 5.7 Article-level metrics tools and main categories of sources for aggregation of information

ALM-PLoS (www.plosone.org/static/almInfo#static-content-wrap)
Coverage: Papers from PLoS
Usage: PLoS and PubMed Central
Citations: PubMed Central, Scopus, ISI Web of Science, and CrossRef
Captures: CiteULike, Mendeley, Reddit, Google+, StumbleUpon, Connotea
Social media: Twitter, Facebook, Google Blogs, ResearchBlogging.org, Nature Blogs

Altmetric (www.altmetric.com)
Coverage: Scholarly articles
Usage: PubMed, ArXiv, or pages containing a DOI
Citations: Scopus, Web of Science, CrossRef
Captures: CiteULike, Mendeley
Social media: Twitter, Facebook, blogs, YouTube, Google+, Pinterest, Wikipedia, Weibo users, Redditors

ImpactStory (impactstory.org)
Coverage: All research products (journal articles, blog posts, data sets, software, etc.)
Usage: PLoS, PubMed, ArXiv, SlideShare, Vimeo, YouTube, Dryad package views, Figshare views, webpages (from ImpactStory), ScienceSeeker, ORCID
Citations: Scopus, Web of Knowledge, HighWire, Google Scholar Citations, PubMed
Captures: CiteULike, Mendeley, CrossRef, Vimeo, Figshare, GitHub, SlideShare, YouTube, Delicious
Social media: Twitter, Facebook, blogs, Figshare, Wikipedia, Vimeo, YouTube, SlideShare, Delicious, GitHub

Plum Analytics (www.plumanalytics.com)
Coverage: Journal articles, books, videos, presentations, conference proceedings, data sets, source code
Usage: EBSCO, PLOS, bit.ly, Facebook, GitHub, Dryad, Figshare, SlideShare, institutional repositories, WorldCat
Citations: CrossRef, PubMed Central, Scopus, USPTO
Captures: CiteULike, Delicious, SlideShare, YouTube, GitHub, Goodreads, Mendeley, Vimeo
Social media: Facebook, Reddit, SlideShare, Vimeo, YouTube, GitHub, StackExchange, Wikipedia, SourceForge, Research Blogging, Science Seeker, Amazon, Google Plus, Twitter via DataSift

Source: Melero (2015)
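To show how such an aggregator can be queried programmatically, here is a minimal sketch against Altmetric's public DOI lookup endpoint (api.altmetric.com/v1/doi/<doi>); the endpoint, the placeholder DOI, and the JSON field names used are assumptions based on the API's public documentation and should be verified before use.

```python
import json
import urllib.error
import urllib.request

# Minimal sketch: fetching article-level metrics for a single DOI from an
# aggregator. Endpoint and field names follow Altmetric's public DOI lookup
# as commonly documented, but treat them as assumptions and check the current
# API documentation. The DOI below is only a placeholder.
doi = "10.1234/example-doi"  # placeholder; substitute a real DOI
url = f"https://api.altmetric.com/v1/doi/{doi}"

try:
    with urllib.request.urlopen(url, timeout=10) as response:
        data = json.load(response)
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count", 0))
    print("Facebook posts:", data.get("cited_by_fbwalls_count", 0))
except urllib.error.HTTPError as err:
    # A 404 response means the aggregator has no attention data for this DOI.
    print("No altmetric data available:", err.code)
```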

Fig. 5.2 Measure of impact. (Source: Melero (2015))
These platforms continuously collect information not only from bibliographic databases but also from the use made of the items they index. On the user side, it is possible to interact with a platform at several levels: searching for and selecting articles relevant to one's research, reading and saving them (on a hard drive or in the cloud), and, through Web 2.0 features, rating, commenting on, recommending, and sharing them. All of these user actions are registered automatically and feed the metadata of each item accessed. Thus, an article's initial data (journal, year, title, authors, abstract, and keywords) are progressively enriched with access and usage data.
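As an illustration of how an item's initial bibliographic data can be enriched with access and use data, the sketch below defines a simple record that accumulates user events; the structure, field names, and event types are invented for illustration and do not describe any particular platform's data model.

```python
from dataclasses import dataclass

@dataclass
class ArticleRecord:
    # Initial bibliographic data, available at publication time.
    journal: str
    year: int
    title: str
    authors: list
    keywords: list
    # Usage metadata accumulated as users interact with the platform.
    views: int = 0
    downloads: int = 0
    saves: int = 0   # saved to a hard drive, the cloud, or a reference manager
    shares: int = 0  # rated, recommended, or shared via Web 2.0 features

    def register_event(self, kind: str) -> None:
        """Record one user action; every interaction enriches the item's metadata."""
        if kind == "view":
            self.views += 1
        elif kind == "download":
            self.downloads += 1
        elif kind == "save":
            self.saves += 1
        elif kind == "share":
            self.shares += 1

article = ArticleRecord("PLoS ONE", 2015, "An example title", ["A. Author"], ["altmetrics"])
for event in ("view", "view", "download", "share"):
    article.register_event(event)
print(article)  # initial data now enriched with usage counters
```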
The shift in publication from the analog to the digital world allows not only shorter publication times but also rapid distribution and global reach. These changes also have an impact on metrics. As shown in Fig. 5.2, the main differences between traditional metrics and altmetrics can be summarized along two axes. The first axis concerns the granularity of the measure: the entities of traditional impact metrics are journals, whereas altmetrics increase granularity because impact is observed at the level of articles and authors (a disaggregation from journals as entities). The second axis is the time dimension: altmetrics provide impact data in real time (immediately), whereas traditional methods need much longer periods before impact can be evaluated.
It is also necessary to understand what each indicator measures: a citation of an article is different from a download of that article. Despite the difference, one can expect an association between highly downloaded and highly cited articles, but these concepts should not be confused with the traditional impact indicator. Nevertheless, such indicators can contribute to defining personal and institutional strategies for the dissemination and disclosure of production. Using such tools in association with institutional repository policies and networking will generally lead to more intense use of an institution's scientific production.
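The association suggested above between downloads and citations can be explored empirically, for instance with a rank correlation; the sketch below computes Spearman's rho on a small set of invented counts, so both the data and the choice of coefficient are illustrative assumptions rather than part of the original analysis.

```python
# Exploratory sketch: rank correlation between download and citation counts
# for a small, invented sample of articles (values are illustrative only).
downloads = [510, 120, 980, 45, 300]
citations = [14, 3, 22, 2, 1]

def ranks(values):
    # Rank positions (1 = smallest); ties are not handled, which suffices here.
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0] * len(values)
    for rank, index in enumerate(order, start=1):
        result[index] = rank
    return result

def spearman(x, y):
    # Spearman's rho for untied data: 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

print(f"Spearman rho between downloads and citations: {spearman(downloads, citations):.2f}")
```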