The last time I trawled the scholarly ‘sea’ for relevant, quality research was when I was studying for my Postgraduate Diploma/MEd around 5 years ago. Back then, citation and impact factor, government papers and contemporary, trending theorists and topics were the way to navigate, assess the waves and hopefully make a good catch.
But nowadays there are ‘donuts.’ Woven, colourful donuts that visualise the online ‘attention’ that scholarly articles in journals attract. And before you start thinking that I’ve had a senior moment and mixed up my home baking blog with this one, I am in fact referring to the donut-style visualisation from Altmetric.com, a company that has ‘created and maintained a cluster of servers that watch social media sites, newspapers, government policy documents and other sources for mentions of scholarly articles,’ bringing all that recognition together to formulate article-level metrics or “alternative metrics.”
Altmetric.com presents a very user-friendly ‘Explorer’ interface for search and analysis using the Altmetric API (also available to scholars and developers), a bookmarklet that you can drag to your browser’s bookmarks bar to report on the attention received by research you visit online, and embeddable ‘donut’ or label badges that denote online impact on users’ article pages. The two previously highlighted links provide simple overviews, as does the embed below.
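For the curious, the same data behind the Explorer and bookmarklet can be reached programmatically. Here is a minimal sketch in Python; the v1 DOI endpoint and the response fields `title` and `score` are my assumptions about the public API, not details taken from the post above, so treat them as illustrative:

```python
# Sketch of looking up one article's altmetric attention by DOI.
# ASSUMPTION: the endpoint "https://api.altmetric.com/v1/doi/<doi>" and
# the JSON fields "title" and "score" are illustrative, not verified here.
import json
from urllib.request import urlopen

API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the lookup URL for a DOI (no network access needed)."""
    return API_BASE + doi

def summarise(record: dict) -> str:
    """Condense a (hypothetical) Altmetric JSON record into one line."""
    return "{title}: attention score {score}".format(
        title=record.get("title", "unknown"),
        score=record.get("score", "n/a"),
    )

def fetch_summary(doi: str) -> str:
    """Fetch and summarise live data (requires internet access)."""
    with urlopen(altmetric_url(doi)) as resp:
        return summarise(json.load(resp))
```

So `altmetric_url("10.1038/480449a")` simply builds the lookup address, while `fetch_summary` would perform a live request; the embeddable badges mentioned above are, in effect, a prettier front end to this kind of lookup.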
Besides Altmetric.com, there are a variety of websites and projects that calculate online impact, such as ImpactStory, Plum Analytics, the Public Library of Science (PLoS) and Publish or Perish. In turn, publishers have begun providing such information to readers, including the Nature Publishing Group, Elsevier and (again) the Public Library of Science.
The evolving field of altmetrics provides article-level data. This is in contrast to the traditional bibliometric, journal-level, citation method, which has received criticism for its quantitative bias, its slowness to reveal impact and its openness to manipulation.
As the altmetrics method uses a range of data sources, it is suggested that it can provide qualitative as well as quantitative information, and it aspires to give a more finely tuned picture of an article’s influence. It also has the potential advantage of constructing that picture at a much greater speed than that of academic publishing.
However, as altmetrics are still in their infancy, there is not as yet a shared view on what choices, analyses or data combinations are a reliable indicator of influence. In addition, there is debate on the correct conduct within and across Twitter, blogs and other social media sources. Altmetric.com themselves comment in their blog that ‘Each altmetrics tool will have its own way of handling suspicious activity,’ and that they use ‘a combination of automatic systems and manual curation,’ which takes much time and effort, so the company also requests that users aid monitoring and report anything unusual.
In terms of addressing scholarly consistency and widening the access to and impact of research, Ernesto Priego comments on the need for curating and maintaining an academic audience on Twitter, so that a tweeted article is propelled to optimum reach. A ‘yin yang’ synergy of qualitative and quantitative methods is also argued for, with one informing and the other tempering, culminating in a fair and hopefully trustworthy measure.
Finally, just as assessment has always needed moderation in my familiar world of education and teaching, so research assessment needs agreed standards of quality to bring excellence and consistency to practice. DORA (the San Francisco Declaration on Research Assessment, which Altmetric.com has signed) currently provides recommendations for academic institutions, funding agencies and organisations that supply metrics, reminding us in its 2012 report that it is ‘imperative that scientific output is measured accurately and evaluated wisely.’
Lovely donuty tree picture source: Natasha Wescoat, ‘wescoatart’