A quick post about altmetrics, which we looked at in the DITA lecture and lab session this week. Alternative metrics (to give them their full name) are, as the name suggests, an alternative to traditional metrics. Traditional metrics, within an academic research context, tend to focus on quantifiable measurements, most often the number of citations an article receives in other articles. Alternative metrics start from the premise of measuring something more qualitative: the social impact that the article or research might have. As well as this change in emphasis (or perhaps because of it), the scope of the statistics widens, from the relatively narrow outlook of the discipline (or related disciplines) the article sits within, to a much wider, potentially global community. The measurements are primarily made (or at least collected) through ‘mentions’ (a mention being defined as a link to the article embedded in a text) on various platforms – from broadcast media through to Twitter, Facebook and others.
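As a toy illustration of that definition (and nothing like how a real altmetrics provider actually works), counting ‘mentions’ amounts to scanning posts for an embedded link to the article and tallying by platform. The sample posts and article URL below are entirely made up:

```python
from collections import Counter

def count_mentions(article_url, posts):
    """Tally mentions per platform: a 'mention' here is any post whose
    text contains a link to the article (a deliberately crude sketch)."""
    tally = Counter()
    for platform, text in posts:
        if article_url in text:
            tally[platform] += 1
    return tally

# Hypothetical sample data, for illustration only.
posts = [
    ("twitter", "Great read: https://doi.org/10.1000/example"),
    ("twitter", "An unrelated post about something else"),
    ("facebook", "Sharing https://doi.org/10.1000/example with friends"),
]
print(count_mentions("https://doi.org/10.1000/example", posts))
# → Counter({'twitter': 1, 'facebook': 1})
```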
A manifesto (always a welcome thing) by the instigators of the idea, setting out their vision for what altmetrics could be, can be found here. In it they also, with admirable honesty, acknowledge the notable question mark that hangs over how helpful altmetrics ultimately might be –
“Researchers must ask if altmetrics really reflect impact, or just empty buzz.”
Just because something is ‘mentioned’, it obviously doesn’t follow either that it has really had an impact (the author might just have lots of friends who have helpfully shared their work – or, more cynically, the services of a ‘click farm’), or that what has made an impact is necessarily the most impactful work in the field (there could be other research, less well publicised, sitting out there, unread). While the scope of the counting in this form of metrics has undoubtedly been widened, the fact remains that counting is still taking place – it is just based on a different form of citation.
Companies such as Altmetric, whose services we used in the lab session this week, have developed, and continue to develop, the tools required to manipulate and use this data, and it’s clear that this is a path that should be followed (the idea that metrics taking into account a wider scope of impact should be thought of as ‘alternative’ at all seems quite incredible: it will surely become the ‘standard’/’traditional’ way fairly quickly, if it hasn’t, in some disciplines at least, already). Tools to analyse the context of the sharing – the semantic content of the tweet, post, article etc. – will greatly enhance what can be done, and (importantly) the validity (real or perceived) of the results that come out of it.
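For context, Altmetric exposes a public API for looking up an article’s attention data by DOI. A minimal sketch of using it might look like the following – the v1 DOI-lookup endpoint is real, but the exact response field names in `summarise` are my assumptions about the response shape, and the DOI would be whatever article you’re interested in:

```python
import json
import urllib.error
import urllib.request

def fetch_altmetrics(doi):
    """Look up an article's attention record by DOI via Altmetric's
    public v1 API. Returns the parsed JSON, or None if the DOI has
    no record (the API responds with an HTTP error in that case)."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)
    except urllib.error.HTTPError:
        return None

def summarise(record):
    """Pull a few headline counts out of a response record.
    NOTE: these field names are assumptions, not documented fact."""
    return {
        "score": record.get("score"),
        "tweets": record.get("cited_by_tweeters_count", 0),
        "news": record.get("cited_by_msm_count", 0),
    }
```

`summarise` is kept separate from the network call so the parsing can be exercised (or swapped out) without touching the API at all.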
As I found in the lab session, while they clearly haven’t reached their full potential, ‘altmetrics’ are already very useful as another tool (but not the only tool) for finding information, and for finding it in a different way. Once they have reached that potential they will surely lose their first three letters and become just ‘metrics’ – they will be the norm. Going back to that manifesto again, which I really did quite like:
“No one can read everything. We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.”