Citations and bibliometrics
One of the key ways of enhancing the academic impact of one’s work is to increase the number of times it is read and cited by other researchers. This note sets out some guidelines on the significance of citations to publications and some suggestions for improving the Faculty’s citation performance.
Citations to published work can come from a variety of sources such as newspaper articles and even patents, but for the purposes of this note the discussion will be confined to the references researchers append to their published papers to indicate earlier work which is in some way significant for their own. The main concern here will be the growing use of citations as an indicator of research impact and hence ways in which we can increase Manchester’s performance on this dimension.
The relevance of this note will be more immediate for those in social sciences than for those in arts subjects because citations are normally derived from journal articles rather than books or chapters within edited works and the former are better covered in the databases. However, the growth of electronic publication means that citation to books and book chapters is becoming more common. The European Science Foundation has been developing a European Reference Index for the Humanities to try to address the inadequacy of the Arts and Humanities Citation Index.
Many will contest the validity of bibliometric/scientometric analyses of citations but the fact is they are already widely used and institutionalised. The Treasury’s Ten Year Investment Framework for Science and Technology which underpins Research Council funding uses the UK’s relative citation performance as its first measure of performance; the Research Excellence Framework uses citations to a limited extent in the sciences as a measure of research quality; the Shanghai Jiao Tong Academic Ranking of World Universities (which is formally part of this University’s objectives) awards 60% of its scores to three bibliometric indicators.
The University independently commissions work on our bibliometric performance based on our share of the world’s most highly cited papers. In addition, citations, where appropriate, are used as a promotion criterion.
The use of these measures is more than an academic exercise; increasingly they will be used both directly and indirectly to drive funding decisions.
Two reasonably clear guides to using bibliometrics (and some of the limitations) are:
- The use of bibliometrics to measure research quality in UK higher education institutions a report by Evidence Ltd for UUK (tailored to the UK situation)
- White Paper – Using Bibliometrics, produced by Thomson Reuters
Apart from Web of Knowledge, there are two other main sources used for bibliometrics: Scopus and Google Scholar.
All are freely available at the University. Scopus claims wider coverage than ISI and a more Eurocentric outlook. Google Scholar is in many ways more useful for the social sciences and the arts and humanities because it picks up more citations from conference papers, official reports and so on, but on the other hand it does not draw upon a peer-reviewed database.
A useful analytical tool for producing individual profiles from Google Scholar, Harzing’s Publish or Perish, can be freely downloaded. If using this, be sure to uncheck irrelevant references by people with the same name (known to bibliometricians as homonyms!).
Many indicators can be derived from bibliometrics but the two most important are:
- Publication count or total number of publications – this counts the total number of publications in a given list of journals, normally those in the ISI database (which is itself mainly based upon the journals receiving the most citations, i.e. those with a high Journal Impact Factor). This is used for the Shanghai rankings. Papers in the Social Science Citation Index receive a double weighting compared to those in the Science Citation Index to reflect the fact that fewer outputs are produced in our subjects.
- Citations per paper – sometimes described as the ‘crown indicator’, this calculates the average number of citations per paper, or the distribution of these citations, for a given institution, country etc. To allow for the fact that there are very different populations of researchers in different fields, and for the accumulation of citations over time, this is usually normalised by dividing citations per paper by the average impact of all papers in the given field in the year of publication. Normalisation also differentiates by paper type, measuring reviews against other reviews and original research articles against their equivalents.
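The normalisation described above can be made concrete with a small sketch. All of the citation counts and field baselines below are invented for illustration; real baselines come from databases such as Web of Science or Scopus.

```python
# Sketch of the 'crown indicator' (field-normalised citations per paper).
# Every figure here is invented purely for illustration.

# Each paper carries its own citation count plus the world-average
# citations for papers of the same field, publication year, and
# document type (the normalisation baseline).
papers = [
    {"citations": 12, "baseline": 4.0},  # e.g. a sociology article
    {"citations": 3,  "baseline": 6.0},  # e.g. an economics review
    {"citations": 9,  "baseline": 3.0},
]

# Unnormalised citations per paper.
cpp = sum(p["citations"] for p in papers) / len(papers)

# Crown indicator: total citations divided by total expected
# (baseline) citations. A value above 1.0 means the set of papers
# is cited more than the world average for comparable papers.
crown = sum(p["citations"] for p in papers) / sum(p["baseline"] for p in papers)

print(f"Citations per paper: {cpp:.2f}")        # → 8.00
print(f"Field-normalised impact: {crown:.2f}")  # → 1.85
```

Dividing totals rather than averaging per-paper ratios is one common way the indicator is computed; either way, the point is that a paper is judged against its own field, year, and document type rather than against all papers.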
Some RAE/REF panels in the past have used Journal Impact Factors as shorthand for assessing the quality of papers published in those journals. ISI’s Journal Citation Reports, which lists Impact Factors, defines the measure as follows:
The journal Impact Factor is the average number of times articles from the journal published in the past two years have been cited in the JCR year. The Impact Factor is calculated by dividing the number of citations in the JCR year to articles published in the two previous years by the total number of articles published in those two years. An Impact Factor of 1.0 means that, on average, the articles published one or two years ago have been cited once. An Impact Factor of 2.5 means that, on average, the articles published one or two years ago have been cited two and a half times. Citing articles may be from the same journal; most citing articles are from different journals.
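The arithmetic in this definition can be shown with a small worked example; the figures below are invented for illustration.

```python
# Worked example of the Journal Impact Factor calculation.
# All numbers are invented for illustration.

# Citations received in the JCR year (say 2010) to articles the
# journal published in the two preceding years (2008 and 2009).
citations_in_jcr_year = 150

# Number of articles the journal published in those two years.
articles_2008 = 40
articles_2009 = 60

impact_factor = citations_in_jcr_year / (articles_2008 + articles_2009)
print(f"Impact Factor: {impact_factor:.2f}")  # 150 / 100 = 1.50
```

So a journal whose 100 recent articles attracted 150 citations in the JCR year has an Impact Factor of 1.5, i.e. its recent articles were cited one and a half times each on average.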
In fact an impact factor is a poor tool for analysing individual papers, as a high impact factor may be driven by a few papers attracting a very large number of citations. Conversely, 48.0 percent of social sciences articles and 93.1 percent of articles in arts and humanities journals are never cited. On the other hand, most highly cited papers (possibly 90%) are published in the top 10% of journals by impact factor.
Inspection of impact factors tells us that the ‘hurdle’ to be in the upper echelons is quite low for most Faculty subjects. For example, the highest impact factor for a sociology journal is 3.338, rising to 5.113 over five years (American Journal of Sociology). For business, the Academy of Management Journal scores 5.017 and 6.029 respectively. The numbers fall away fast – number 10 in sociology scores 1.577/2.366 respectively.
On the other hand, the most highly cited researchers score very highly indeed. Among our iconic papers we can see, for example:
- Nelson RR and Winter S, ‘In Search of Useful Theory of Innovation’, Research Policy 6:1 (1977), pp 36-76 – 375 citations on WoS
- Rothschild M and Stiglitz JE, ‘Increasing Risk: I. A Definition’, Journal of Economic Theory 2:3 (1970), pp 225-243 – 1,109 citations on WoS