Gathering evidence about the effectiveness of “open access” publishing policies in agriculture

Hugo Besemer


There are many studies meant to demonstrate that “open access” publishing policies improve the impact of scientific papers. All these studies use some kind of citation count (either counting citations in other journals, or web links to a paper) as a surrogate measure for impact. Especially amongst “open access” activists there are a number of quite prolific writers and bloggers. There is a current bibliography (1) that is updated regularly, although it may miss some of the more critical and non-“open access” literature. Below we discuss the main findings, especially in the light of possible questions about the evidence we would like to gather to show that an “open access” publishing strategy may help agricultural science to be disseminated more effectively. We then raise some ideas for studies to gather evidence that applies specifically to the agricultural sector. Such an effort should take into account a number of issues that are specific to the agricultural sector:

  • Agricultural science is an applied science, and the potential readers of publications may be beneficiaries other than scientific peers, such as extension officers, readers from the private sector etc. We might need to establish surrogate measures other than citedness in scientific journals to quantify the impact of publications for such groups.
  • Reports always have been important in agricultural science and we should find ways to measure their impact. Almost all the studies discussed below concentrate on journal articles.
  • Agricultural science is multidisciplinary. There is evidence that “open access” journals rank higher in multidisciplinary fields (3). We should be able to show whether this applies to the different fields in agriculture.
  • A recent study showed that amongst African journal editors there is little awareness of “open access”, and scepticism about whether an “open access” model can be sustainable for their journals (5). We need to find evidence to convince them that going “open access” may attract better papers and let their journals reach a wider readership.

A much debated area

There are reasons why these issues are much debated and why generalizations of findings are difficult. For one thing, the way that scientific information is communicated has changed drastically over the past years:

1. Citation studies require a certain time-span as papers need time to get cited. During this time-span many patterns in scientific communication have changed. When the first paper to demonstrate a positive effect of “open access” on citations appeared in 2001 (2) “online” was probably equivalent to “freely accessible on the public internet”. Nowadays most scientific publishers use the Internet to distribute scientific journals to their subscribers.

2. The term “open access” is used rather fluidly and sometimes misused (“open access for two weeks” meaning a trial subscription). It may mean at least the following things:

a. Published in an “open access” journal, i.e. a journal that is accessible on the Internet and does not charge a subscription fee. This is sometimes referred to as the “golden route”.

b. Published in a non-“open access” journal and later submitted by the author in an institutional repository (the “green route”). Thomson ISI estimated that in 2004 55% of the journals and 65% of the articles published in their “Web of Science” are produced by publishers who permit some sort of self-archiving.(3)

c. Many authors publish their papers on their personal websites, and many of them may feel that they do not have time to check each publisher’s archiving policy. According to (4), this informal method “de facto” opens access to a considerable proportion of the articles in subscription-based journals.

d. Submitted directly to an institutional archive without being published in a journal.

3. Some studies concentrate on the level of individual articles, others on the journal level. In discussions it is sometimes forgotten that the impact factor of a journal is often a poor predictor of the number of citations of individual articles. Therefore conclusions drawn at one level cannot simply be transferred to the other.

4. There is a perceived need for new measures of scientific impact (web/URL citations, numbers of downloads) that take into account the Web as a means of scientific communication (8, 12).

What is agreed and what not

Studies that concentrate on the article level generally find a positive correlation between “open access” and the citedness of articles. These studies compare the citedness of “open access” and non-“open access” articles in collections that are otherwise comparable: Lawrence’s classic study (2) compares articles from the same conference; other studies compare articles from a journal like PNAS that offers authors the choice to publish their article as “open access” or not (6). Many studies are from a limited number of fields like mathematics, (astro)physics, computer science etc., where the ArXiv document server has played an important role since the early nineties (see for example (7)). Studies across different disciplines show that the positive impact of “open access” may differ between disciplines (8, 9). Some studies seem to indicate that this positive correlation between “open access” and citedness is most significant for the most cited papers (4), but there is certainly no general agreement on this. (Personally I also wonder what the causal relationship might be: are these articles better cited because they are online? I can also imagine that much-cited authors get tired of sending around reprints and put their papers online for that reason.)
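The article-level comparisons described above can be sketched with a simple permutation test: did the “open access” group receive more citations than could be explained by chance relabelling? The citation counts below are hypothetical, purely for illustration, and a permutation test is only one of several statistical approaches such studies use.

```python
import random
import statistics

def mean_diff(a, b):
    """Difference in mean citation counts between two groups."""
    return statistics.mean(a) - statistics.mean(b)

def permutation_test(oa, non_oa, n_iter=10_000, seed=42):
    """Estimate how often a random relabelling of the pooled articles
    yields a mean difference at least as large as the observed one."""
    rng = random.Random(seed)
    observed = mean_diff(oa, non_oa)
    pooled = list(oa) + list(non_oa)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if mean_diff(pooled[:len(oa)], pooled[len(oa):]) >= observed:
            extreme += 1
    return extreme / n_iter

# Hypothetical citation counts for two otherwise comparable article sets
# (invented numbers, for illustration only):
oa_citations = [12, 7, 15, 9, 22, 5, 11, 8, 14, 10]
toll_citations = [6, 4, 9, 3, 11, 5, 7, 2, 8, 6]
p = permutation_test(oa_citations, toll_citations)
```

A small p here would suggest the observed citation advantage is unlikely under random relabelling; as noted above, it would still say nothing about the direction of causation.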

There is no general agreement about the value of web/URL citations or web usage to predict the citedness of a paper (9).

At the journal level the most important study is from Thomson ISI, the producers of the Science Citation Index (3). It is important to note that according to that study, in all disciplines compared, “open access” journals are in the lower range of impact factors, but do better on the immediacy index (a measure indicating that papers are noticed in the shorter term). The proportion of journals from North America and Western Europe that are “open access” and make it into the ISI citation indexes is significantly lower (1.5% and 1.1% respectively) than the percentage from other parts of the world (42.3% from South/Central America and 14.9% from the Asia/Pacific region). This seems to be an indication that “open access” strategies can play an important role in letting these other parts of the world play a more important role in the scientific arena.

Some ideas for studies in the agricultural area

Hopefully this paper will elicit a discussion about what evidence we need to gather to convince different groups of players. Below are a number of suggestions:

  • It would be relatively easy to repeat the study of impact factors of “open access” journals (3) for the agricultural sector. This would give an indication of the importance of “open access” for the communication amongst scientific peers.
  • Citations to reports are probably not covered very well by the Science Citation Index, but some of its recent competitors (Google Scholar, Scopus) may do better. This would give an indication of the importance of “open accessibility” for the impact of reports for a scientific target group.
  • It will probably not be easy to come up with a measure for the impact of publications (either reports or journal articles) for groups other than scientific peers (such as extensionists, private sector agricultural knowledge workers etc.). It would be possible, though, to count web/URL citations by specific sites that are meant for those groups. It will require additional thinking to come up with a way to use such a measure to assess the impact of “open accessibility”.
  • The studies above concentrate on quantitative evidence. We should also think of qualitative measures that go beyond the anecdotal level.
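As a rough illustration of the web/URL-citation idea in the list above, here is a minimal sketch that counts how many pages from practitioner-oriented sites link to a paper’s URL. The site names, page contents and paper URL are all invented for illustration; a real study would first have to harvest pages from the relevant sites (or use a search engine’s link-query facility) before counting.

```python
import re

# Hypothetical saved pages from extension-oriented sites (names and
# contents invented for illustration); in practice these would be
# fetched from the target sites and stored first.
pages = {
    "extension-site-a": '<a href="http://example.org/papers/soil-fertility.pdf">study</a>',
    "extension-site-b": "No relevant links here.",
    "ngo-site-c": "See http://example.org/papers/soil-fertility.pdf for details.",
}

def url_citations(paper_url, pages):
    """Count how many pages mention the paper's URL at least once."""
    pattern = re.compile(re.escape(paper_url))
    return sum(1 for text in pages.values() if pattern.search(text))

count = url_citations("http://example.org/papers/soil-fertility.pdf", pages)
# Two of the three hypothetical pages mention the paper's URL.
```

Counting each citing page once (rather than every occurrence of the URL) is a deliberate choice here, analogous to counting citing documents rather than in-text citations.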


1. The effect of open access and downloads (‘hits’) on citation impact: a bibliography of studies
2. Lawrence, S. (2001) Free online availability substantially increases a paper’s impact. Nature, 31 May 2001
3. McVeigh, M. E. (2004) Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns. Thomson Scientific, October 2004
4. Wren, J. D. (2005) Open access and openly accessible: a study of scientific publications shared via the internet. BMJ, 330:1128, 12 April 2005
5. Ouya, D. (2006) Open Access survey of Africa-published journals. INASP infobrief 7, June 2006
6. Eysenbach, G. (2006) Citation Advantage of Open Access Articles. PLoS Biology, 4(5), May 2006
7. Henneken, E. A., Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Thompson, D. and Murray, S. S. (2006) Effect of E-printing on Citation Rates in Astronomy and Physics. ArXiv, cs.DL/0604061, v2, 5 June 2006; submitted to the Journal of Electronic Publishing
8. Kousha, K. and Thelwall, M. (2006) Google Scholar Citations and Google Web/URL Citations: A Multi-Discipline Exploratory Analysis. E-LIS, 5 June 2006; also in Proceedings of the International Workshop on Webometrics, Informetrics and Scientometrics & Seventh COLLNET Meeting, Nancy (France), May 2006
9. Vaughan, L. and Shaw, D. (2005) Web citation data for impact assessment: A comparison of four science disciplines (abstract only). Journal of the American Society for Information Science and Technology, 56(10): 1075-1087, published online 27 May 2005
10. Mueller, P. S., Murali, N. S., Cha, S. S., Erwin, P. J. and Ghosh, A. K. (2006) The effect of online status on the impact factors of general internal medicine journals. Netherlands Journal of Medicine, 64(2): 39-44, February 2006
11. Moed, H. F. (2005) Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal (abstract only). Journal of the American Society for Information Science and Technology, 56(10): 1088-1097, published online 31 May 2005
12. Brody, T. and Harnad, S. (2005) Earlier Web Usage Statistics as Predictors of Later Citation Impact. Author eprint, 18 May 2005, University of Southampton, School of Electronics and Computer Science; Journal of the American Society for Information Science and Technology, 57(8): 1060-1072, 2006

