Publish and… perish

In the old days, research quality was measured by the number of papers you published. Publishing was a hard process, and only a few scientists were able to publish several papers per year. However, with the bloom of new journals, the appearance of electronic editorial processes, and the specialization of research fields, the number of publications per year has grown exponentially during the last decades. Thus publishing is no longer a good measure of the quality of research. As an example of this, I recently attended a talk by Sid Redner in which he showed the following data, extracted from the Physical Review citation data of 353000 papers:

  • Only 11 papers got more than 1000 citations
  • 245000 got less than 10 citations
  • 100000 got one or no citations

For me it is amazing to see that roughly 1/3 of the papers published got almost no citations at all. It is a publish-and-perish process in which 1/3 of the papers are lost.

The situation is similar to what has been found recently in the music industry. Out of 13 million songs available to buy online, 10 million of them have never been bought. As Will Page put it:

  • Only 20% of the tracks in their sample are ‘active’, that is to say they sold at least one copy; hence 80% of the tracks sold nothing
  • 80% of the revenue came from around 3% of the active tracks
  • Only 4 tracks sold more than 100000 copies

This led Will Page to question the Long-Tail theory by Chris Anderson, which states that the market share of low-demand items can be bigger than that of best-sellers. To put it in mathematical terms, the mass of the distribution in the tail can be bigger than the mass around the peak of the distribution. This happens mostly with Pareto-law distributions, hence the name “long tail”. But Will Page’s data seems to suggest that there is not even such a tail, and planning your business around the long tail is risky: if you center your business plan on trying to sell the tail of the distribution, most probably you won’t succeed. As Andrew Bud (Executive Chairman of Mblox) put it: “in this tail, you starve”.
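The head-versus-tail accounting can be checked numerically. Here is a minimal sketch, assuming a Zipf-like rank-size model in which the r-th best-selling track sells a number of copies proportional to r^(-s); the exponents and catalogue size are illustrative choices, not values from Will Page’s data:

```python
def head_share(n_items, s, head_frac=0.2):
    """Fraction of total sales captured by the top `head_frac` of ranks,
    when the r-th ranked item sells proportionally to r**(-s)."""
    sales = [r ** -s for r in range(1, n_items + 1)]
    head = sum(sales[: int(n_items * head_frac)])
    return head / sum(sales)

# Flatter exponents leave real mass in the tail; steeper ones
# concentrate almost everything in the head of the distribution.
for s in (0.5, 1.0, 1.5):
    print(f"s = {s}: top 20% of tracks sell {head_share(10_000, s):.0%} of copies")
```

For a flat exponent (s = 0.5) the top 20% capture under half the sales, so the tail genuinely matters; by s = 1.5 they capture nearly everything, which is the “in this tail, you starve” regime.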

The same can happen to a journal if it lives in the long tail: what fraction of perishable papers is an editor of a journal willing to accept? What is the “citation model” the journal intends to have? We all know about the impact factor of a journal, which only gives us information about (mostly) regularly cited papers. A better piece of information would also be the zero-index of a journal (or a researcher), i.e. the fraction of papers that never get cited at all. An idea I am working on recently… Stay tuned
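As defined above, the zero-index is a one-line computation over a list of citation counts. A minimal sketch (the sample counts are made up for illustration):

```python
def zero_index(citation_counts):
    """Fraction of papers with zero citations, per the definition above."""
    return sum(1 for c in citation_counts if c == 0) / len(citation_counts)

# Hypothetical journal: 5 of these 10 papers were never cited.
print(zero_index([0, 3, 0, 12, 1, 0, 45, 0, 2, 0]))  # → 0.5
```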


5 Responses

  1. Artur Adib says:

    Hmm, it seems to me that one can tune a power law so that either the Pareto principle or Mr. Anderson’s regime is true.

    So, isn’t it just a matter of empirically analyzing the data and establishing which one is true: given a threshold, say 20%, is the mass more concentrated in the first 20 percentile, or in the remaining 80 percentile?


  2. admin says:

    The problem is not how to describe (statistically) the tail. The report by Will Page is telling you that there is not even a tail, because 80% of the items are not sold at all.

  1. October 24, 2009

    […] Publish and… perish | Implicit None (tags: citation bibliometrics research reference TheLongTail) […]

  2. July 16, 2010

    […] again, a system where a full 1/3 of all published articles are never cited (!) is taking it a bit far.  But that’s where we […]

  3. March 16, 2011

    […] and then publish high confidence results only (positive or negative). However, the current “publish or parish” attitude makes this approach career suicide. But, there could be a shift towards “more […]
