5.0029 Troubling Citation Study (1/60)

Elaine Brennan & Allen Renear (EDITORS@BROWNVM.BITNET)
Sun, 12 May 91 21:47:24 EDT

Humanist Discussion Group, Vol. 5, No. 0029. Sunday, 12 May 1991.

Date: Fri, 10 May 91 17:29:49 EDT
From: "Steven J. DeRose" <EL406011@BROWNVM>
Subject: Study of article citation

I just ran across an odd and troubling set of statistics in the
editorial of the May 1991 Communications of the Association for
Computing Machinery. The discussion centered on a study based on the
citation indices (or indexes, for the modernists among us). A few
extracts:

* There are 108,600 scholarly journals in all fields.

* Of papers published in the 4,500 most prominent journals 1981-1985:
55% were never cited
60% were never cited except by their own author(s)

Breaking this down somewhat by fields, the study claims:
14% were never cited, among "virology" articles
37% were never cited, among "physics" articles
56% were never cited, among "mathematics" articles
66% were never cited, among "electrical engineering" articles
and most troubling:
98% were never cited, among "arts and humanities" articles.

* There were, in 4-year institutions in the US,
about 100,000 "researchers" in 1968
about 200,000 "researchers" in 1988
(a doubling in twenty years, which works out to roughly a 3.5%
annual growth rate, right?)
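
As a quick sanity check (a rough sketch only, assuming compound annual
growth, which the editorial does not spell out), a doubling over twenty
years does come out to about 3.5% per year:

    # Annual growth rate implied by doubling from 100,000 researchers
    # (1968) to 200,000 researchers (1988), assuming compound growth.
    rate = (200000 / 100000) ** (1 / 20) - 1
    print(round(rate * 100, 2))   # -> 3.53 (percent per year)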

ACM failed to point out that this means each researcher has over half a
journal to her/himself (108,600 journals spread among some 200,000
researchers). I certainly should work harder; I must admit I haven't
published half a journal's worth of articles every year! Where are all
these articles coming from, especially since the count appears not to
include conference proceedings or monographs?
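
For completeness, the journals-per-researcher arithmetic is easy to
reproduce (a sketch using only the figures quoted above, and dividing
the all-fields journal count by the US researcher count, just as the
joke does):

    # Journals per researcher, using the figures quoted above.
    print(108600 / 200000)   # -> 0.543, just over half a journal each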

* A *Newsweek* reporter concluded from this that "nearly half the
scientific work in this country is worthless" -- ergo, 98% of our work
as humanists must also be worthless. I tend not to agree with this
(though I suppose we could each nominate particularly useless publications,
such as, perhaps, this posting).

* Naturally, the editor goes on to list many reasons why the citation
index is a worthless measure of an article's usefulness (along with
several gratuitous comments about sociology, teaching, and even the
medieval church).

Does this figure of 98% tell us anything? If so,
what, and what should we do about it? It would be interesting to do a
similar study for an earlier period and compare; or to check the number of
citations for several highly respected articles; perhaps 98% of them
also have gone uncited? Do tenure committees weigh citations heavily,
and if so, does that make much sense? Related to current topics, what
does this tell us about our relationship to publishers? The number of
journals is climbing fast, and increasing specialization has scattered
relevant articles across many journals, while at the same time the
percentage of useful articles in any one subscription I take keeps
falling. That means we are paying more and more for the same amount of
relevant information (which could itself be a direct cause of a decline
in citation, if there is one). I would like to raise the question
directly: Can we design a better publication system, one that provides
better measures of an article's usefulness and cheaper access to
specifically relevant information?

Steve DeRose