Johannes Stegmann, Free University Berlin, University Clinic Benjamin Franklin, Medical
Library,
D-12200 Berlin
Email: stegmann@ukbf.fu-berlin.de
Résumé: Why not use journal impact factors?
The article analyses the "bibliometric crisis" mentioned in an article recently published in the EAHIL Newsletter and considers the negative image of impact factors presented there. The potential of bibliometric analysis beyond impact factors alone is examined. A variation on the determination of impact factors, based on a wider time window, and the construction of impact factors for journals not covered by the Journal Citation Reports are presented.
Why not use Journal Impact Factors?
Abstract:
The "bibliometric crisis" mentioned in
a feature article published recently in the EAHIL newsletter is analysed further with
respect to the negative image of impact factors. The potential of bibliometric analysis
beyond impact factors is noted. A variation of journal impact factors, based on wider time
windows, and the construction of impact factors for journals not listed in the Journal
Citation Reports are discussed.
Background
In a recent article, Lazarev complains about librarians'
increasingly sceptical attitude towards bibliometrics (1). Although it seems that there is
a growing number of "bibliometrics papers" authored by libraries (see Figure 1;
it is perhaps worth mentioning that about 7 % of the papers presented at the EAHIL conference in
Utrecht also deal with a bibliometric issue), the author is surely right
in pointing to the "chaos in bibliometric terminology" (1, p. 18; 2) and to the apparent
predominance of electronic issues (1, p. 17). In addition, further reasons might keep librarians
from bibliometric analyses. One of them, perhaps, is the assumption,
shared by many, that bibliometrics and impact factor usage are
identical, the latter being inevitably associated with "research evaluation".
Because evaluation of their own (or other) institutions is hardly a genuine goal of
librarianship, and taking into account the negative image of impact factors, it is
understandable - along with patrons' objection to any kind of restriction in journal acquisition -
that librarians are hesitant to apply quantitative bibliometric
analyses.
Citation analysis
However, bibliometrics is more than the mere application
of impact factors. Bibliometrics - also in its narrower meaning of citation analysis -
enables one to track science and to uncover the interrelationships between authors, journals,
and subjects (see 3 and 4 for review). Bibliometrics thus might help librarians to
contribute to a detailed analysis of the research activities of the institution they
belong to. This can be achieved by searches (for institutional addresses and/or author
names) in bibliographic databases and subsequent assignment of the individual papers
retrieved to scientific subfields, followed by citation analysis including cocitation
analysis and bibliographic coupling (3, p. 154-156).
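To make the two techniques just mentioned concrete, the following Python sketch counts cocitations and bibliographic coupling links from downloaded reference lists. It is not part of the methods described in (3); the paper-to-reference data are invented for illustration only.

from itertools import combinations
from collections import Counter

# Hypothetical sample data: each citing paper mapped to its cited references
references = {
    "paper_A": {"ref_1", "ref_2", "ref_3"},
    "paper_B": {"ref_2", "ref_3", "ref_4"},
    "paper_C": {"ref_1", "ref_3"},
}

# Cocitation: two references are cocited whenever they appear together
# in the reference list of the same citing paper
cocitation = Counter()
for refs in references.values():
    for pair in combinations(sorted(refs), 2):
        cocitation[pair] += 1

# Bibliographic coupling: two citing papers are coupled by the number
# of cited references they share
coupling = Counter()
for p1, p2 in combinations(sorted(references), 2):
    coupling[(p1, p2)] = len(references[p1] & references[p2])

print(cocitation.most_common(3))
print(coupling.most_common(3))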
Citation databases
At present, the citation databases (containing the
references cited in the indexed articles) necessary for these purposes are supplied only
by the Institute for Scientific Information (ISI). It has been suggested to build a freely
available, web-based "universal citation database" (5), but this seems not to be
a trivial task. For now, searching for cited references is restricted to the ISI databases
(of course, manual searching is also possible). Relevant to biomedicine and health
sciences are the databases SCISEARCH and SOCIAL SCISEARCH, which are offered by several
vendors including DIMDI (Deutsches Institut fuer Medizinische Information und
Dokumentation) as well as ISI itself through its Web of Science (6).
Journal impact factors
Although perhaps tired of discussing journal impact
factors, librarians cannot disregard them, given the upward trend in patrons' use of these
journal measures. Journal impact factors, the underlying concept and the
associated data published in the Journal Citation Reports (JCR) have been the topic of numerous articles. In the
EAHIL newsletter, too, impact factors and citation analysis have been discussed
extensively (7, 8), as have the relations between "use", "value" and
"quality" of citations (8, p. 17).
Normally, journal impact factors are taken from the JCR, an ISI product published annually
in two editions (science and social sciences) in print and on CD-ROM. The most recent
science edition (1997) lists 4,963 journal titles with (amongst other data) their impact
factors. The impact factor is calculated by dividing the number of cites (in ISI source
journals) a journal received in 1997 to papers of any kind published in 1995 and 1996 by
the number of research-relevant papers (articles, reviews, notes, but not editorials,
letters, news items, meeting abstracts) published in 1995 and 1996 in the journal in question.
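Expressed as a small Python sketch, the calculation described above amounts to the following; the numbers are invented purely for illustration.

def impact_factor_1997(cites_1997_to_1995_1996, citable_items_1995_1996):
    # cites received in 1997 (in ISI source journals) to items published
    # in 1995 and 1996, divided by the research-relevant items (articles,
    # reviews, notes) the journal published in 1995 and 1996
    return cites_1997_to_1995_1996 / citable_items_1995_1996

# Hypothetical journal: 450 cites in 1997 to its 1995-1996 papers,
# 300 citable items published in 1995-1996 -> impact factor 1.5
print(impact_factor_1997(450, 300))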
With respect to the three-year time window (publications
in two years, cites in the third year), other periods are also conceivable; e.g., in (9) the
citation averages over five years (papers published and cited 1981-1985) are listed for more
than 2,600 journals; today, these numbers might still be of some value. In a more recent
paper, it was suggested to apply not only the "short term" but also a
"longer term" impact factor to journal evaluation (10).
Constructed impact factors
It is also possible to calculate impact factors
independently of the JCR by use of the citation databases mentioned above (11, 12, 13).
This might be especially important for journals not listed in the JCR. The method to
"construct" impact factors for non-JCR journals is described in detail in (13).
In addition, the list of CIF (Constructed Impact Factor) journals will be made available
via the world wide web (13). It must be emphasized, however, that the CIF list relies
completely on the cited-references data collected in the ISI databases SCISEARCH and SOCIAL
SCISEARCH. One might imagine, though, a future with bibliographic databases that
include all cited references contained in non-JCR and non-ISI source journals. Those
data could be used in addition to ISI's data for more thorough citation studies, including
impact factor calculations. Thus, why not use impact factors - in the future?
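To illustrate the constructed-impact-factor idea sketched above, the following Python snippet filters cited-reference strings (as they might be retrieved from SCISEARCH-type records) for a target journal and publication window. The record layout, journal abbreviation and all numbers are invented and greatly simplify the procedure described in (13).

# Cited-reference strings retrieved by an online search for the year 1997
# (fictitious examples)
cited_refs_1997 = [
    "MUELLER A, 1995, V12, P34, J HYPOTHETICAL MED",
    "SCHMIDT B, 1996, V13, P101, J HYPOTHETICAL MED",
    "JONES C, 1994, V11, P7, J HYPOTHETICAL MED",   # outside the window
]

def constructed_impact_factor(cited_refs, journal_abbrev, pub_years, citable_items):
    # count the cited references pointing to the target journal and to the
    # chosen publication years, then divide by the journal's own citable
    # items (taken from a bibliographic database)
    hits = sum(
        1 for ref in cited_refs
        if journal_abbrev in ref and any(str(year) in ref for year in pub_years)
    )
    return hits / citable_items if citable_items else 0.0

# 115 articles, reviews and notes published by the journal in 1995-1996
print(constructed_impact_factor(cited_refs_1997, "J HYPOTHETICAL MED", (1995, 1996), 115))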
Figure 1: The percentage of library-authored papers with a bibliometric subject was
calculated from the data retrieved by online searching at DIMDI in the databases
SCISEARCH, SOCIAL SCISEARCH, MEDLINE, EMBASE, and BIOSIS.
The search was restricted to libraries as authors by applying the search step
FIND CS=(LIBRAR? OR BIBLIOT?)
(CS: corporate source; the "?" denotes end truncation).
The search was limited to bibliometric subjects by applying the following terms (as keywords and free text):
BIBLIOMETR?, SCIENTOMETR?, INFORMETR?, CITATION?, CITED REFERENC?,
CITED JOURNAL?, CITING JOURNAL?, IMPACT FACTOR?,
(EVALUA? AND RESEARCH?)/SAME SENT.
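The percentage itself would then be obtained as in the following minimal sketch, assuming per-year hit counts for all library-authored papers and for the bibliometric subset are available; the counts below are invented and are not the data underlying Figure 1.

# Invented per-year hit counts (not the actual Figure 1 data)
library_papers = {1995: 1200, 1996: 1250, 1997: 1300}
bibliometric_subset = {1995: 60, 1996: 75, 1997: 91}

for year in sorted(library_papers):
    percentage = 100.0 * bibliometric_subset[year] / library_papers[year]
    print(year, round(percentage, 1), "%")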
References:
1. Lazarev VS. On the role of bibliometrics in the knowledge society: bibliometric quicksand or bibliometrics challenge? EAHIL Newsletter to European Health Librarians 1998;(44):17-18.
2. Lazarev VS. On chaos in bibliometric terminology. Scientometrics 1996;35(2):271-277.
3. Osareh F. Bibliometrics, citation analysis and co-citation analysis: a review of literature I. Libri 1996;46(4):148-158.
4. Osareh F. Bibliometrics, citation analysis and co-citation analysis: a review of literature II. Libri 1996;46(4):217-225.
5. Cameron RD. A universal citation database as a catalyst for reform in scholarly communication. First Monday 1997;2(4). <http://www.firstmonday.dk/issues/issue2_4/cameron/index.html>
6. Institute for Scientific Information. Web of Science. <http://www.isinet.com/products/citation/wos.html>
7. Salmi L. Citation analysis and impact factors. EAHIL Newsletter to European Health Librarians 1994;(28):17-18.
8. Lazarev V. Citation analysis: what property of cited documents is really reflected? Further to the paper by Liisa Salmi. EAHIL Newsletter to European Health Librarians 1995;(33):16-17.
9. Schubert A, Glänzel W, Braun T. Scientometric datafiles. A comprehensive set of indicators on 2649 journals and 96 countries in all major science fields and subfields 1981-1985. Scientometrics 1989;16(1-6):3-478.
10. Moed HF, Van Leeuwen TN, Reedijk J. A new classification system to describe the ageing of scientific journals and their impact factors. Journal of Documentation 1998;54(4):387-419.
11. Christensen FH, Ingwersen P, Wormell I. Online determination of the journal impact factor and its international properties. Scientometrics 1997;40(3):529-540.
12. Stegmann J. How to evaluate journal impact factors. Nature 1997;390:550.
13. Stegmann J. Building a list of journals with constructed impact factors. Journal of Documentation 1999;55(3):310-324.
Johannes Stegmann
Free University Berlin, University Clinic Benjamin Franklin, Medical
Library, D-12200 Berlin
email: stegmann@ukbf.fu-berlin.de
EAHIL Newsletter Nr. 47 (May 1999)