Integrated Impact Indicators Compared With Impact Factors: An Alternative Research Design With Policy Implications

Cited by: 118
Authors
Leydesdorff, Loet [1 ]
Bornmann, Lutz [2 ]
Affiliations
[1] Univ Amsterdam, Amsterdam Sch Commun Res, NL-1012 CX Amsterdam, Netherlands
[2] Max Planck Gesell, D-80539 Munich, Germany
Source
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY | 2011, Vol. 62, No. 11
Keywords
WORSHIPING FALSE IDOLS; CITATION ANALYSIS; PERCENTILE RANK; FACTOR DILEMMA; SCIENCE; JOURNALS; LIBRARY;
DOI
10.1002/asi.21609
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
In bibliometrics, the association of "impact" with central-tendency statistics is mistaken. Impacts add up, and citation curves therefore should be integrated instead of averaged. For example, the journals MIS Quarterly and Journal of the American Society for Information Science and Technology differ by a factor of 2 in terms of their respective impact factors (IF), but the journal with the lower IF has the higher impact. Using percentile ranks (e.g., top-1%, top-10%, etc.), an Integrated Impact Indicator (I3) can be based on integrating the citation curves after first normalizing them to the same scale. The results across document sets can then be compared as percentages of the total impact of a reference set. The total number of citations should not be used instead, because it does not take the shape of the citation curves into account. I3 can be applied to any document set and any citation window. The results of the integration (summation) are fully decomposable in terms of journals or institutional units such as nations and universities, because percentile ranks are determined at the paper level. In this study, we first compare I3 with IFs for the journals in two Institute for Scientific Information subject categories ("Information Science & Library Science" and "Multidisciplinary Sciences"). The library and information science set is additionally decomposed in terms of nations. Policy implications of this possible paradigm shift in citation impact analysis are specified.
Pages: 2133-2146
Page count: 14
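The sketch below illustrates, in Python, the kind of calculation the abstract describes: each paper's citation count is converted into a percentile rank within a reference set, the ranks are binned into weighted percentile-rank classes, and the class weights are summed (integrated) rather than averaged. This is not the paper's implementation; the six-class scheme, its integer weights, and all citation counts in the example are illustrative assumptions.

```python
from bisect import bisect_right

# Six percentile-rank classes (bottom 50%, 50-75%, 75-90%, 90-95%, 95-99%,
# top 1%), weighted 1..6. This is one common scheme, used here only for
# illustration; the paper discusses percentile ranks more generally.
CLASS_UPPER_BOUNDS = [50, 75, 90, 95, 99]  # upper bounds of the lower five classes


def percentile_ranks(citations, reference):
    """Percentile rank of each citation count within a reference set:
    the share of reference papers cited at most as often, times 100."""
    ref = sorted(reference)
    n = len(ref)
    return [100.0 * bisect_right(ref, c) / n for c in citations]


def i3(citations, reference):
    """I3-style indicator: assign each paper to a percentile-rank class and
    sum the class weights, so impact is integrated rather than averaged."""
    return sum(bisect_right(CLASS_UPPER_BOUNDS, r) + 1
               for r in percentile_ranks(citations, reference))


if __name__ == "__main__":
    # Hypothetical citation counts; not data from the paper.
    reference = [0, 0, 1, 1, 2, 2, 3, 4, 5, 5, 6, 8, 10, 12, 15, 20, 30, 45, 60, 120]
    journal_a = [0, 1, 2, 3, 5, 8, 12, 30]  # many papers, modest citation rates
    journal_b = [20, 45, 60]                # few papers, high mean citation rate
    print("I3 of journal A:", i3(journal_a, reference))
    print("I3 of journal B:", i3(journal_b, reference))
```

Because the weights are summed over all papers, the larger set of modestly cited papers in this toy example accumulates a higher I3 than the small, highly cited set, even though its mean citation rate is much lower; this mirrors the abstract's contrast between impact factors and I3. The totals can also be reported as percentages of the reference set's total I3 to compare document sets on the same scale.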