The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments

Cited by: 29
Authors
Colliander, Cristian [1 ]
Ahlgren, Per [2 ]
Affiliations
[1] Umea Univ, Dept Sociol, SE-90187 Umea, Sweden
[2] Univ Lib, Dept E Resources, SE-10691 Stockholm, Sweden
Keywords
Stability analysis; Field normalization baseline; Journal; ISI/Thomson Reuters subject category; Essential Science Indicators field; Citation impact; CROSS-FIELD; INDICATORS; EXCELLENCE
DOI
10.1016/j.joi.2010.09.003
Chinese Library Classification
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
In this paper we study the effects of the field normalization baseline on the relative performance, in terms of citation impact, of 20 natural science departments. Impact is studied under three baselines: journal, ISI/Thomson Reuters subject category, and Essential Science Indicators field. For the measurement of citation impact, two indicators are employed: the item-oriented mean normalized citation rate and Top-5%. The results, which we analyze with respect to stability, show that the choice of normalization baseline matters. Normalization against the publishing journal stands apart: the department rankings obtained when journal is used as the baseline, irrespective of indicator, differ considerably from the rankings obtained when ISI/Thomson Reuters subject category or Essential Science Indicators field is used. Since no substantial differences are observed when the Essential Science Indicators field and ISI/Thomson Reuters subject category baselines are contrasted, one might suggest that users without access to subject category data can perform reasonable normalized citation impact studies by combining normalization against journal with normalization against Essential Science Indicators field. (C) 2010 Elsevier Ltd. All rights reserved.
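The two indicators named in the abstract follow standard definitions in the bibliometrics literature: the item-oriented mean normalized citation rate averages, over a department's papers, each paper's citation count divided by the expected citation rate of its baseline set (journal, subject category, or ESI field), while Top-5% measures the share of papers reaching the top-5% citation threshold of their baseline set. The sketch below illustrates these definitions only; the field names and numbers are hypothetical and not the authors' implementation or data.

```python
# Illustrative sketch of the two indicators (not the paper's code).
# "baseline_mean" and "baseline_top5_threshold" are hypothetical fields
# standing in for whichever baseline (journal, subject category, or
# ESI field) is chosen.

def mean_normalized_citation_rate(papers):
    """Item-oriented MNCS-style indicator: average of per-paper ratios
    citations / expected citations under the chosen baseline."""
    return sum(p["citations"] / p["baseline_mean"] for p in papers) / len(papers)

def top5_share(papers):
    """Share of papers whose citation count reaches the top-5%
    threshold of their baseline set."""
    hits = sum(1 for p in papers if p["citations"] >= p["baseline_top5_threshold"])
    return hits / len(papers)

# A made-up department with three papers under one baseline choice.
dept = [
    {"citations": 12, "baseline_mean": 6.0, "baseline_top5_threshold": 30},
    {"citations": 3,  "baseline_mean": 6.0, "baseline_top5_threshold": 30},
    {"citations": 45, "baseline_mean": 6.0, "baseline_top5_threshold": 30},
]
print(mean_normalized_citation_rate(dept))  # (2.0 + 0.5 + 7.5) / 3
print(top5_share(dept))  # 1 of 3 papers clears the threshold
```

Because the per-paper ratios share a department-independent denominator only within one baseline, switching the baseline changes both indicators, which is exactly the sensitivity the paper examines.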
Pages: 101-113
Page count: 13
Related papers
20 records in total
[1]   Calibrating the zoom - a test of Zitt's hypothesis [J].
Adams, Jonathan ;
Gurney, Karen ;
Jackson, Louise .
SCIENTOMETRICS, 2008, 75 (01) :81-95
[2]   Creation of journal-based publication profiles of scientific institutions - A methodology for the interdisciplinary comparison of scientific research based on the J-factor [J].
Ball, Rafael ;
Mittermaier, Bernhard ;
Tunger, Dirk .
SCIENTOMETRICS, 2009, 81 (02) :381-392
[3]   Towards an ideal method of measuring research performance: Some comments to the Opthof and Leydesdorff (2010) paper [J].
Bornmann, Lutz .
JOURNAL OF INFORMETRICS, 2010, 4 (03) :441-443
[4]   Subfield-specific normalized relative indicators and a new generation of relational charts: Methodological foundations illustrated on the assessment of institutional research performance [J].
Glaenzel, Wolfgang ;
Thijs, Bart ;
Schubert, Andras ;
Debackere, Koenraad .
SCIENTOMETRICS, 2009, 78 (01) :165-188
[5]   An item-by-item subject classification of papers published in journals covered by the SSCI database using reference analysis [J].
Glänzel, W ;
Schubert, A ;
Schoepflin, U ;
Czerwon, HJ .
SCIENTOMETRICS, 1999, 46 (03) :431-441
[6]   Citation analysis of research performer quality [J].
Kostoff, RN .
SCIENTOMETRICS, 2002, 53 (01) :49-71
[7]   Lifting the crown-citation z-score [J].
Lundberg, Jonas .
JOURNAL OF INFORMETRICS, 2007, 1 (02) :145-154
[8]  
Lunneborg, C. E., 2000, Data Analysis by Resampling: Concepts and Applications
[9]   Measuring contextual citation impact of scientific journals [J].
Moed, Henk F. .
JOURNAL OF INFORMETRICS, 2010, 4 (03) :265-277
[10]   New bibliometric tools for the assessment of national research performance: Database description, overview of indicators and first applications [J].
Moed, H. F. ;
De Bruin, R. E. ;
Van Leeuwen, T. N. .
SCIENTOMETRICS, 1995, 33 (03) :381-422