An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis

Cited: 0
Author
Loet Leydesdorff
Affiliation
[1] University of Amsterdam, Amsterdam School of Communications Research (ASCoR)
Source
Scientometrics | 2013 / Vol. 94
Keywords
Citation; Impact; Evaluation; Nanotechnology; Statistics; Standards
DOI
Not available
Abstract
One is inclined to conceptualize impact in terms of citations per publication, and thus as an average. However, citation distributions are skewed, and the average has the disadvantage that the number of publications is used in the denominator. Using hundred percentiles, one can integrate the normalized citation curve and develop an indicator that can be compared across document sets because percentile ranks are defined at the article level. I apply this indicator to the set of 58 journals in the WoS Subject Category of “Nanoscience & nanotechnology,” and rank journals, countries, cities, and institutes using non-parametric statistics. The significance levels of the results can thus be indicated. The results are first compared with the ISI Impact Factors, but this Integrated Impact Indicator (I3) can be used with any set downloaded from the (Social) Science Citation Index. The software is made publicly available on the Internet. Visualization techniques are also specified for evaluation by positioning institutes on Google Map overlays.
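The percentile-based logic sketched in the abstract can be illustrated in code. The snippet below is a minimal sketch, not the paper's published software: it assumes the indicator is obtained by assigning each article its percentile rank (0–100) within a reference set's citation distribution and summing these ranks over the document set, so that the score grows with both the number of papers and their citation impact rather than averaging them. The function name integrated_impact_indicator and the example numbers are hypothetical.

```python
import numpy as np
from scipy import stats

def integrated_impact_indicator(citations, reference_set=None):
    """Sum of article-level percentile ranks (0-100) within a reference set.

    Sketch of a percentile-based impact indicator: unlike citations per
    publication, the number of publications is not used as a denominator,
    so the result reflects both output and citation impact.
    """
    citations = np.asarray(citations, dtype=float)
    ref = citations if reference_set is None else np.asarray(reference_set, dtype=float)

    # Percentile rank of each article's citation count within the reference set.
    ranks = [stats.percentileofscore(ref, c, kind="mean") for c in citations]
    return float(np.sum(ranks))

# Hypothetical example: five articles evaluated against a larger reference set.
doc_set = [0, 2, 5, 12, 40]
reference = [0, 0, 1, 1, 2, 3, 3, 5, 8, 12, 20, 40, 55]
print(integrated_impact_indicator(doc_set, reference))
```

Because the ranks are summed rather than averaged, two document sets of different sizes can be compared directly, and non-parametric tests on the rank distributions can indicate whether differences are statistically significant.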
Pages: 35-55
Number of pages: 20