SIR Methodology

General considerations

  • The SIR (SCImago Institutions Rankings) is a characterization of institutions, formed by three sets of rankings based on research, innovation and web visibility indicators.
  • The interface allows rankings to be visualized for each indicator separately, enables comparison of indicator trends for one or more institutions (up to six), and can generate sector distribution charts for the different indicators.
  • To facilitate benchmarking, the ranks of institutions for each of the indicators have been normalized on a scale of 0 to 100. Therefore, the published values of each indicator have no other role than determining the position of each institution with respect to the other institutions.
  • The line graphs and bar graphs always represent ranks; therefore, lower values indicate better positions in the list and higher values worse ones.
  • Institutions have been selected using the sole criterion that they are research institutions with over 100 published works included in the SCOPUS database during the last year of the selected time period.
  • The sorting of institutions is generated each year using the results obtained in the five-year period ending in the selected year. For instance, if the selected year is 2012, the results used are those from the five-year period 2008-2012. The only exception is the case of web indicators which have only been calculated for the last year.
  • Institutions can be grouped by the countries to which they belong. Multinational institutions (MUL) which cannot be attributed to any country have also been included.
  • There are institutions (marked with an asterisk) which group sub-institutions (marked with the abbreviated name of the parent institution). For parent institutions, results always include those of their sub-institutions.
  • Institutions can also be grouped by sectors in order to generate lists with a higher degree of institutional homogeneity.
  • In order to achieve the highest level of precision in the institutional rankings for the different indicators, an exhaustive manual process of disambiguation of the institutions' names has been carried out.
  • The source of information used to generate the rankings for innovation is PATSTAT.
  • The sources of information used to generate the rankings for web visibility are Google and ahrefs.com.
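The rank-based presentation described above can be illustrated with a minimal Python sketch. The institution names and raw scores below are invented for the example; the point is only that, as in the SIR graphs, a higher raw indicator value earns a better (lower) rank:

```python
# Hypothetical illustration: converting raw indicator scores to ranks,
# where (as in the SIR graphs) rank 1 is the best position.
# Institution names and scores are invented for the example.
scores = {"Inst A": 3421, "Inst B": 1287, "Inst C": 5110}

# Sort descending by score: a higher raw value earns a better (lower) rank.
ordered = sorted(scores, key=scores.get, reverse=True)
ranks = {inst: pos for pos, inst in enumerate(ordered, start=1)}

print(ranks)  # {'Inst C': 1, 'Inst A': 2, 'Inst B': 3}
```

Note that ties would need a tie-breaking rule in practice; the sketch ignores them for brevity.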

The SIR IS NOT A LEAGUE TABLE. Its only goal is to characterize the research outcomes of organizations, providing useful scientometric information to institutions, policymakers and research managers for the analysis, evaluation and improvement of their research results. Anyone who uses this information to rank institutions or to build a league table does so under their own responsibility.

Indicators

Indicators are divided into three groups intended to reflect the scientific, economic and social characteristics of institutions. Keep in mind that, once the indicators are calculated, the resulting values for each institution are normalized on a scale of 0 to 100. The SIR includes both size-dependent and size-independent indicators, that is, indicators that are and are not influenced by the size of the institutions. In this way, the SIR provides overall statistics of institutions' scientific publication and other output, while also enabling comparisons between institutions of different sizes.
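The document states that indicator values are normalized to a 0-100 scale but does not give the exact formula; the sketch below assumes simple min-max scaling as one plausible reading, with invented values:

```python
# Hypothetical sketch of the 0-100 normalization described above:
# min-max scaling of one indicator's values across institutions.
# The input values and the exact scaling formula are assumptions for
# illustration; the source only states a normalized 0-100 scale.
values = [0.8, 1.2, 2.0, 1.0]

lo, hi = min(values), max(values)
normalized = [round(100 * (v - lo) / (hi - lo), 1) for v in values]

print(normalized)  # [0.0, 33.3, 100.0, 16.7]
```

Under this scheme the lowest-scoring institution maps to 0 and the highest to 100, which matches the stated purpose of the published values: fixing relative positions rather than conveying absolute magnitudes.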

Research

  1. Output: Total number of documents published in scholarly journals indexed in Scopus (Romo-Fernández, et al., 2011). This is a size-dependent indicator.
  2. International Collaboration: Institution's output ratio produced in collaboration with foreign institutions. The values are computed by analyzing an institution's output whose affiliations include more than one country address (Guerrero-Bote, Olmeda-Gómez and Moya-Anegón, 2013; Lancho-Barrantes, Guerrero-Bote and Moya-Anegón, 2013; Lancho-Barrantes, et al., 2013; Chinchilla-Rodríguez, et al., 2012). This is a size-independent indicator.
  3. Normalized Impact: Normalized Impact is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "Item oriented field normalized citation score average". The normalization of citation values is done at the individual article level. The values (in decimal numbers) show the relationship between an institution's average scientific impact and the world average, which is set to a score of 1; i.e., an NI score of 0.8 means the institution is cited 20% below the world average and 1.3 means it is cited 30% above it (Rehn and Kronman, 2008; González-Pereira, Guerrero-Bote and Moya-Anegón, 2010). This is a size-independent indicator.
  4. High Quality Publications: Ratio of publications that an institution publishes in the most influential scholarly journals of the world, i.e. those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator (Miguel, Chinchilla-Rodríguez and Moya-Anegón, 2011). This is a size-independent indicator.
  5. Specialization Index: The Specialization Index indicates the extent of thematic concentration or dispersion of an institution's scientific output. Values range between 0 and 1, indicating generalist versus specialized institutions respectively. This indicator is computed according to the Gini Index used in economics (Moed et al., 2011; López-Illescas, Moya-Anegón and Moed, 2011; Arencibia-Jorge et al., 2012). A value of 0 signifies that the data are insufficient for calculation. Note that although the resulting specialization values range between 0 and 1, they have been scaled by 100 to present a range similar to that of the rest of the indicators. This indicator is size-independent.
  6. Excellence Rate: The Excellence Rate indicates the percentage of an institution's scientific output that is included in the top 10% of the most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions (SCImago Lab, 2011; Bornmann, Moya-Anegón and Leydesdorff, 2012; Guerrero-Bote and Moya-Anegón, 2012). This is a size-independent indicator.
  7. Scientific Leadership: Leadership indicates the percentage of an institution's output in which it is the main contributor, that is, the number of papers whose corresponding author belongs to the institution (Moya-Anegón, 2012; Moya-Anegón et al., 2013). This is a size-independent indicator.
  8. Excellence with Leadership: Excellence with Leadership indicates the number of documents in the Excellence Rate in which the institution is the main contributor (Moya-Anegón et al., 2013). This is a size-independent indicator.
  9. Scientific talent pool: Total number of different authors from an institution in the total publication output of that institution during a particular period of time. This indicator is size-dependent.
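The item-oriented field-normalized citation score that underlies the Normalized Impact indicator (item 3 above) can be sketched as follows. All numbers are invented; in the real calculation the world averages are computed per field, publication year and document type from the citation database:

```python
# Hypothetical sketch of an item-oriented field-normalized citation score.
# Each paper's citations are divided by the (assumed) world average for its
# field; the institution's score is the mean of those per-article ratios.
# All numbers here are invented for illustration.
papers = [
    {"citations": 12, "world_avg": 8.0},   # cited above its field average
    {"citations": 3,  "world_avg": 6.0},   # cited below its field average
    {"citations": 10, "world_avg": 10.0},  # cited exactly at average
]

# Normalize at the individual-article level, then average.
ratios = [p["citations"] / p["world_avg"] for p in papers]
normalized_impact = sum(ratios) / len(ratios)

print(round(normalized_impact, 2))  # 1.0, i.e. exactly the world average
```

A score of 1.0 corresponds to the world average; 0.8 would mean 20% below and 1.3 would mean 30% above, as the indicator description states.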
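The Gini-style concentration measure behind the Specialization Index (item 5 above) can be sketched like this. The subject-category counts are invented, and the pairwise mean-absolute-difference form of the Gini index is an assumption; the source only says the indicator follows the Gini Index:

```python
# Hypothetical sketch of a Gini-style concentration measure over an
# institution's paper counts per subject category (invented numbers).
# A value near 0 means a generalist profile, near 1 a specialized one;
# the document says the published values are scaled by 100.
def gini(counts):
    n = len(counts)
    mean = sum(counts) / n
    # Mean absolute difference between all ordered pairs, normalized.
    diff_sum = sum(abs(a - b) for a in counts for b in counts)
    return diff_sum / (2 * n * n * mean)

generalist = [25, 25, 25, 25]  # output evenly spread across categories
specialist = [97, 1, 1, 1]     # output concentrated in one category

print(round(100 * gini(generalist), 1))  # 0.0
print(round(100 * gini(specialist), 1))  # 72.0
```

The evenly spread profile yields 0 (fully generalist) and the concentrated one a high value, matching the interpretation given in the indicator description.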

Innovation

  1. Innovative Knowledge: Scientific publication output from an institution cited in patents. Based on PATSTAT (http://www.epo.org). This indicator is size-dependent.
  2. Technological Impact: Percentage of the scientific publication output cited in patents. This percentage is calculated considering the total output in the areas cited in patents, which are the following: Agricultural and Biological Sciences; Biochemistry, Genetics and Molecular Biology; Chemical Engineering; Chemistry; Computer Science; Earth and Planetary Sciences; Energy; Engineering; Environmental Science; Health Professions; Immunology and Microbiology; Materials Science; Mathematics; Medicine; Multidisciplinary; Neuroscience; Nursing; Pharmacology, Toxicology and Pharmaceutics; Physics and Astronomy; Social Sciences; Veterinary. Based on PATSTAT (http://www.epo.org). This indicator is size-independent.
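The Technological Impact calculation can be sketched as a filtered percentage. The paper records are invented, and only a subset of the listed subject areas is used for brevity; the key point is that the denominator is restricted to output in the patent-cited areas:

```python
# Hypothetical sketch of the Technological Impact percentage: the share of
# an institution's output, restricted to the subject areas cited in patents,
# that is itself cited in at least one patent. All records are invented.
PATENT_AREAS = {"Chemistry", "Medicine", "Engineering"}  # subset, for brevity

papers = [
    {"area": "Chemistry",   "cited_in_patent": True},
    {"area": "Medicine",    "cited_in_patent": False},
    {"area": "Engineering", "cited_in_patent": True},
    {"area": "Arts",        "cited_in_patent": False},  # excluded area
]

# Denominator: only output in the patent-cited areas counts.
eligible = [p for p in papers if p["area"] in PATENT_AREAS]
cited = sum(p["cited_in_patent"] for p in eligible)
technological_impact = 100 * cited / len(eligible)

print(round(technological_impact, 1))  # 66.7, i.e. 2 of 3 eligible papers
```

Excluding areas never cited in patents keeps the ratio from penalizing institutions with large output in fields that patents do not draw on.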

Web

  1. Web size: Number of pages associated with the institution's URL according to Google (https://www.google.com). This indicator is size-dependent.
  2. Domain’s inbound links: Number of incoming links to an institution’s domain according to ahrefs (https://ahrefs.com). This indicator is size-dependent.

Bibliography

  • Arencibia-Jorge, R., Vega-Almeida, R. L., Chinchilla-Rodríguez, Z., Corera-Álvarez, E., Moya-Anegón, F. (2012) Patrones de especialización de la investigación nacional sobre Salud. Revista Cubana de Salud Pública, 38 (5). http://dx.doi.org/10.1590/S0864-34662012000500007
  • Bornmann, L., De Moya Anegón, F., Leydesdorff, L. (2012) The new Excellence Indicator in the World Report of the SCImago Institutions Rankings 2011. Journal of Informetrics, 6 (2), pp. 333-335. http://dx.doi.org/10.1016/j.joi.2011.11.006
  • Chinchilla-Rodríguez, Z., Benavent-Pérez, M., Miguel, S., Moya-Anegón, F. (2012) International Collaboration in Medical Research in Latin America and the Caribbean (2003-2007). Journal of the American Society for Information Science and Technology, 63 (11), pp. 2223-2238. http://dx.doi.org/10.1002/asi.22669
  • González-Pereira, B., Guerrero-Bote,V., Moya-Anegón, F. (2010). A new approach to the metric of journal’s scientific prestige: The SJR indicator. Journal of Informetrics, 4(3), pp. 379–391. http://dx.doi.org/10.1016/j.joi.2010.03.002
  • Guerrero-Bote, V.P., Moya-Anegón, F. (2012) A further step forward in measuring journals' scientific prestige: The SJR2 indicator. Journal of Informetrics, 6 (4), pp. 674-688. http://dx.doi.org/10.1016/j.joi.2012.07.001
  • Guerrero Bote, V.P., Olmeda-Gomez, C., De Moya-Anegon, F. (2013) Quantifying the benefits of international scientific collaboration. Journal of the American Society for Information Science and Technology, 64 (2), pp. 392-404. http://dx.doi.org/10.1002/asi.22754
  • Lancho-Barrantes, B.S., Guerrero-Bote, V.P., de Moya-Anegón, F. (2013) Citation increments between collaborating countries. Scientometrics, 94 (3), pp. 817-831. http://dx.doi.org/10.1002/asi.22754
  • Lancho-Barrantes, B. S., Guerrero-Bote, V. P., Chinchilla-Rodríguez, Z., Moya-Anegón, F. (2012) Citation Flows in the Zones of Influence of Scientific Collaborations. Journal of the American Society for Information Science and Technology 63 (3), pp. 481-489. http://dx.doi.org/10.1002/asi.21682
  • Lopez-Illescas, C., de Moya-Anegón, F., Moed, H.F. (2011) A ranking of universities should account for differences in their disciplinary specialization. Scientometrics, 88 (2), pp. 563-574. http://dx.doi.org/10.1007/s11192-011-0398-6
  • Miguel, S., Chinchilla-Rodríguez, Z., Moya-Anegón, F. (2011) Open Access and Scopus: A New Approach to Scientific Visibility From the Standpoint of Access. Journal of the American Society for Information Science and Technology, 62 (6), pp. 1130-1145. http://dx.doi.org/10.1002/asi.21532
  • Moya-Anegón, F., Chinchilla-Rodríguez, Z., Vargas-Quesada, B., Corera-Álvarez, E., González-Molina, A., Muñoz-Fernández, F. J., Herrero-Solana, V. (2007) Coverage analysis of SCOPUS: a journal metric approach. Scientometrics, 73 (1), pp. 57-58. http://dx.doi.org/10.1007/s11192-007-1681-4
  • Moed, H.F., Moya-Anegón, F., López-Illescas, C., Visser, M. (2011). Is concentration of university research associated with better research performance? Journal of Informetrics. 5 (4) 649-658. http://dx.doi.org/10.1016/j.joi.2011.06.003
  • Moya-Anegón, F. (2012) Liderazgo y excelencia de la ciencia española. Profesional de la Información, 21 (2), pp. 125-128. http://dx.doi.org/10.3145/epi.2012.mar.01
  • Moya-Anegón, F. (dir.), Chinchilla-Rodríguez, Z. (coord.), Corera-Álvarez, E., González-Molina, A., Vargas-Quesada, B. (2013) Principales Indicadores Bibliométricos de la Actividad Científica Española: 2010. Madrid: Fundación Española para la Ciencia y la Tecnología.
  • Moya-Anegón, F. (dir.), Chinchilla-Rodríguez, Z. (coord.), Corera-Álvarez, E., González-Molina, A., Vargas-Quesada, B. (2013) Excelencia y liderazgo de la producción científica española 2003-2010. Madrid: Fundación Española para la Ciencia y la Tecnología.
  • Rehn C, Kronman U. (2008) Bibliometric handbook for Karolinska Institutet. Karolinska Institutet University Library. Version 1.05.
  • Romo-Fernández, L.M., Lopez-Pujalte, C., Guerrero Bote, V.P., Moya-Anegon, F. (2011). Analysis of Europe's scientific production on renewable energies. Renewable Energy, 36 (9), pp. 2529-2537. http://dx.doi.org/10.1016/j.renene.2011.02.001
  • Moya-Anegón, F., Guerrero-Bote, V., Bornmann, L., Moed, H. (2013) The research guarantors of scientific papers and the output counting: a promising new approach. Scientometrics, 97, pp. 421-434. http://dx.doi.org/10.1007/s11192-013-1046-0