Notes

[30] The data in this edition of Indicators do not include articles from journals in professional fields. Thus the article counts reported here for past years will be slightly lower than counts reported in previous editions. See sidebar, "Bibliometric Data and Terminology."

[31] European Union (EU) data include all member states as of 2007 (see appendix table 5-33 for a list of member countries); previous editions of Indicators covered a smaller set of member countries. Thus the European Union's larger world share of S&E articles is in no small part a result of expanded EU membership. However, see the discussion of growth rates by region and country later in this section.

[32] The Asia-10 comprises China (including Hong Kong), Japan, India, Indonesia, Malaysia, the Philippines, Singapore, South Korea, Thailand, and Taiwan.

[33] Uzun (2006) describes 20 years of Turkish science and technology policies that underlie the expansion of Turkey's article output.

[34] Past editions of Indicators have discussed another use of these data: the distribution of S&E articles across fields within a country or region as an indicator of that country's or region's S&E research portfolio. Although countries and regions display somewhat different emphases in their research portfolios, these patterns are stable and change only slowly over time. See, for example, Science and Engineering Indicators 2006, figure 5-38 and appendix tables 5-44 and 5-45 (NSB 2006).

[35] The reader is reminded that the data on which these indicators are based record the nationality of the institutional addresses listed on an article. The data do not link individual authors to particular institutions, and authors working at an institution in a given country may be of any nationality. The discussion in this section is therefore based on the nationality of institutions, not of the authors themselves, and for practical purposes it makes no distinction between the nationality of institutions and the nationality of authors.

[36] Merton (1973, p. 409) points out the tension between the norms of priority and of allocating credit in science: "Although the facts are far from conclusive, this continuing change in the social structure of research, as registered by publications, seems to make for a greater concern among scientists with the question of 'how will my contribution be identified' in collaborative work than with the historically dominant pattern of wanting to ensure their priority over others in the field…It may be that institutionally induced concern with priority is becoming overshadowed by the structurally induced concern with the allocation of credit among collaborators."

[37] In this section only, author counts refer to individually listed authors of articles, not to institutional authors. Because an author may appear on more than one article per year, the same person may be counted more than once. However, because NSF does not analyze individual author names, the extent of such multiple counting is unknown.

[38] The coauthorship data discussed in this paragraph are restricted to coauthorship across the regions/countries identified in table 5-23; coauthorship among countries within the European Union, for example, is ignored. Intraregional coauthorship is discussed in the following sections.

[39] Readers are reminded that each country participating in an international coauthorship receives one full count for the article; i.e., for an article coauthored by the United States and Canada, both the United States and Canada receive a count of one. In the percentages discussed in this paragraph, the numerator for a given country pair is therefore the same for both countries; the denominators (each country's total article count) vary, which accounts for the different rates of coauthorship.

[40] Readers are reminded that the number of coauthored articles counted for each country in a pair is the same, because each country is counted once per article in these data. However, countries other than the two in the pair discussed here may also appear on the article.
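
As a purely illustrative sketch of this whole-counting convention (in Python, with hypothetical article data rather than the actual Science Citation Index/Social Sciences Citation Index files), the example below shows why the coauthored-article numerator is identical for both countries in a pair, while the coauthorship percentages differ because each country's total article count (the denominator) differs.

```python
from collections import Counter
from itertools import combinations

# Hypothetical article data: the set of countries appearing on each article.
articles = [
    {"United States", "Canada"},
    {"United States", "Canada", "United Kingdom"},
    {"United States"},
    {"United States"},
    {"Canada"},
]

country_totals = Counter()  # whole counts: one full count per country per article
pair_totals = Counter()     # one full count per country pair per article

for countries in articles:
    country_totals.update(countries)
    pair_totals.update(frozenset(pair) for pair in combinations(countries, 2))

us_canada = frozenset({"United States", "Canada"})
print(pair_totals[us_canada])                                    # 2 -- same numerator for both countries
print(pair_totals[us_canada] / country_totals["United States"])  # 0.5  (2 of 4 U.S. articles)
print(pair_totals[us_canada] / country_totals["Canada"])         # ~0.67 (2 of 3 Canadian articles)
```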

[41] Identification of the sector of the non-U.S. institution is not possible with the current data set.

[42] Readers are reminded that coauthors from different departments in an institution are coded as different institutions.

[43] See note 42.

[44] This chapter uses the convention of a 3-year citation window with a 2-year lag, e.g., 2005 citation rates are from references in articles in the 2005 tape year to articles on the 2001, 2002, and 2003 tapes of the Thomson Scientific Science Citation Index and Social Sciences Citation Index databases. Analysis of the citation data shows that, in general, the 2-year citing lag captures the 3 peak cited years for most fields, with the following exceptions: in astronomy and physics the peak cited years are generally captured with a 1-year lag, and in computer sciences, psychology, and social sciences with a 3-year lag.
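
The window arithmetic can be illustrated with a short sketch; the function below (in Python) is purely illustrative, with parameter names chosen here for convenience, and is not part of the Indicators data processing.

```python
def cited_year_window(citing_year, lag=2, window=3):
    # The most recent cited tape year is `lag` years before the citing year;
    # the window extends back a further `window` - 1 years.
    last = citing_year - lag
    return list(range(last - window + 1, last + 1))

print(cited_year_window(2005))         # [2001, 2002, 2003] -- the chapter's default convention
print(cited_year_window(2005, lag=1))  # [2002, 2003, 2004] -- astronomy and physics
print(cited_year_window(2005, lag=3))  # [2000, 2001, 2002] -- computer sciences, psychology, social sciences
```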

[45] A percentile is the citation count below which a specified percentage of articles falls; for example, the 99th percentile identifies the number of citations that 99% of articles failed to receive. Across all fields of science, 99% of articles failed to receive at least 21 citations. Matching a number of citations to a citation percentile is not precise because all articles with the same number of citations must be counted on the same side of the cutoff. Therefore, the citation percentiles discussed in this section and used in appendix table 5-38 have all been counted conservatively, and the identified percentile is in every case higher than specified, i.e., the 99th percentile is always >99%, the 95th percentile is always >95%, etc. The actual citation counts corresponding to a given percentile vary widely by field because counts were cut off to remain within the identified percentile. Using this method of counting, for example, the 75th percentile for engineering contained articles with 2 citations, whereas the 75th percentile for biological sciences contained articles with 5–8 citations.
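
One way to make this conservative counting concrete is the sketch below (in Python). The threshold rule and the sample citation counts are assumptions for illustration only; they are not NSF's actual procedure or data.

```python
from bisect import bisect_left

def conservative_threshold(citation_counts, percentile):
    # Return the smallest citation count c such that at least `percentile`
    # percent of articles received fewer than c citations.  Because every
    # article with the same citation count must fall on the same side of the
    # cutoff, the share below c typically exceeds the nominal percentile
    # (e.g., the share below the "99th percentile" cutoff is >99%).
    counts = sorted(citation_counts)
    n = len(counts)
    for c in sorted(set(counts)):
        articles_below = bisect_left(counts, c)  # articles with fewer than c citations
        if articles_below / n >= percentile / 100:
            return c
    return None  # no observed citation count meets the criterion

# With hypothetical counts, the cutoff defining the "75th percentile" group:
print(conservative_threshold([0, 0, 1, 1, 2, 3, 5, 8], 75))  # 5
```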

[46] This pattern holds for even lower citation percentiles (e.g., the 95th or 90th).

[47] The previous edition of Indicators discussed various factors that may have contributed to the rise in university patenting, including federal statutes and court decisions (see NSB 2006, pp. 5-51 through 5-53).

[48] For an overview of these developments in the 20th century, see Mowery (2002).

[49] It is unclear whether the recent downturn in patents granted to universities/colleges is a result of changes in processing at the U.S. Patent and Trademark Office (USPTO). For example, in its Performance and Accountability Report Fiscal Year 2006, USPTO reported an increase in overall applications from 2002 to 2006; a decrease in "allowed" patent applications; and an increase in average processing time from 24 to 31 months (USPTO 2006).

[50] The institutions listed in appendix table 5-40 have been reported consistently by USPTO since 1982. Nevertheless, some imprecision is present in the data. Several university systems are counted as one institution, medical schools may be counted with their home institution, and universities are credited with patents only if they are the first-named assignee on a patent; other assignees are not counted. Universities also vary in how they assign patents, e.g., to boards of regents, individual campuses, or entities with or without affiliation with the university.