Authors based in China contributed 8.5 percent of all research papers published in Nature-branded journals in 2012, up 35 percent on 2011 figures. This is according to the Nature Publishing Index 2012 (NPI) China, published as a supplement to Nature. Authors from institutions in China contributed 303 papers published in Nature-branded journals in 2012, up from 7.0 percent (225) in 2011 and 5.3 percent (152) in 2010. In 2000, just six articles published in Nature-branded journals had co-authors from institutions in China.
The data released in the NPI adds to evidence that China is rapidly boosting its quality research output and becoming a global leader in scientific publishing and scientific research. A global analysis will be released in June 2013. China is expected to have made gains in 2012 against nations that traditionally lead in scientific output.
The supplement offers insights into how national investments, institutions and cities have contributed to China's rapid scientific expansion.
The top two institutions remain stable from 2011 to 2012: the Chinese Academy of Sciences (CAS) leads, followed by the University of Science and Technology of China (USTC). Tsinghua University, Peking University, and Shanghai Jiao Tong University (SJTU) complete the top five. In sixth place, BGI was a strong performer in 2012, up from tenth in 2011. An analysis in the NPI indicates that SJTU and Zhejiang University (seventh in 2012, up from eleventh in 2011) are rapidly growing their high-quality research output. The NPI also provides indicators that China, traditionally strong in the physical sciences, is making gains in high-quality life sciences research.
The Nature Publishing Index 2012 China supplement also presents a ranking by city. Beijing continues to dominate, followed strongly by Shanghai. Hefei, Hong Kong and Wuhan round out the top five cities.
The NPI measures the output of research articles from nations and institutions in terms of publications in the 18 Nature-branded primary research journals in 2012. The ranking is a snapshot based on papers published in 2012, with 2008-2011 data also included to show trends. The index, updated weekly, is available at www.natureasia.com/publishing-index/china, where the Nature Publishing Index 2012 China supplement is also published online.
The Optical Society (OSA) and Scholarly iQ have announced that OSA's usage statistics for its portfolio of peer-reviewed journals have been upgraded and are now available under the new COUNTER Release 4 (R4) standards. Like their predecessors, these standards are accepted by the scientific publishing industry worldwide and govern the recording and exchange of online usage data.
The new COUNTER 4 reports are the latest set of usage reports that can be used by librarians to ensure the independent and accurate reporting of publishers' usage statistics.
New features in the COUNTER Release 4 standards include: a single, integrated Code of Practice covering journals, databases, books, reference works and multimedia content; the ability to report usage of Gold Open Access articles separately via the new 'Journal Report 1 GOA'; an expanded 'Journal Report 2', which adds information on denied-access and unlicensed content to the 'Turnaways' data covered by earlier COUNTER releases; a modified 'Journal Report 5', which reports full-text article requests by year and by journal from a library's acquired archival content; a new multimedia report covering the usage of non-textual resources such as audio, video and images; and the ability to view usage statistics for specified date ranges depending on the library's needs.
Publisher John Wiley & Sons, Inc. has launched a trial of Altmetric, a service that tracks and measures the impact of scholarly articles and datasets in both traditional and social media. The six-month trial will run on a number of subscription and open access journals published by Wiley, including Advanced Materials, Angewandte Chemie, BJU International, Brain and Behavior, Methods in Ecology and Evolution and EMBO Molecular Medicine.
As part of the trial, Altmetric will track social media sites such as Twitter, Facebook, Google+ and Pinterest, as well as blogs, newspapers, magazines and online reference managers such as Mendeley and CiteULike, for mentions of scholarly articles published in the journals included in the trial. Altmetric will create and display a score for each article measuring the quality and quantity of attention that the article has received. The Altmetric score is based on three main factors: the number of individuals mentioning a paper, where the mentions occurred, and how often the author of each mention talks about the article.
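The three factors described above can be illustrated with a toy scorer. Note that Altmetric's actual algorithm is proprietary; the source weights and the repeat-mention dampening below are invented purely to show the shape of such a weighted attention score, not the real calculation.

```python
# Hypothetical per-source weights: where a mention occurs matters.
# These values are illustrative only, not Altmetric's real weights.
SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "twitter": 1.0, "facebook": 0.25}

def toy_attention_score(mentions):
    """mentions: list of (source, author_id) tuples.

    Combines the three factors the article names: how many people
    mention the paper, where each mention occurs, and how often the
    same author keeps mentioning it (repeat mentions count for less).
    """
    seen = {}  # author_id -> number of mentions counted so far
    score = 0.0
    for source, author in mentions:
        repeats = seen.get(author, 0)
        # Dampen repeat mentions from the same author.
        score += SOURCE_WEIGHTS.get(source, 1.0) / (1 + repeats)
        seen[author] = repeats + 1
    return round(score, 2)

# One news story plus two tweets from the same person:
# 8.0 + 1.0 + 0.5 = 9.5
print(toy_attention_score([("news", "a"), ("twitter", "b"), ("twitter", "b")]))
```

A single newspaper mention here outweighs several tweets, reflecting the idea that the venue of a mention matters as much as the raw count.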
Article level metrics are emerging as important tools to quantify how individual articles are shared, used and discussed. These are being used in conjunction with more traditional metrics focused on long-term impact of a collection of articles found in a journal based on the number of citations.
Scientists, journal editors and publishers, scholarly societies, and research funders across many scientific disciplines have posted an international declaration calling on the world scientific community to eliminate the role of the journal impact factor (JIF) in evaluating research for funding, hiring, promotion, or institutional effectiveness. The San Francisco Declaration on Research Assessment, or DORA, was framed by a group of journal editors, publishers, and others convened by the American Society for Cell Biology (ASCB) in December 2012 in San Francisco, during the Society's Annual Meeting.
The San Francisco group agreed that the JIF, which ranks scholarly journals by the average number of citations their articles attract in a set period, has become an obsession in world science. Impact factors warp the way that research is conducted, reported, and funded. Over five months of discussion, the San Francisco declaration group moved from an 'insurrection,' in the words of one publisher, against the use of the prominent two-year JIF to a wider reconsideration of scientific assessment.
The DORA statement makes 18 recommendations for change in the scientific culture at all levels - individual scientists, publishers, institutions, funding agencies, and the bibliometric services themselves - to reduce the dominant role of the JIF in evaluating research and researchers and instead to focus on the content of primary research papers, regardless of publication venue.
The declaration is timed to coincide with editorials in scientific journals worldwide, including an endorsement of DORA by Bruce Alberts, Editor-in-Chief of Science magazine, in the journal's May 17th issue. Other editors signing DORA represent the Journal of Cell Biology (JCB), Traffic, Genetics, eLife, Journal of Cell Science, Aging Cell, Molecular Biology of the Cell (MBoC), BioArchitecture, The EMBO Journal, Journal of Surfactants & Detergents, Cell Structure and Function (Japan), Lipids, Genes, Journal of the Electrochemical Society, and Development. A complete list of signatories to date is available at http://www.ascb.org/SFdeclaration.html.
The San Francisco declaration cites studies that outline known defects in the JIF: distortions that skew results within journals, gloss over differences between fields, and lump primary research articles in with much more easily cited review articles. Further, the JIF can be 'gamed' by editors and authors, while the data used to compute the JIF 'are neither transparent nor openly available to the public,' according to DORA.
ORCID and ISNI have issued a joint statement on the need for interoperation between the two organisations and on the first collaborative steps in defining system interoperability. While ORCID is an international, interdisciplinary, open, not-for-profit organisation, ISNI is a UK-based not-for-profit organisation.
According to the statement, ORCID and ISNI are separate, independent organisations that assign identifiers to individuals and use the same identifier format. While ISNI and ORCID have different missions and employ different processes for assigning identifiers and linking people with their works and their affiliated institutions, the points of overlap between them justify investigating options for interoperability. ORCID and ISNI agreed on the need to aim for interoperation through linking and sharing of public data between the two systems. In addition, the two are committed to investigating the feasibility of a shared identifier scheme, with a single number representing an individual in both the ORCID and ISNI databases.
ISNI is an ISO certified global standard for identifying the millions of contributors to creative works and those active in their distribution, including writers, artists, creators, performers, researchers, producers, publishers and aggregators. It is part of a family of international standard identifiers that includes identifiers of works, recordings, products and right holders in all repertoires, e.g. DOI, ISAN, ISBN, ISRC, ISSN, ISTC, and ISWC.
ORCID is an open, non-profit, international, and interdisciplinary community-based effort to provide a self-claim registry of unique researcher identifiers. To ensure the identifier links researchers with their works, ORCID works with the research community to embed these identifiers in workflows, such as manuscript submission, grant application, and dataset deposition. ORCID is stated to be unique because of its direct relationship with researchers and with organisations throughout the research community.
ISNI manages a disambiguation process based on data matching between authoritative sources, and has a broader remit than the researcher community, including rights management. In addition to identifying individuals, ISNI assigns identifiers to organisations, including research institutions. Through its core relationship with the VIAF database, ISNI provides the prospect of widespread diffusion of identifiers into the databases of hundreds of national and major university research libraries worldwide. The ISNI database is also rich in other data concerning researchers. Among the assigned ISNIs are 396,618 identities from six research-based sources: the American Musicological Society, British Library Theses, JISC Names (UK), the Modern Languages Association, ProQuest Theses, and Scholar Universe. More rich researcher data is being added from OCLC theses and ZETOC, among other sources. Assignment is made when two independent sources have data in agreement, or where a name is unique.
ORCID has reportedly gained worldwide interest from the research community since its launch in October 2012. The registry has over 100,000 unique iDs and is growing by about 5,000 researchers each week, through a combination of direct registrations and registrations through external systems that have embedded the iD, such as manuscript submission systems, repositories, and associations with external identifiers. More than a quarter of registered researchers are stated to have enriched their records with additional metadata about their publications and research works.
ORCID and ISNI have already taken a first collaborative step in defining system interoperability. The ORCID iD is compatible in format with the ISNI ISO standard (ISO 27729). The ORCID Registry randomly assigns ORCID iDs from a block of numbers set aside for it by the ISNI International Agency, which avoids the same number being assigned to different people.
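The format compatibility is concrete: an ORCID iD, like an ISNI, is a 16-character identifier (displayed as four groups of four) whose final character is a check digit computed with the ISO 7064 MOD 11-2 algorithm. A minimal validation sketch, based on ORCID's published description of the iD structure:

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the first
    15 digits of an ORCID iD (hyphens already removed)."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    # A result of 10 is written as 'X', as in ISNIs and ISBNs.
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD such as '0000-0002-1825-0097'."""
    chars = orcid.replace("-", "")
    if len(chars) != 16 or not chars[:15].isdigit():
        return False
    return orcid_check_digit(chars[:15]) == chars[15]

# ORCID's own documentation uses 0000-0002-1825-0097 as a sample iD.
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

Because both systems share this structure and ISNI reserves a number block for ORCID, a single checksum routine can validate identifiers from either registry.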