China's largest funder of basic science is piloting an artificial intelligence tool that selects researchers to review grant applications, in an attempt to make the process more efficient, faster and fairer. Some researchers say the approach by the National Natural Science Foundation of China is world-leading, but others are sceptical about whether AI can improve the process. Choosing researchers to peer review project proposals or publications is time-consuming and prone to bias. Several academic publishers are experimenting with artificial intelligence (AI) tools to select reviewers and carry out other tasks, and a few funding agencies, including some in North America and Europe, have trialled simple AI tools to identify potential reviewers.
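The NSFC has not published how its tool matches reviewers to applications, but reviewer-selection systems of this kind typically score topical similarity between a proposal and each candidate reviewer's publication record. A minimal, stdlib-only sketch of that idea follows; the names, profiles, and bag-of-words approach are invented for illustration and are not the foundation's actual method.

```python
# Illustrative sketch only: ranks candidate reviewers by cosine similarity
# between bag-of-words vectors of a proposal and of reviewer keyword profiles.
import math
from collections import Counter

def vectorize(text):
    """Build a simple word-count vector from free text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors (0.0 if either is empty)."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_reviewers(proposal, reviewer_profiles):
    """Return reviewer names sorted by topical similarity to the proposal."""
    p = vectorize(proposal)
    scores = {name: cosine(p, vectorize(profile))
              for name, profile in reviewer_profiles.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical reviewers and profiles, purely for demonstration.
reviewers = {
    "Dr. Li": "graph neural networks molecular property prediction",
    "Dr. Chen": "crystal growth perovskite solar cells",
}
print(rank_reviewers("neural networks for molecular design", reviewers))
# → ['Dr. Li', 'Dr. Chen']
```

Production systems would add conflict-of-interest filtering and richer text representations, but the core step of scoring proposal-reviewer similarity is the same.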
Academic publishing is the backbone of science. Publishing papers is one of the primary ways in which scientists disseminate findings to their peers and to the general public. Academia is plagued by a 'publish or perish' ethos: the number of publications a scientist has determines important career events such as securing tenure. Publishing a scientific paper follows much the same process as publishing any magazine, only with more rigour, yet the profits garnered by scientific journals are massive. This is because the main costs of publishing are negligible: authors, reviewers, and many editors are not paid by the journals. Yet to access most publicly funded knowledge, readers are charged a hefty fee.
The University of North Carolina Press is leading an experiment to significantly lower the cost of producing scholarly books - an important step toward a sustainable open-access publishing model for monographs. Many university presses have experimented with open-access monographs, but few have transitioned away from charging fees for most work, as they are unable to do so sustainably, said John Sherer, director of UNC Press. A big part of the problem is that monographs are incredibly expensive to produce. A 2016 Ithaka S+R study found that monographs can cost anywhere from $15,140 to $129,909 to publish depending on overhead, staff time, design, production and marketing costs. In contrast, a typical science journal might charge around $2,000 to make an article free to read.
Open Access (OA) literature is published on the internet, free of most copyright and licensing restrictions. There are OA journals for new research and OA repositories that store published work. By allowing anyone with an internet connection to read and learn from a work, its impact is maximised. Anything less is a muzzle on academia, a blindfold drawn across the eyes of the citizens of the world. UCSB, along with the majority of the UC system, has embarked on a journey to fight this restrictive norm. It seeks to convert current subscription-based journals to OA, channel the money once spent on subscriptions into the creation of OA business models, and encourage institutions of scholarship everywhere to join this transition and make the world a better place.
There is a digital revolution underway. It is changing how many things are done - including scholarly publishing. The way that academic research is published, and its availability, has shifted over time. Academic and scholarly journals used to be available only in hard copy. Then came fairly ubiquitous internet access, which ushered in increasingly expensive subscription access to digital copies of journals. And then open access publishing arrived. Now, it is becoming increasingly easy and free to access academic research that was once hidden behind paywalls in specialist journals. This changing landscape prompted the Academy of Science of South Africa to carefully study the potential impact of the digital revolution on scholarly publishing.
Following the Finch Report in 2012, Universities UK established an Open Access Coordination Group to support the transition to open access (OA) for articles in scholarly journals. The Group commissioned an initial report published in 2015 to gather evidence on key features of that transition. This second report aims to build on those findings, and to examine trends over the period since the major funders of research in the UK established new policies to promote OA.
This white paper from INSPEC, 'Cookies, fake news and single search boxes: the role of A&I services in a changing research landscape', examines the growing importance of abstracting and indexing (A&I) databases in an open web increasingly dominated by advertising and irrelevant results. Librarians and researchers share how they use search tools for academic research and highlight the differences between curated resources and general search engines. The contrast between these search results demonstrates why A&I services have an important role to play in contemporary research.
It is frequently claimed that open access (OA) has the potential to increase usage and citations. This report substantiates such claims for books in particular, through benchmarking the performance of Springer Nature books made OA through the immediate (gold) route against that of equivalent non-OA books. The report includes findings from both quantitative analysis of internal book data (chapter downloads, citations and online mentions) and external interviews conducted with authors and funders. This enables the comparison of actual performance with perceptions of performance for OA books.
This report is the outcome of research commissioned and funded by the four presses. It engages with usage data made available by JSTOR for OA books in order to help publishers understand how their OA content is being used, inform future strategic decision-making by the individual presses, and shed light on how usage data can support OA books in reaching wide audiences. The research also aims to inform JSTOR's development of the JSTOR OA Books platform and of its usage reporting, ensuring that this reporting reflects the needs of OA publishers. All four publishers have contributed to a discussion of the role and practicalities of the usage reporting services provided by JSTOR.
This report examines how peer review can be improved for future generations of academics and offers key recommendations to the academic community. It is based on the lively and progressive sessions at the SpotOn London conference, held at the Wellcome Collection conference centre in November 2016. It collects reflections on the history of peer review and on current issues such as sustainability and ethics, and looks to the future, including advances such as preprint servers and AI applications. The contributions cover the perspectives of researchers, a librarian, publishers and others.
What can sales data tell us about e-book adoption and digital reading habits? In this presentation Len Vlahos, Executive Director of the Book Industry Study Group (BISG), takes a close look at book industry statistics from the publisher's perspective, identifying trends related to global e-book adoption, and answering questions about where digital reading is going, to help publishers and libraries prepare for the future.
At the annual Project MUSE Publishers Meeting held in Baltimore, Todd Carpenter, Executive Director of NISO (National Information Standards Organization), gave attendees a presentation about his organization and its current projects and initiatives. Following NISO's work is a useful (and interesting) way to monitor current and emerging trends and technologies, and to see how existing standards are being adapted to a changing landscape.
This survey is a follow-up to Wiley's 2012 open access author survey. The 2012 and 2013 surveys were consistent in showing authors' desire to publish in a high-quality, respected journal with a good Impact Factor, but the 2013 survey also shed light on differences between early-career researchers (respondents aged 26-44 with less than 15 years of research experience) and more established colleagues in their opinions on quality and licenses. Differences were also seen across funding bodies and in the funding available for open access to different author groups.
Dr. Charles Kurzman, Professor of Sociology at the University of North Carolina, Chapel Hill, presented "Shifts in Scholarly Communications Among World Regions" at the OCLC Research Briefing at UNC Chapel Hill on June 7, 2013. He presented his research on changing academic attention to world regions over the past 50 years, with "attention" measured by analyzing works published about each region of the world and collected in U.S. academic libraries for each year of publication since 1958. The patterns that emerge from this research will help inform social scientists and educational policymakers about trends and possible gaps in scholarly attention to different regions of the world.
These 201 slides, from a pre-conference tutorial titled 'Introduction to Linked Open Data (LOD)', were presented on September 2, 2013 at Dublin Core 2013 (DC-2013) in Lisbon, Portugal. The instructor was Ivan Herman, Semantic Web Activity Lead at the World Wide Web Consortium (W3C). The tutorial introduces the audience to the basics of the technologies used for Linked Data, including RDF, RDFS, the main elements of SPARQL, SKOS, and OWL. It also provides general guidelines on publishing data as Linked Data, along with real-life usage examples of the various technologies.
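The core idea behind the technologies the tutorial covers can be hinted at with a toy example: RDF models data as subject-predicate-object triples, and SPARQL queries match patterns over them. The stdlib-only Python sketch below uses invented data and greatly simplified semantics; it is not taken from the slides, which use real RDF and SPARQL syntax.

```python
# Toy illustration of RDF's triple model and SPARQL-style pattern matching.
# Identifiers like "ex:DC2013" are invented, purely for demonstration.
triples = [
    ("ex:DC2013",   "rdf:type",  "ex:Conference"),
    ("ex:DC2013",   "ex:city",   "Lisbon"),
    ("ex:tutorial", "ex:topic",  "Linked Open Data"),
    ("ex:tutorial", "ex:partOf", "ex:DC2013"),
]

def match(pattern, store):
    """Return triples matching (s, p, o); None acts like a SPARQL variable."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Rough analogue of: SELECT ?o WHERE { ex:DC2013 ex:city ?o }
print(match(("ex:DC2013", "ex:city", None), triples))
# → [('ex:DC2013', 'ex:city', 'Lisbon')]
```

Real Linked Data adds globally unique IRIs, vocabularies (RDFS, SKOS, OWL), and inference on top of this basic triple-matching idea.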