Advocates of open access are quick to bemoan the 'paywall' that keeps people from reading research findings. The adoption of open-access publication does not eradicate the paywall; it merely shifts the cost burden onto researchers themselves. Open access has been around long enough for us to recognise that its cost cannot be borne by the external funding of individual research labs. There are also sincere academic-integrity concerns about scholars paying money to have their work published, especially as many open-access journals are run on a for-profit basis. In a sense, open access is - or can be - payola. The only source of integrity is the faith that the editors are acting honourably.
Although many scientific journals try to provide more detail about author contributions by requiring explicit statements, such contribution statements get much less attention than authorship order, according to new findings from a Georgia Tech-University of Passau team. The authors found that while researchers evaluating a paper consider contribution statements helpful for understanding the specific skills individual team members brought to the study, they still use author order to decipher which researchers did how much of the work and deserve most of the credit. Authorship is a topic that looms large in the minds of researchers. Publications play a major role in career advancement at universities and research institutions, and authorship order is a widely used, but imprecise, way of inferring contributions from researchers. Part of the problem with contribution statements is that they aren't always available, and when they are, the statements tend to have no uniform structure.
When faculty members write and submit articles and have them published in magazines or journals, copyright is typically transferred from the author to the publisher. This restricts readership to those with access to that specific publication. But with Open Access, that is changing. Open Access is a principle-based movement, largely driven by university professors and librarians, to transform academic publishing so that everyone - not just those affiliated with wealthy institutions - has access to high-quality information found in academic publications.
Simplifying access to the right information across the organisation has become the mantra for the successful, research-driven enterprise - but it is only the first step in an enterprise-wide knowledge management strategy. So, how do biomedical and drug discovery researchers effectively transform information into useful knowledge in the Big Data era? The answers lie in how the sheer volume of available information is harnessed and exploited. With at least 50 million scholarly journal articles already filling information pipelines, and more than 2.5 million added each year, the ways content is discovered and utilised by scientists and technologists working at millions of companies must evolve.
Universities rely on published research to bolster the school's reputation as well as the researcher or academic's own prospects. However, as jobs at premier institutions become harder to obtain, experts suggest scholars have increasingly begun submitting research to predatory journals knowing full well that they are not legitimate publications - an act experts call academic fraud because it wastes taxpayer money, chips away at scientific credibility and muddies important research, according to a recent New York Times report. Experts estimate that more than 10,000 of these journals have emerged in recent years. Many of those publications' names mimic the names of well-known journals. These journals have few expenses because they do not seriously review submitted content before publishing it online.
This white paper from INSPEC ('Cookies, fake news and single search boxes: the role of A&I services in a changing research landscape') examines the growing importance of A&I databases in an open web landscape increasingly dominated by advertising and irrelevant results. Librarians and researchers share their thoughts on how they use search tools for academic research and highlight the differences between curated resources and general search engines. The contrast between these search results demonstrates why A&I services have an important role to play in contemporary research.
It is frequently claimed that open access (OA) has the potential to increase usage and citations. This report substantiates such claims for books in particular by benchmarking the performance of Springer Nature books made OA through the immediate (gold) route against that of equivalent non-OA books. The report includes findings from both quantitative analysis of internal book data (chapter downloads, citations and online mentions) and external interviews conducted with authors and funders. This enables the comparison of actual performance with perceptions of performance for OA books.
This report is the outcome of research commissioned and funded by the four presses. It engages with usage data made available by JSTOR relating to OA books in order to assist publishers in understanding how their OA content is being used; to inform strategic decision making by individual presses in the future; and to shed light on how usage data can support the potential of open access books to reach wide audiences. Additional key aims of the research are to help inform JSTOR in the development of the JSTOR OA Books platform and of JSTOR usage reporting, ensuring that such reporting reflects the needs of OA publishers. All four publishers have contributed to a discussion of the role and practicalities of the usage reporting services provided by JSTOR.
This report examines how peer review can be improved for future generations of academics and offers key recommendations to the academic community. The report is based on the lively and progressive sessions at the SpotOn London conference held at the Wellcome Collection Conference Centre in November 2016. It includes reflections on the history of peer review, discussion of current issues such as sustainability and ethics, and a look into the future, including advances such as preprint servers and AI applications. The contributions cover perspectives from researchers, librarians, publishers and others.
Publishers wanting to develop long-term content strategies to increase the value of their scholarly book programs must consider chapter-level metadata, particularly abstracts, to stay competitive. The long-term benefits of investing in abstracts for the backlist and building production workflows into new releases are supported by publishers, aggregators, librarians, and researchers alike. As technology rapidly changes, and publishers face increased pressure to grow revenue, abstracts are a clear opportunity for publishers to meet these demands. This white paper examines the return on investment publishers in humanities and social science fields may gain by adding chapter-level abstracts and curated keywords to their metadata.
What can sales data tell us about e-book adoption and digital reading habits? In this presentation Len Vlahos, Executive Director of the Book Industry Study Group (BISG), takes a close look at book industry statistics from the publisher's perspective, identifying trends related to global e-book adoption, and answering questions about where digital reading is going, to help publishers and libraries prepare for the future.
At the annual Project Muse Publishers Meeting held in Baltimore, Todd Carpenter, Executive Director of NISO (National Information Standards Organization), shared a presentation with attendees about his organization and the projects and initiatives it is currently working on. Following what NISO is up to is a useful (and interesting) way to monitor current and emerging trends and technologies, and to see how existing standards are being adapted for the changing landscape.
The survey is a follow-up to Wiley's 2012 open access author survey and is the second such survey conducted by Wiley. The 2012 and 2013 surveys were consistent in showing authors' desire to publish in a high-quality, respected journal with a good Impact Factor, but the 2013 survey also shed light on differences between early career researchers (respondents aged 26-44 with fewer than 15 years of research experience) and their more established colleagues in their opinions on quality and licenses. Differences were also seen across funding bodies and in the funding available for open access to different author groups.
Dr. Charles Kurzman, Professor of Sociology, University of North Carolina, Chapel Hill, presented "Shifts in Scholarly Communications Among World Regions" at the OCLC Research Briefing at UNC Chapel Hill on June 7, 2013. At this event, Dr. Kurzman presented his research on changing academic attention to world regions over the past 50 years, with "attention" measured by analyzing works published about each region of the world and collected in U.S. academic libraries for each year of publication since 1958. The patterns that emerge from this research will help to inform social scientists and educational policymakers about trends and possible gaps in scholarly attention to different regions of the world.
These 201 slides from a pre-conference tutorial titled 'Introduction to Linked Open Data (LOD)' were presented on September 2, 2013 at Dublin Core 2013 (DC-2013) in Lisbon, Portugal. The instructor was Ivan Herman, Semantic Web Activity Lead at the World Wide Web Consortium (W3C). The goal of the tutorial is to introduce the audience to the basics of the technologies used for Linked Data, including RDF, RDFS, the main elements of SPARQL, SKOS, and OWL. Some general guidelines on publishing data as Linked Data are also provided, as well as real-life usage examples of the various technologies.