1. Open science is all very well but how do you make it FAIR in practice?
Open science is about increasing the reuse of research and making sure that publicly funded research is accessible to all. Key to achieving this is adherence to the FAIR principles: ensuring that the findings and data behind research results are findable, accessible, interoperable, and reusable. In their post on the LSE Impact of Social Sciences blog, Rachel Bruce and Bas Cordewener share findings from a recent report that takes stock of how far the FAIR principles are supporting open science in the UK and how they are understood and adopted by the research community.
The blog post says (quote): A consultation between stakeholders in the UK on the feasibility of a national GoFAIR node, and a Jisc workshop to coordinate uptake of FAIR, are just two possibilities. The Open Research Data Taskforce will also be key to taking the agenda forward, as will the European Commission's H2020 project investment in this area. The report also shines a light on the drivers for successful adoption of FAIR principles, including political, economic, social, and technological aspects. To support researchers to deliver open results and open data, the policies and technology available need to support FAIR from the outset. To ensure reproducibility of research and good research data management, which increases research impact, the FAIR principles are inseparable from open science; they are gaining enormous traction in Europe - the "Integrated advice of the Open Science Policy Platform, Recommendations FAIR Data" delivers strong recommendations … (unquote)
The full entry can be read here.
2. Guest Post: Manuscript Exchange - What MECA Can Do for the Academic Publishing World - And What it Can't
A new academic publishing initiative, the Manuscript Exchange Common Approach (MECA), was recently accepted by NISO members as a framework for best-practices development in manuscript transfer across systems. The idea behind the project is that the industry's leading technology providers will work together on a more standardised approach to the transfer of manuscripts between and among systems, such as those used by publishers and preprint servers, writes John Sack in his guest post on the Scholarly Kitchen blog.
The blog post says (quote): The support of NISO, which will provide a neutral forum for MECA to be discussed and refined, will be crucial to enable this collaboration to expand to a larger group of stakeholders. It's important that they hear the voices and concerns of all interested parties in order to continually refine MECA's standards to ensure industry relevance. Obviously industry-wide collaborations such as MECA come with a whole set of challenges that need to be overcome. With the many different stakeholders and interested parties involved, achieving agreement on a set of standards is complex, but they have made important first steps. There is already one fully-operational implementation of MECA in production, and this will serve as a base for documentation and elaboration through the NISO review and approval process … (unquote)
The full entry can be read here.
3. New Research Dispels Myths About Demand-Driven Acquisition
Demand-driven acquisition (DDA) gives libraries the power to make acquisitions triggered by patron usage and demand. The often-controversial model has been around for well over a decade, but it has been in the spotlight more frequently lately as libraries look for different ways to build their book collections and save money, as a post on the ProQuest blog discusses.
The blog post says (quote): Reasons that libraries turn to DDA include "just-in-time" access for patrons, increasing the relevance of their collections, and providing a better method of evaluation before purchase. For example, the La Trobe University Library in Australia has seen 100 percent usage of DDA-acquired content, versus 20 percent of librarian-acquired content. Not all libraries use DDA – and those who don't cite reasons like lack of budget, lack of staff time and expertise, and lack of need. But some of these libraries are considering adding DDA to their collection development strategy. DDA is used by some libraries to build small and targeted collections that meet the needs of specific populations. One university added a DDA plan just for ebooks in health sciences, specifically for remote students who do not live near campus … (unquote)
The full entry can be read here.
4. Partners Across the Accessibility Ecosystem
Accessibility compliance has been weighing heavily on academic researchers' minds lately. Everyone has the right to equal access to the information needed to complete their assignments for school. Jill Power, in her post on the EBSCOpost blog, explores the ecosystem involved in compliance and accessible content for academic researchers.
The blog post says (quote): Providing the platform for which a user finds and consumes content is another critical element in making for an accessible experience for users. Aggregators must encourage their publishing partners to provide the content in a structured format that allows for flexibility in presentation and integration with other content and assistive technology tools. All labels or tagging provided by the author must be maintained and passed through any data feeds. Once the aggregator consumes the data feeds, any labels and tagging must then be leveraged directly in the user experience of the platform. The aggregator must test the content for use with a variety of assistive technologies and users to make sure the information and presentation meets expectations. An open communication between aggregators and publishers is critical in this process to make continued improvements as technology allows … (unquote)
The full entry can be read here.