Science and Research Content

Blogs selected for the week of Mar 5 to Mar 11, 2018:



1. A Curious Blindness Among Peer Review Initiatives

The world of scholarly communications is awash with innovation around peer review. There is, however, a worrying thread running through many of these initiatives: the common refrain that academics should take back control of peer review, which carries the heavy implication that journal staff and publishers add literally nothing to the process because volunteer reviewers and editors do all of the work, writes Tim Vines in his post on the Scholarly Kitchen blog.

The blog post says (quote): PCI recommenders could be asked to check through each new submission for missing figures and so forth, and they might be able to manage (without complaining) a few manuscripts per month. Any more than that and their volunteer enthusiasm will evaporate, so the checks don't get done. PCI could bring in more recommenders as submissions increase, so that each only handles a few papers. Then the problem becomes consistency: not all academics would be equally diligent with these checks, and disasters-in-waiting will slip through. The most logical approach would be to pay someone to consistently do these checks before the manuscripts go to the recommenders. While they’re at it, this person could chase up late reviewers, sort out issues with figures, and spell check the decision letters..........(unquote)

The full entry can be read here.

2. Europe set to miss flagship open access target

The European Union is set to miss its target of having all scientific research freely available by 2020, as progress towards open access hits a "plateau" because of deeper problems in how research is assessed. Despite some progress, researchers are still reluctant to switch journals for fear it could hinder their careers, notes David Matthews in his post on the Times Higher Education blog.

The blog post says (quote): The biggest barrier to publishing in an open access repository was the “high priority given to publishing in conventional journals”, a hindrance cited by more than eight in 10 universities. "Concerns about the quality of open access publications" were also mentioned by nearly 70 per cent of respondents. In some disciplines, to publish open access, you have to be a believer or activist and it comes at the risk of damaging your own career. Echoing a long-standing concern in science, we need a whole new system of research assessment that does not rely so heavily on citations and impact factors. The EU's flagship Horizon 2020 funding scheme requires grant recipients to publish their findings openly, but this was a far from universal policy for national funding bodies.........(unquote)

The full entry can be read here.

3. Submit your manuscript via ScienceOpen

Choosing a journal in which to publish one's research is not easy. Among thousands of journals, one must decide which will give their work the best visibility. ScienceOpen can make this easier with a "Submit a manuscript" button, notes Stephanie Dawson in her post on the ScienceOpen blog.

The blog post says (quote): The statistics of a journal on ScienceOpen can also be checked (number of articles added over time, number of views over time, number of shares…). It is a great tool to get an overview of the activity of a journal, and it also allows comparison with the activity of journals in the same field on ScienceOpen. Moreover, following a featured collection will provide you with an update whenever new content is added. ScienceOpen uses the context of a body of scholarly articles to make information more accessible and interactive. This new "Submit a manuscript" feature, paired with the intuitive interface of ScienceOpen and the insights provided by the data, can save time for researchers in making an informed decision about where to publish their next paper.........(unquote)

The full entry can be read here.

4. False investigators and coercive citation are widespread in academic research

A recent study has revealed widespread unethical behaviour in academic research. In his post in the LSE Impact of Social Sciences Blog, Allen Wilhite focuses on two activities in particular: the addition to funding proposals of investigators who are not expected to contribute to the research, and editors who coerce authors into adding citations to manuscripts even though those citations were not part of the scholars' reference material.

The blog post says (quote): Coercive citation has been defined as editors who direct authors with manuscripts under review to add citations to articles from the editor's journal even though there is no evidence that references are lacking. These editors do not refer to a stream of research that has been overlooked nor do they mention specific manuscripts, in fact their only guidance is that the articles be from the editor's journal, sometimes insisting that the citations refer to recent issues of their journal. More than 35 percent of all academics responding to the survey were aware that some editors coerce and more than 14 percent have been coerced. Coercive citation seems to be an attempt to increase a journal's impact factor score which, despite its recognised shortcomings, has become the primary measure of journal quality. But this effort is a zero-sum game; that is, as one journal moves up in the rankings by coercing its authors, other journals which play ethically lose ground.........(unquote)

The full entry can be read here.
