1. Editorial control is a critical part of open peer review
Some researchers regard public, post-publication peer review as a non-rigorous, non-structured and poor alternative to traditional peer review. Much of this might be down to the view that there are no standards and no control in a world of ‘open’, notes Jon Tennant in his post on the ScienceOpen Blog.
The blog post says (quote): For starters, without an Editor, peer review will never get done. Researchers are busy, easily distracted, and working on 1000 other things at once. Opting to go out into the world and randomly distribute your knowledge through peer review, while selfless, is actually quite a rare phenomenon. Peer review needs structure, coordination, and control. In the same way as traditional peer review, this can be facilitated by an Editor. But why should this imply a closed system? In a closed system, who is peer reviewing the Editors? What are editorial decisions based on? Why and who are Editors selecting as reviewers? … (unquote)
The full entry can be read Here.
2. Citation Networks Yield Competitive Intelligence
Citation networks can provide much more than journal metrics and rankings. Publishers should look to them for competitive intelligence, notes Phil Davis in his post on the Scholarly Kitchen Blog.
The blog post says (quote): If we focus in on this citation map, we can learn a lot more about the relationships among orthopedics journals. First, you can see how the spine journals cluster together at the west side of the map; hand journals cluster together at the east side. The journal Injury (bottom) is in close proximity to the Journal of Orthopaedic Trauma. There are a lot of clusters that make perfect sense even by those of us who know little about orthopedics. Nevertheless, there are journals that you would expect to cluster together but don’t, for example, why doesn’t the Journal of Foot and Ankle Research (north-east quadrant) cluster with other foot and ankle journals in the south-east quadrant? Why do some journals cluster near the center of the map while others are relegated to the periphery, largely unconnected through citations to other orthopedic journals? … (unquote)
The full entry can be read Here.
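Maps like the one Davis walks through are typically built from journal-to-journal citation counts and then clustered, so that journals citing each other heavily land in the same neighbourhood. As a hedged illustration only (the journal names and counts below are invented, not the data behind the post’s map), here is a minimal Python sketch using networkx:

```python
# A minimal sketch of a journal citation network, assuming made-up
# citation counts; real maps are built from JCR/Eigenfactor-style data.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical journal-to-journal citation counts (citing -> cited).
citations = {
    ("Spine", "Eur Spine J"): 420,
    ("Eur Spine J", "Spine"): 390,
    ("J Hand Surg", "Hand Clin"): 150,
    ("Hand Clin", "J Hand Surg"): 130,
    ("Injury", "J Orthop Trauma"): 210,
    ("J Orthop Trauma", "Injury"): 180,
    ("Spine", "Injury"): 12,  # a weak cross-cluster link
}

G = nx.DiGraph()
for (citing, cited), count in citations.items():
    G.add_edge(citing, cited, weight=count)

# Cluster on the undirected, weighted graph; journals that cite each
# other heavily should fall into the same community, mirroring the
# spine/hand/trauma neighbourhoods the post describes.
communities = greedy_modularity_communities(G.to_undirected(), weight="weight")
for i, community in enumerate(communities):
    print(f"cluster {i}: {sorted(community)}")
```

The interesting intelligence, as the post notes, is less the clusters that form than the journals that sit outside the cluster you would expect.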
3. Integrate to Innovate: Using Standards to Push Content Forward
While many of the traditional publishing tasks remain intact, new tasks that are much more technical in nature have changed the skill sets required of scholarly publishers. As new and developing standards and services such as Funder Identification, ORCID, CHORUS, and more come online, publishers and their vendors must integrate when they would rather innovate. The trick is in realising where integration allows more innovation, notes Angela Cochran in her post on the Scholarly Kitchen Blog.
The blog post says (quote): ORCID is attempting to solve a number of identified problems in our ecosystem but the one of most interest to publishers is the disambiguation of author names. The trick with ORCID implementation is that unlike DOIs, it requires participation by our authors, many of whom don’t understand the benefits and couldn’t care less about our plumbing problems. So we ended up in a situation where authors couldn’t see the benefits because ORCIDs weren’t appearing in the publications and profiles weren’t being automatically updated. Of course, there were no ORCIDs in the publications because the authors weren’t getting them … (unquote)
The full entry can be read Here.
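For a sense of what the “plumbing” involves on the publisher side, here is a minimal sketch of the disambiguation lookup itself, against ORCID’s public v3.0 API. Treat the details as assumptions rather than a recipe: the iD used is ORCID’s long-standing documentation example, and the JSON field names follow the public record schema as commonly documented.

```python
# A minimal sketch of the author-disambiguation lookup a publisher
# integration performs, using ORCID's public v3.0 API. The iD below
# is ORCID's well-known example record, not a real manuscript author.
import requests

ORCID_ID = "0000-0002-1825-0097"  # ORCID's documentation example iD
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/record"

resp = requests.get(url, headers={"Accept": "application/json"}, timeout=10)
resp.raise_for_status()
record = resp.json()

# The "person" section carries the disambiguated name that can be
# attached to a manuscript in place of a bare text string.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
```

The integration burden Cochran describes sits around this call, not in it: collecting iDs at submission, pushing them into the published metadata, and writing works back to author profiles.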
4. Here’s how indexing could evolve with ebooks
An exceptional indexer can bring much value to a project. For example, there’s a huge difference between simply capturing all the keywords in a book and producing an index that’s richly filled with synonyms, cross-references and related topics. And while we may never be able to completely duplicate the human element in a computer-generated index, value can be added via automated text analysis, algorithms and all the resulting tags, notes Joe Wikert in his post on the Digital Content Strategies Blog.
The blog post says (quote): This approach lends itself to an automated process. Once the logic is established, a high-speed parsing tool would analyze the content and create the initial entries across all books. The tool would be built into the ebook reader application, tracking the phrases that are most commonly searched for and perhaps refining the results over time based on which entries get the most click-thru’s. Sounds a lot like one of the basic attributes of web search results, right? Note that this could all be done without a traditional index. However, I also see where a human-generated index could serve as an additional input, providing an even richer experience … (unquote)
The full entry can be read Here.
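The loop Wikert sketches (parse the content for candidate entries, then refine the ranking from reader searches and click-throughs) is easy to approximate. Here is a minimal sketch in Python, with invented page text and an invented click log standing in for a real parsing tool and reader telemetry:

```python
# A minimal sketch of the automated indexing loop described in the post:
# extract candidate entries from the text, then boost the entries that
# readers actually search for and click. All data below is made up.
from collections import Counter
import re

pages = {
    1: "Open peer review changes editorial control and peer review.",
    2: "Citation networks reveal journal clusters and editorial patterns.",
}

# Naive candidate extraction: lowercase two-word phrases per page,
# standing in for the "high-speed parsing tool" in the post.
index: dict[str, set[int]] = {}
for page, text in pages.items():
    words = re.findall(r"[a-z]+", text.lower())
    for a, b in zip(words, words[1:]):
        index.setdefault(f"{a} {b}", set()).add(page)

# Hypothetical reader behaviour: phrases searched and clicked in the
# ebook reader application.
clicks = Counter({"peer review": 14, "citation networks": 6})

# Rank entries by click-throughs first, then alphabetically, mimicking
# the post's idea of refining the index from usage over time.
ranked = sorted(index, key=lambda phrase: (-clicks[phrase], phrase))
for phrase in ranked[:5]:
    print(phrase, sorted(index[phrase]))
```

A human-generated index could feed this same pipeline as a third input alongside the parsed text and the click log, which is the richer hybrid the post imagines.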
5. What makes research excellent? Digging into the measures aimed at quantifying and promoting research excellence.
‘Research excellence’ is a tricky concept in theory and arguably trickier to capture in practice. In his post on The Impact Blog, Toni Pustovrh shares findings from a recent study of how research is currently quantified and evaluated in Slovenia. In-depth interviews with scientists reveal a variety of views on the concept and the current mechanisms in place. The analysis suggests that neither a predominantly peer-review-based evaluation system nor one based mainly on quantitative metrics will ever be the best solution, as both have inherent problems. As one survey respondent notes, “numerics do not reveal the content”.
The blog post says (quote): Slovenia is one of the EU countries that has adopted and implemented a number of quantitative metrics in an automated system (SICRIS – Slovenian Current Research Information System) that quantifies, scores and evaluates the research bibliographies (outputs) of practically all Slovenian scientists (about 15,000). It enables the quantification of the research work of individual researchers, but also of research groups and organisations. Metrics are thus becoming a crucial R&D policy instrument in evaluating the research excellence of Slovenian scientists. They also strongly determine the allocation of project and research funds, grants and awards, as well as the employment and promotion prospects of every researcher. As stated in the Resolution on the research and innovation strategy of Slovenia from 2011, the system is intended to “encourage internationally recognisable, excellent research.” … (unquote)
The full entry can be read Here.
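To make the quantification concrete, here is a hedged sketch of the kind of points-based bibliography scoring a system like SICRIS automates. The categories and weights below are invented for illustration; the actual Slovenian methodology is considerably more detailed.

```python
# A hedged sketch of points-based research evaluation of the kind
# SICRIS automates. Categories and weights are invented here; the
# real scoring rules are far more elaborate.

WEIGHTS = {
    "article_top_quartile": 100,  # hypothetical points per Q1 article
    "article_other": 40,
    "monograph": 120,
    "conference_paper": 15,
}

def score_bibliography(outputs: dict[str, int]) -> int:
    """Reduce weighted counts of research outputs to a single number.

    This reduction is exactly what the interviewed scientists worry
    about: as one respondent puts it, "numerics do not reveal the
    content".
    """
    return sum(WEIGHTS[kind] * n for kind, n in outputs.items())

# A hypothetical researcher's output over one evaluation period.
example = {"article_top_quartile": 2, "article_other": 3, "monograph": 1}
print(score_bibliography(example))  # -> 440
```

Once such a number drives funding, grants and promotion, as the post argues it does in Slovenia, the weights themselves become de facto research policy.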
6. Opening up scientific publishing for the Flickr generation
For an aspiring scientist, being published in a credible journal is a major step towards gaining respect in the field. But for Mark Hahnel, founder and CEO of Figshare, this old system was badly in need of an update. He says the internet was built for sharing academic data, yet the way scientific papers are published has hardly changed since the early days of the printing press, notes Jon Card in his post on the Guardian blog.
The blog post says (quote): Figshare has two revenue streams: it provides universities with the means to publish online - universities are given mini Figshares which let students self-publish - and it provides cloud solutions for publishers to host and publish data. Figshare has now published 2.5m objects (papers, datasets and videos) and provides several institutions with its services including the University of Sheffield, The Royal Society, Wiley and University of Auckland. But not everyone in academia is impressed. Hahnel says younger scientists and tenured professors are positive about Figshare, but those less secure in their work are more wary … (unquote)
The full entry can be read Here.
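For readers more interested in the sharing infrastructure than the business model, here is a hedged sketch of listing public objects through the Figshare v2 API. The endpoint and field names follow Figshare’s documented public interface, but treat them as assumptions rather than a blueprint:

```python
# A minimal sketch of reading public objects from the Figshare v2 API,
# the kind of programmatic sharing the post credits Figshare with.
# No authentication is needed for public records.
import requests

resp = requests.get(
    "https://api.figshare.com/v2/articles",
    params={"page": 1, "page_size": 5},
    timeout=10,
)
resp.raise_for_status()

for item in resp.json():
    # Each public object carries a title, a DOI and a landing-page URL,
    # which is what makes these 2.5m objects citable as well as shareable.
    print(item["title"], item.get("doi"), item.get("url"))
```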