Monday, 15 December 2014

JISC Monitor, a webinar, an update and some commentary

I’ll try to keep this brief. An update on this will be forthcoming on the JISC Monitor blog, and the audio of the webinar will also be made available. For details on the project so far, visit the blog here and the space on GitHub here. The project aims to meet JISC requirements on monitoring open access/APC expenditure, using a format that is achievable and makes best use of the data capture available.

Monitor so far

There are two parts to this project:

  1. Registry of all UK research publications
  2. APC administration system at institutional level

Outcomes of the project so far include:

  • GUIDE ‘Getting useful IDs early’ 
  • Best ways of capturing and storing data, e.g. when is it available in the overall ecosystem
  • Ways of improving data over time

Monitor Publications Registry (Richard Jones of Cottage Labs)

A prototype is available for online testing here.

So far Monitor has been working with aggregated data provided by Stuart Lawson through the JISC APC spreadsheet. The prototype allows you to break down expenditure by publisher/publication and this can be limited to institutional level. You can see the average APC price as well as min/max/avg spent against other institutions.
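The kind of aggregation the prototype performs can be sketched in a few lines of Python. This is purely illustrative: the field names and figures below are invented, not those of the actual JISC APC spreadsheet.

```python
from collections import defaultdict

# Illustrative records; the real data comes from the JISC APC spreadsheet
apcs = [
    {"institution": "Inst A", "publisher": "Publisher X", "amount": 1500.0},
    {"institution": "Inst B", "publisher": "Publisher X", "amount": 2100.0},
    {"institution": "Inst A", "publisher": "Publisher Y", "amount": 900.0},
    {"institution": "Inst B", "publisher": "Publisher Y", "amount": 1200.0},
]

def expenditure_by(records, key):
    """Group APC payments by `key` and report min/max/avg spend per group."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["amount"])
    return {
        k: {"min": min(v), "max": max(v), "avg": sum(v) / len(v)}
        for k, v in groups.items()
    }
```

Grouping by `"institution"` instead of `"publisher"` gives the institution-level breakdown the prototype offers, which is what makes the cross-institution average comparison possible.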

This raises some interesting questions, such as why the average price is not consistent across different institutions; the answers might help form a basis for discussion at future negotiations.
In the short term they are focusing on three reports that they believe will be of most interest and use to stakeholders:

  1. Expenditure by publisher
  2. Expenditure by institution
  3. Gold publications and compliance
Spend across institutions

Min/Max and Avg APC Article Cost


This data has been run against the Sherpa FACT API for compliance information. The preconditions for checking a record require a FACT funder (RCUK/COAF) along with a publication. This has already revealed a large number of compliant green options available for these publications.
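That precondition logic is simple to express. The sketch below mimics it with invented field names; the actual FACT lookup and record structure are assumptions, not the real API.

```python
# Funders Sherpa FACT can currently answer for (per the webinar)
FACT_FUNDERS = {"RCUK", "COAF"}

def fact_checkable(record):
    """True when a record meets the preconditions for a FACT compliance
    lookup: a FACT-supported funder plus an identifiable publication."""
    return record.get("funder") in FACT_FUNDERS and bool(record.get("journal"))

records = [
    {"journal": "Journal of Examples", "funder": "RCUK"},
    {"journal": "Journal of Examples", "funder": None},
    {"journal": None, "funder": "COAF"},
]
checkable = [r for r in records if fact_checkable(r)]
```

Only records passing this gate would be sent on for a compliance check; the rest lack the funder or publication detail FACT needs.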










APC administration

Workflow

Lots of work is being done on scoping requirements and identifying processes. The project is primarily centred on tracking OA journal publications. So far there is no software, but there is a working specification that should address some of the problems facing administrators in managing OA publishing grants.

The scope of the project is to:

  • Ensure information from the publication of open access outputs is captured correctly
  • Ensure APC funding is used appropriately
  • Ensure appropriate decisions are made
  • Support internal/external reporting, requiring data output in a flexible manner
  • Cover outputs other than journal articles

This flow diagram describes the overall process of publication and payment, and identifies points of data collection. It was noted that in practice administrators often enter this workflow at different points, e.g. when the author requires an invoice to be processed for payment.

Conceptual data model





This conceptual data model illustrates how this workflow can be captured. The dark blue tables show that academic output (and its associated cost) is at the heart of the data captured by the system. Other tables are designed to handle associated administration tasks concerning payment.

The core requirements are that the system is centred on outputs and tasks. Data capture and entry must be easy and, where possible, make use of local finance systems, Crossref, ORCID, Sherpa, etc. The system must be locally configurable to meet requirements at institutional level.
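A minimal sketch of that output-and-task-centred model, using hypothetical field names (the real specification is far richer and includes the payment and compliance tables):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Task:
    """An administrative task attached to an output, e.g. processing an invoice."""
    name: str
    done: bool = False

@dataclass
class AcademicOutput:
    """The academic output (and its cost) sits at the heart of the model;
    tasks and payment details hang off it."""
    title: str
    doi: Optional[str] = None        # could be populated from Crossref
    apc_amount: Optional[float] = None
    tasks: List[Task] = field(default_factory=list)

    def outstanding(self) -> List[str]:
        """Names of tasks still to be completed for this output."""
        return [t.name for t in self.tasks if not t.done]

article = AcademicOutput(title="An example article", apc_amount=1700.0)
article.tasks = [Task("check licence"), Task("process invoice", done=True)]
```

Centring everything on the output record is what lets the wireframes described later filter, sort and track task progress per article.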

Wireframes

Numerous prototype wireframes for user interaction (forms etc.) have been developed and will be made available soon. One shows a list of APCs, i.e. the academic outputs that have been submitted to the system; all attributes can be used to filter and sort this list as required.



Filter and sort list as required

This screen illustrates the ability to track progress against tasks.

Note the tabs which break down administration

Going into a record shows a breakdown of tasks by authors/article/funding/finance/compliance. This was influenced by in-house systems developed by UCL and Imperial, as well as other use cases and workshops in the specification development. They are also following the ‘End-To-End’ and ‘Good Practice’ JISC pathfinders.

My comments

Monitor Publications Registry

The data this is based on suggests a focus on COAF and RCUK funders, although I understand that HEFCE is also a key consideration. Sherpa FACT, however, is only applicable to COAF and RCUK, so it's not yet clear how the project will make use of RoMEO data in order to monitor REF compliance. Surely repository information will also be a requirement here?

Average cost does not appear to distinguish between use of prepayment accounts, memberships or NESLi2 discounts.

The method of aggregation is not yet clear. Where and how will this data be sourced? How will they deal with duplication? How will GUIDE be used?

APC administration

The framework for the data capture seems sound, but it's not clear how external data sources will be used to standardise the process (lists of publishers/funders etc.). It is also unclear how the system would interact with internal college finance systems for payment.

How will the system record the negotiation that sometimes goes on between authors/publishers/librarians for compliant archiving of papers into required subject repositories?

Will the funder entity also include subject repository requirements? Can the system check whether these have been met, or assist with this aspect of compliance?




Tuesday, 2 December 2014

Revitalising university-based publishing: the Ubiquity Network

These are my notes on the UKSG Forum 2014 presentation by Brian Hole, CEO and Founder Ubiquity Press Ltd
http://www.ubiquitypress.com/

Ubiquity Press is an open access publisher of peer-reviewed academic journals, books and data. They operate a model that aims to make quality open access publishing affordable for everyone.  They provide the infrastructure and services to enable university and society presses to run sustainably and successfully.

UP is a researcher-led press. Many of the personnel who make up the leadership of the company are current or former researchers from a variety of disciplines. UP wants to help bring publishing back to universities using a model that is low-risk and sustainable.

This combination of publishing and research experience gives them a strong sense of the priorities for the services they are developing. For example, they are also focused on research data and on ways to ensure it is publishable and citable. UP wants to build a structure that supports researchers and their chosen mechanisms for the communication of their ideas. As researchers, they insist that all outputs they publish are made open access at the point of publication, and this is a cornerstone of their business model. Closed-access publication is considered a form of scientific malpractice by UP.

There are immediate benefits to authors and the general public that come from backing an open model. Among other measures, UP makes use of altmetrics to track the narratives of output reuse in real time. This is of great interest to the research communities they support. By applying this data to their published outputs, the UP system helps to incentivise research dissemination.

So far the costs for a monograph are between £3,000 and £10,000, and they are working hard to keep costs as low as possible in order to ensure these services are available to universities. They expect these costs to be met from open access publishing budgets within institutions. UP services handle all aspects of the infrastructure and allow the partnered institution to overlay its branding and style requirements. This helps to boost the online profile and reputation of the institution. The metadata can be pushed through to a research management system and the full text can be pushed through to the institutional repository, if required. Production is also handled by a dedicated team of professionals, meaning quality is consistent and assured - enabling access across different platforms and conformity with accessibility standards.

Costs will be kept down by the size of the ecosystem that UP hopes to inhabit. Economies of scale mean that as more institutions sign up, the costs of production and dissemination across their workflow should fall.

The UP system also works with the ethics committees of individual universities to ensure a rigorous peer review process that can be shared across all partnered institutions. This means that standards of academic quality are maintained by the community at large. The network of institutions is growing steadily, and they aim to have 30-40 signed up by the end of 2015.

They see themselves as a community of presses and believe that by working together they will be able to maximise the reach and impact of the published outputs of their researchers.

Monday, 1 December 2014

Taking the barriers away with Freemium access publishing

These are my notes on the UKSG Forum 2014 presentation by Francois Barnuad, Head of Marketing at OECD.

With the growing costs to authors, funders and institutions during the transition to 100% open access for scholarly research, there are some who believe this is creating an unfair, two-tier system of scholarly exchange. OECD is proposing a new model that they term "freemium open access", that they believe will provide a fairer means of open dissemination for research for a greater proportion of authors.
http://www.slideshare.net/tobygreen/learning-to-let-go-ssp-bostonfinal

Open access in scholarly publishing creates something of a paradox: to maximise the dissemination of an output it must be free to access at the point of publication, yet the associated costs of publication, dissemination and preservation must still be covered.

Green models advocate the use by authors of institutional and subject repositories. However, due to copyright restrictions, the end user's experience of accessing a research output via a repository is often not the best. This is important when you consider the impact it may have on the end user's assessment of the quality of a publication. It's also difficult to accurately quantify the true costs of repositories. In most cases these are met by the institution, where, depending on its size, sustaining funding for such initiatives may be a significant challenge.

Gold models advocate immediate, high quality access and dissemination of research; however the costs are so substantial that it currently excludes a large majority of scholars. Neither they nor their institutions are in a position to subsidise (particularly hybrid) publishing models alongside other financial demands, for example library subscriptions.

OECD argues that both models offer major drawbacks. The current state of the publishing ecosystem during this transitional period means that, in the short term, neither is built to grow sustainably. Green offers no way to offset the cost to the institution and Gold is out of reach of most authors with no financial backing.

Freemium offers publishers the opportunity to identify different kinds of users within their business model, and therefore the opportunity to offer different levels of service at different rates.

A scholarly publishing model must begin on the premise that all users are able to access content for free at the point of access. The next level of the model must offer enough benefits to encourage visitors to become “registered users”. At this point users are identifiable and can be offered services and content tailored to their profile type. After a certain period of usage a certain percentage of registered users will move to a “registered paid” model, and this is how operational and distribution costs are met. Once they are identified, the publisher now has the opportunity to offer different rates to certain user profiles - for example free access for students.
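The tiered logic described above can be sketched as a toy pricing function. The tier names follow the model in the text; the rates and the student discount rule are invented for illustration.

```python
# Invented rates for illustration; tier names follow the freemium model above
TIERS = {
    "visitor": 0.0,           # everyone reads for free at the point of access
    "registered": 0.0,        # identified users, still free, tailored services
    "registered_paid": 50.0,  # premium services that cover operational costs
}

def rate_for(tier, profile=None):
    """Once users are identified, publishers can offer different rates to
    certain profiles - e.g. free access for students."""
    if profile == "student":
        return 0.0
    return TIERS[tier]
```

The point of the model is that the small "registered paid" tier funds free access for everyone else, so the function returns zero for every tier except the paid one.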

All content must be built toward a freemium audience who expect all content to be free at the point of access, thanks to the success of this model in other sectors - i.e. the entertainment industry. It should be pointed out that the next generation of researchers will have been exposed to this kind of model for a majority of their academic life.

In some ways the idea of freemium content holds greater weight when measured against the growing appetite for open access amongst scholarly communities. It is, after all, a form of open access that conforms to the Budapest definition of 2002. OECD propose that offering different versions for different devices is a model that charges for ‘convenience’ and could be adopted to help meet the cost of dissemination. Advertising within content is another model familiar to users that could be adapted here. The term is ‘monetize’!

There are no gold coins to buy, but freemium may help to keep the scholarly community in the driving seat in promoting their research and represents good value for money in the dissemination chain. It offers taxpayers and funders another choice.

I close with more images from "Freemium Isn’t Free" - the sixth episode of the eighteenth season of the animated television series South Park.






In Metrics we trust? HEFCE Independent review of the role of metrics in research assessment

These are my notes on the UKSG Forum 2014  presentation by James Wilsdon, Professor of Science and Democracy, SPRU, University of Sussex and Chair of the Independent Review of the Role of Metrics in Research Assessment.

HEFCE is carrying out a review on the role of metrics in the quality assessment of research. This will consider how well metrics can be used across different academic disciplines to assess the excellence of research undertaken in the higher education sector. The outcomes will be of interest to those who undertake research and will inform the work of HEFCE and the other UK higher education funding bodies as they prepare future iterations of the Research Excellence Framework. The report is due out next year, so Professor Wilsdon gave an overview of how the project is going. More details can be found here:
http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/

The report was instigated by David Willetts. Its release will now take place just after the next general election as a result of his recent departure from government.



In essence, the report is designed to investigate how new metrics might be applied to future Research Excellence Frameworks. The report has gathered lots of evidence already. HEFCE had an excellent response to a call for evidence from a variety of stakeholders across the community. HEFCE wants the review to be as open and transparent as possible so these have all been made available on the review webpages.

HEFCE are engaging with a variety of metrics across the research system and identifying ways these metrics can be used in measuring quality. One avenue of analysis is how metrics are gathered. Another area of investigation is the dynamics of 'gaming' and in particular how this can be applied to a measure and the effects on the outcome. These are all important considerations in producing a methodology for assessing the suitability of a metric.

What is already clear is the careful balance that must be struck between peer review and metrics when assessing research. Peer review is not a perfect system, and some of its problems have been illustrated across numerous case studies in recent times. See the following articles for a discussion of this:

Panel review is also tricky when undertaking a REF. HEFCE are conscious that panel review is not an idealised version of peer review, nor can it replace its rigour. The sheer volume of research assessed under the REF (within a very short timeframe) indicates the scale of the challenge faced by the assessors. Similarly, the application of additional merit is nuanced during assessment. Neither approach could be considered a perfect measure.


When questioned about approaches for measuring the quality of research, the response from publishers has been mixed. PLoS take a nuanced approach. Peer review systems differ between publishers and disciplines and this may explain the mixed feelings on metrics here. It’s difficult to justify a ‘one size fits all’.

An excellent post published on the LSE Impact Blog by Derek Sayer (Professor of Cultural History at Lancaster University) warns of the potential pitfalls and dangers of measuring research quality through assessment in this way. In his article he points to the many contradictions of the REF. He believes that whilst metrics may have problems, the panel-assessment processes of the REF, which give such great power to individual panel members, are far worse when balanced against the impact of assessment on researchers’ careers. Ultimately, he agrees that measuring research quality is fraught with difficulty.

HEFCE have run several workshops to hear about the experiences of different institutions in the run-up to the 2014 REF. These consultations took place with a range of universities of different sizes and research specialisms. One point of discussion was how they made use of metrics and whether they found them easy to use. Again, support for metrics was split across this group. The participants questioned whether the community currently has the right kind of metrics for certain disciplines (e.g. the arts), and some were sceptical of the value of metrics in certain circumstances, especially for measuring the research quality of an output.

An option that has been considered is devolving greater freedom of assessment to the panels. An interesting point raised by several VCs is that using multiple metric measures would make their job of monitoring institutional performance much harder: although potentially fairer, it would require an in-depth understanding of the functions and outputs of these measurements across disciplines.

Another consideration is to operate assessment at institutional level (through more peer review) and to do away with much of the assessment work of the panels. Again the views of the academic community appear to be split over the implementation of this proposal. HEFCE wants the evolution of research quality measures to take place with the consensus of stakeholders across the scholarly community. Academics, institutions and HEFCE have a shared commitment to the continual improvement to the research base of the UK.



Unearthing gold: hard labour for publishers and universities?

Cross posted at the UKSG LiveBlog: http://uksglive.blogspot.co.uk/

Presenter: Paul Harwood (Jisc Collections)
Blogger: David Walters (King’s College London)

Paul gave us an update on a JISC collections project he has been working on for the past year. The project was developed in the wake of the ‘Finch report’ (2012) and the subsequent introduction of the Research Councils UK (RCUK) policy (2013). The aim of the project was to provide a ‘quick and dirty’ insight as to what has been happening and the perceptions of what has been happening.

Activities around the project involved:

  • A questionnaire to RLUK members in January 2014
  • Face-to-face interviews with 7 publishers in March 2014
  • Telephone interviews with representatives from 4 European countries

RCUK is the strategic partnership of the UK's seven Research Councils. Each year the Research Councils invest around £3 billion in research covering the full spectrum of academic disciplines, from the medical and biological sciences to astronomy, physics, chemistry and engineering, social sciences, economics, environmental sciences and the arts and humanities.

Paul reflected on the statement presented to him by numerous RCUK representatives during the project:

“This is a journey not an event”

Paul also commented on the mission statement of the RCUK open access policy and noted that it was last updated in May last year. Further changes may be seen as we move further into the transition to open access publishing and the impact of the RCUK policy is realised more widely.

------------------

Questionnaire to RLUK members

Paul explained that the project targeted RLUK members because they are an excellent collection of the ‘great and the good’ research-led institutions in the UK. RLUK has more member institutions than the Russell Group, and more of its members are eligible for RCUK funding, which is why this group was chosen as the sample. There are currently 34 members (and growing); 28 are in receipt of RCUK funding.

71% of eligible institutions responded to the questionnaire. There were some interesting findings:

“Around a 1/3rd of respondents had a mandate for authors to deposit papers into the institutional repository.“

Paul noted there was some confusion around the term mandate vs policy. He commented that this cuts to the ‘heart of the argument’ on open access. Should authors be forced into disseminating their research openly? Will institutions meet resistance from researchers as a result?

“Around 1/5 of respondents had an institutional fund for authors prior to the new RCUK policy. Now around 2/3rds do“

Research funders also appear to be advocating more stick and less carrot. The vast majority of institutions did not have a fund before the RCUK policy came into effect, but have since introduced a central fund to help authors meet the conditions of their grants.

The project found the BIS fund is likely to have had a big impact on this trend. Significant percentages have spent this money on prepayment accounts for the payment of future APCs, staff resources and retrospective gold. The highest number of agreements was 16.

Paul noted that another significant finding is that the library is leading this initiative in almost every case.

There is an average of 117 RCUK-funded articles published per institution since April 2013. They found a large range in total funds spent between institutions, from £30k to £500k. Many institutions have stated that if the money runs out, they intend to meet any remaining costs themselves.

Cambridge stood out as an institution as they have published their collected data on figshare. There is hope that other institutions will follow suit.

Overwhelmingly, institutions want to be proactive and to understand the workflows involved, which is why most are handling payments themselves. Around 1/3 are making use of the JISC APC and OAK scheme and approximately 2/3 are using their own system.

The most time and effort around the workflows has been going into checking that requirements have been met. This could involve checking all articles for acknowledgments, for example.

When approving payments, most institutions will not pay a gold APC charge if the publication does not meet the RCUK terms and conditions - an incorrect Creative Commons licence, for example.

The project found a lot of frustration around RCUK's communication to its authors. The majority of institutions positively support the RCUK policy. However, many are concerned about spiralling costs, and others about the overall strategy of promoting gold over green. In advice to authors, more than ¾ of those surveyed express an institutional preference for achieving the RCUK objectives through the green compliance option rather than the gold.

------------------

Face-to-face interviews with 7 publishers

“A journey or an event”

Paul discussed some emerging themes with different publishers. During the project they spoke to both large and small publishers for a range of perspectives.

On being asked how open access has been received in their respective publishing houses, the view expressed was that a sufficiently big transformation has occurred to warrant major change. It is now widely accepted that open access is not going away, and they are adapting their models to meet this challenge.

There are major problems with many publishers in tracking funder information. For many it is information that is simply not held. The new RCUK policy has prompted a change in the way they store and manage this information. Many of the big publishers now incorporate funders and their associated requirements into their submission system. So far this is just not viable for some of the smaller publishers. All groups are looking for the development of industry standards surrounding the issue in order to meet this requirement. Integration with FundRef is one such solution they are working on.

This work is partly being informed by RIN (Research Information Network), who have produced a new report detailing the kind of monitoring that needs to take place in the transition to open access.

Many publishers are working to alleviate allegations of double dipping by changing their business models to offset these costs.

So far they have found a high administrative overhead with regard to this work.

------------------

Other representatives

Austria now closely mirrors the UK in terms of a mandated open access policy.

In Austria, funders are entering into agreements with publishers on behalf of their authors. The Austrian Science Fund (FWF), the Austrian Academic Consortium (Kooperation E-Medien Österreich), the Austrian Central Library for Physics at the University of Vienna and IOP Publishing (IOP) have announced a new pilot project that will provide advance funding for Austrian researchers to publish on a hybrid open access basis in IOP’s subscription journals and which will offset that funding against subscription and licence fees paid by the Austrian Academic Consortium for access to IOP’s journals.

Other countries want to see more government support in order to make steady progress in the transition to open access. In the words of some, they would like their own David Willetts!

There is a perception that most publishers do not want to see open access fully realised. Generally there is a preference among funders for the green route. However, if it appears that this is not meeting expectations, they are ready to make a case for gold funded OA.

In the Netherlands, they sense there is a growing open access movement building in the same way as in the UK. However, although they are committed to this, they are not prepared to put additional funds aside in the same way. There is a perception that publishers took over the Finch report and have used it to generate more revenue.

There is a feeling that green isn't working and that research evaluation needs serious review.

In Germany there is a real sense that their government is not engaged with the issue of open access, despite repeated appeals by researchers who want it. They suggest that the RCUK policy is flawed and that more money should not be put into the system. Supporting two systems is seen as unsustainable.

Towards the next Research Excellence Framework

Cross posted at the UKSG LiveBlog: http://uksglive.blogspot.co.uk/

Presenter: Steven Hill (HEFCE)
Blogger: David Walters (King’s College London)

After extensive consultation, HEFCE and the other three UK funding bodies have published details of a new policy for open access relating to future research assessments after the last REF (submitted in 2013). Steven presented on aspects of the new policy and the motivations which are driving this change.

Steven was quick to point out that the policy is still in the early stages. HEFCE are framing the next REF by looking forward to expected changes in research methods and practices. In particular, they are looking at questions on how this will be assessed. The ‘juicy details’ will be forthcoming.

Details of the announcement can be found below, but in brief there is a focus on open access full-text deposit and metadata discovery for article submissions. This will require significant engagement by authors in terms of open access if they want to submit papers for assessment.
http://www.hefce.ac.uk/news/newsarchive/2014/news86805.html
------------

Open

Steven noted that, in the changing research landscape, alongside open access there are other ‘open’ terminologies emerging like ‘open research’ and ‘open science’.

The Budapest initiative and subsequent definition of open access encapsulated the social revolution underway in how we perceive ownership of information. Their inspirational opening paragraph really sets the scene for the changes to come.

Steven quoted from the book ‘Reinventing Discovery’, where Michael Nielsen argues that we are living at the dawn of the most dramatic change in science in more than 300 years. Steven discussed the importance of moving information out of people's heads, and out of siloed laboratories, to be accessible on the network as a fundamental imperative on this road to change.
------------

Funders' response

Steven commented on funders' responses to this issue, which have served to drive and incentivise open access.

The new HEFCE policy will work alongside this by removing those perceived barriers, whilst protecting the elements of dissemination that should be retained.
------------

The Post 2014 REF

  • Deposited
  • Discoverable
  • Accessible

Steven pointed out that ‘The Post 2014 REF’ is really the correct terminology for the forthcoming assessment. Phrases like ‘REF 2020’ are misleading, as we don’t yet know when the assessment will take place. Assessments usually take place every 5-8 years.

There is a focus on open access full-text deposit and metadata discovery. This applies to journal articles and conference proceedings accepted for publication after 1 April 2016.

The requirements state that peer-reviewed manuscripts must be deposited in an institutional or subject repository on acceptance for publication. The title, author and other descriptive information for these deposits must be discoverable straight away by anyone with a search engine. The manuscripts must then be accessible for anyone to read and download once any embargo period has elapsed.

Steven highlighted the fact that deposits are not limited to institutional repositories. However, he expects that universities will require this for their researchers in order that they have better control over the assessment.

The aims are to make papers discoverable as early as possible and accessible through whatever open access route is available. They are green/gold neutral, but expect the required embargoes for publicly accessible open access to be 12 or 24 months depending on the discipline.

There is a feeling that the open access monograph landscape is not yet developed enough to make this an assessment criterion for the panel, especially in terms of business models, though the board does recognise emerging opportunities and associated risks. Consequently, the policy does not apply to long-form outputs. However, they are discussing the possibility of additional credit for authors who do make their monographs and book chapters available open access, and expect this will be a criterion in future REF assessments.

They are also discussing the availability of additional credit for reuse rights and text mining. Text mining is expected to be permitted under the new government copyright legislation outlined by the Hargreaves review.

Whilst there are exceptions for submission, they don't think these will be widely used.

Based on the results of the REF 2013 assessment, they expect that 96% of papers submitted will be able to comply with these requirements without changing their choice of publication venues.
------------

Open data

Open data could be rewarded in the next REF. However, it's a complex and diverse issue. Sometimes it's not possible to make research data openly available, for example when dealing with issues of confidentiality. Sometimes research data is very large: Steven gave the example of the Square Kilometre Array, the world's biggest radio telescope, which annually collects data equivalent to around 50,000 DVDs.

The culture surrounding open data is still being developed. It is most important that all key stakeholders play a role in supporting researchers as they adapt and react to this new imperative. For HEFCE this is a key consideration at the forefront of their thinking.
------------

Metrics

HEFCE have been performing a metrics research review. They have been considering some key questions. Primarily, what kinds of metrics for research performance are out there? What do they measure? Are they fair? What are the behavioural impacts of using metrics?

Steven commented on an example of gender bias in citations: overall, men are cited more than women.
------------

Open research assessment

Steven took us through an example of the classical research cycle. This has been described and thought of in its current form for a very long time. He commented that open access serves to take part of this cycle and make it openly available. The same can be said for open data.

However, the changing landscape makes it possible for this entire cycle to be revolutionised. Resources like figshare enable ‘micro publishing’. This allows for little chunks of data and small experiments to be instantly accessible potentially providing a much broader picture.

Steven also mentioned ‘Open notebooks’ as a scientific method. In these research cycles, the whole process becomes open at all stages.

Steven also commented that post-publication peer review is becoming more important than pre-publication peer review, as it provides impact and analysis in real time.

All these networks and linkages across institutions, resources – across large and small data and research projects – present a real challenge for the community as researchers and assessors.

New methods and standards are required in order to support these evolving research practices and to ensure fair and accurate assessment.

Disruptions in a complex ecology: the future of scholarly communications

Cross posted at the UKSG LiveBlog: http://uksglive.blogspot.co.uk/

Presenter: Michael Jubb (Research Information Network)
Blogger: David Walters (King’s College London)

Michael gave us a very interesting overview of the purposes of scholarly communications and how changes to the infrastructure are steering change, new ideas and new forms of expression.
-----------

Purposes of scholarly communication

Michael gave a useful summary of the purposes of scholarly communications, whose needs have been served within a research/publishing framework for centuries. In short, the purpose is to generate and share ideas and increase the impact of a piece of research. Michael used the apt phrase 'standing on the shoulders of giants'.
  • Register research findings, timeliness, and attributable persons
  • Review and certify findings before publication
  • Disseminate new knowledge
  • Preserve a record of findings for long term efficiency and effectiveness of research
  • Reward researchers for their work
It's not just about communication, there are a number of other things that make research dissemination important. Metrics and impact are additional factors. A number of different purposes are fulfilled by research communications systems. Other considerations:
  • Discoverable
  • Accessible
  • Assessable
  • Usable
Michael explained that research, particularly in the sciences, should be seen as an open enterprise. It needs to be communicated in a way that is intelligible and accessible, so that people can understand what has been discovered.

Research needs to be open to quality assessment, not just assertion, based on the evidence submitted. It should also be usable as this ties into impact.
-----------

Mechanisms for scholarly communication

Michael discussed some of the different mechanisms by which research is communicated, taking science as an example. Other disciplines have a wide range of different methods of communication.
  • Orally: lectures, seminars, conference presentations, teleconferences
  • Written: theses, working papers, preprints, books, journal articles, wikis, blogs, emails
  • Public vs restricted audiences 
  • Peer-reviewed or not
Michael highlighted a distinction between the mechanism and the degree of quality assurance.
-----------

Players and stakeholders and their interests

Researchers

  • Interested in career development and advancement
  • Interested in discovering the work of others in their fields.

Universities – who employ the researchers

  • Interested in building the reputation of their institution
  • Interested in raising funds through the reputation of the research published.

Funders

  • Interested in the research they fund being accessible to the research community
  • The research councils have a mission statement. They want their research to make an impact on the world at large, in order to make a positive difference in society.
  • They have an interest in the efficiency of scholarly communications – they don't want it to be too expensive.

Librarians

  • Content, access and discovery for researchers and students in their institutions. This is in support of research and teaching/learning.

Publishers

  • Reputation
  • Generate revenue
  • Impact in wider society
  • Maximise dissemination

Learned societies

  • Interested in the relationship between publishers and their research community.

Michael pointed out that these groups do not always share exactly the same interest. There are clear streams of funding that flow between these groups. For example, from the library subscription budget to publisher. Therefore it is not surprising that there are tensions in the ecosystem.
-----------

The research landscape: Funders and Do-ers

Michael discussed who does research and how it is funded. He described this landscape in terms of 'funders and do-ers'. There are many different kinds of funders: governments and charities, to name a couple.

He referred to a study by Elsevier which illustrates the global research landscape. This study showed that companies and businesses are by far the biggest funders of research, accounting for around two-thirds of activity. The next biggest funder of research is universities, followed by government.

Thinking back to the ecosystem in terms of communication, the funding groups most interested in communication are those funded by governments, charities and universities. Business does not have a big interest in this – they are keen to protect ownership of their research and subsequent in-house innovation. Most research conducted through business is closed: a very high proportion will never be seen publicly, i.e. published in journals.

There are international differences. In Japan, research funding is overwhelmingly dominated by business. In the UK, a much larger proportion is funded by the government. The UK is an outlier when compared with how research is funded globally.
-----------

Collaboration

Michael demonstrated that research is increasingly becoming a collaborative enterprise. The proportion of articles with authors from more than one country is growing.

The UK and Germany show that 50% of articles come from international collaboration. In the US, this is much lower; in China, it is only 15%.

Comparatively, in the UK the proportion of articles with just a single author is around 15%.

This shows that there are different international players with different interests, which adds to the complexity of the ecology.
-----------

Research data

Michael described the issue of Research data as of vital importance as the landscape undergoes a transformation.

Without the constraints of a journal, research can be instantly accessible across millions of data points. What is required is a coherent infrastructure to join up these points with the rhetorical argument of the research findings, and an effective means of presentation. This is a challenge with research data growing exponentially. Visualisation techniques and analytical tools are required to make use of this data.

This also throws up a whole set of issues around whether research can be replicable.
-----------

Quality assurance and peer review

Michael outlined who is responsible for quality assurance.

Who:

  • Editors and editorial board
  • Publishers’  editorial staff
  • Reviewers

Types:

Single blind – the author doesn't know the reviewer, but the reviewer knows the author. This is common in the sciences. It's argued that it is too difficult to achieve double blind due to the volume of papers produced, the identifiable writing style of authors and highly specialised subjects.
Double blind - neither knows who the other is. This is common in the humanities.

There are ongoing issues with this system. Authors complain of bias, delays, inefficiencies, data, replication and overload. It is also difficult to find people willing to do the job – reviewers often work for free – and this slows down the process. It also can be difficult to assess the quality of the research data.

Another issue is that sometimes journals make judgements on the significance of a paper, and this affects the publication schedule.

A number of experiments are under way around peer review – for example the PLOS ONE peer review system, which is based simply on the soundness of the science.

Other systems are exploring the idea of cascaded peer review results, within a publishers’ portfolio and between different publishers.

There are moves toward completely open peer review, with ongoing interaction between the author and reviewers.

Michael commented that post-publication peer review is becoming more important than pre-publication peer review. Very often publisher platforms allow for comments and reviews alongside the published article.
-----------

Open Access

Michael commented that, in reality, open access is more complicated than green and gold. These terminologies can lead to confusion in terms of how open a paper really is, particularly in terms of re-use.
  • Fully OA - paid/unpaid
  • Hybrid
  • Delayed free access
  • Repository preprint
  • Repository accepted
This is a complicated landscape. Repository take-up remains low; it's estimated that only around 9% of articles are available in a repository.

-----------

Closing

Michael finished with some key questions surrounding the future of scholarly communication

  • How do you sustain the funding flows of the current ecology while allowing for innovation and sustainability?
  • Do journals have a future?

Growing and Measuring Impact: researchers' needs, and experiences of meeting them with Kudos

These are my notes on the UKSG Forum 2014 presentation by Will Russell, Manager of New Technologies and Incubation, Royal Society of Chemistry.

Chart of the download activity for “Digital Curiosities” from UCL “Discovery” – Melissa Terras (see the blog post linked below)
The RSC and Taylor & Francis are supporting a pilot programme with Kudos, a start-up that provides tools and services to help researchers maximise the usage and impact of their published research articles.

The need for a service like Kudos arises from developments in global academic and research policies that are increasingly calling for evolving interpretations and measurements of 'impact' - used to assess researcher excellence. Publishers already invest editorial, marketing and technological expertise in making research articles discoverable. Kudos will further support this by helping leverage the expertise and social networks of researchers and research communities themselves in order to drive visibility and usage.

http://www.rsc.org/AboutUs/News/PressReleases/2013/Kudos-RSC-launch.asp

RSC use Altmetric data and display it at article level through 'donuts'. The 'donut' is Altmetric's chosen visual, infographic style for displaying data from all the major social media platforms. It allows visibility of this data across different dimensions of scale and time. Among other things, this includes blogs, tweets, shares and pins!
The altmetric 'donut'
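The per-source breakdown behind a donut can be sketched in a few lines. In this hedged illustration, the field names mirror the shape of records returned by Altmetric's public API (e.g. `cited_by_tweeters_count`), but the sample counts are invented:

```python
# Sketch: summarising altmetric-style mention counts into donut segments.
# Field names follow the shape of Altmetric's public API responses;
# the sample numbers below are invented for illustration only.

SOURCE_LABELS = {
    "cited_by_tweeters_count": "Twitter",
    "cited_by_feeds_count": "Blogs",
    "cited_by_fbwalls_count": "Facebook",
    "cited_by_rdts_count": "Reddit",
}

def donut_segments(record):
    """Return (label, count) pairs for non-zero sources, largest first."""
    segments = [
        (label, record.get(field, 0))
        for field, label in SOURCE_LABELS.items()
    ]
    return sorted(
        [(label, n) for label, n in segments if n > 0],
        key=lambda pair: pair[1],
        reverse=True,
    )

sample = {"cited_by_tweeters_count": 42, "cited_by_feeds_count": 3,
          "cited_by_fbwalls_count": 0}
print(donut_segments(sample))  # → [('Twitter', 42), ('Blogs', 3)]
```

Each resulting pair would map to one coloured ring segment, sized by its count – which is essentially what the donut communicates at a glance.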

Alternative metrics are of great interest and value to the authors, but less so to RSC. The alternative measures of articles don’t necessarily have an impact on RSC as a publishing platform as they are not a measure of scientific correctness or quality. However, these metrics do enable a unique insight for authors into exactly where in the world their work is being discussed and on what platform. This information is instantly, visually communicable to the author.

More and more, social media platforms are becoming a vital link in the discovery chain. Because this largely requires engagement by authors, it is often underutilised. In some disciplines this activity is fostering a two-tier digital divide. See this blog post by Melissa Terras, Director of the UCL Centre for Digital Humanities and Professor of Digital Humanities in UCL's Department of Information Studies, who shares her experiences of using Twitter to increase the impact of her papers.

http://journalofdigitalhumanities.org/1-3/the-impact-of-social-media-on-the-dissemination-of-research-by-melissa-terras/

It is becoming increasingly important for authors to actively disseminate their research using their social network of peers and more work needs to be done, particularly with regard to appropriate training in building a digital online profile and making best use of discipline specific dissemination platforms.

Kudos is a web-based service that helps researchers, their institutions and funders to maximise the visibility and impact of published articles across the multitude of social channels. Kudos provides a platform for assembling and creating information to drive discovery, and for measuring and monitoring the effect of these activities.

This may help researchers who want assistance with increasing usage of, and citations to, their publications by helping to push papers across the host of social platforms that forms part of the author's social network. This could help increase the impact of the research.

This is particularly true when you consider that a paper may be of most relevance (in the first instance) to an author's personal network, which may take the form of a discipline-specific academic community. An author's professional network (established and maintained over many years) can be harnessed by pushing new publications through these channels.

Presently, only corresponding authors receive an invitation to sign up to Kudos through RSC. The tool helps by enriching an article's metadata with additional terms that are more suitable for dissemination across platforms like Twitter – for example, some article titles would never fit within Twitter's character limit. Shorter titles for dissemination purposes help with discoverability and readability on busy platforms.
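As a rough sketch of why shortened titles matter, the hypothetical helper below truncates a title so it fits in a tweet alongside a link. It assumes Twitter's then-current 140-character limit and the fixed 23-character length that t.co-wrapped links counted for at the time; the function name and sample title are invented:

```python
# Sketch: shortening an article title so that "title + link" fits in a tweet.
# Assumes a 140-character tweet limit and a fixed-length shortened link
# (t.co links counted as 23 characters at the time).

TWEET_LIMIT = 140
LINK_LENGTH = 23  # every link is wrapped to a fixed-length t.co URL
ELLIPSIS = "..."

def tweetable_title(title):
    """Truncate a title on a word boundary so the tweet fits with a link."""
    budget = TWEET_LIMIT - LINK_LENGTH - 1  # one space before the link
    if len(title) <= budget:
        return title
    truncated = title[: budget - len(ELLIPSIS)]
    # Avoid cutting mid-word where possible.
    if " " in truncated:
        truncated = truncated.rsplit(" ", 1)[0]
    return truncated + ELLIPSIS

long_title = ("Synthesis and characterisation of novel ruthenium(II) "
              "polypyridyl complexes for application in dye-sensitised "
              "solar cells under ambient conditions")
tweet = tweetable_title(long_title) + " " + "x" * LINK_LENGTH
print(len(tweet))  # a number no greater than 140
```

In practice a curated short title written by the author would read better than automatic truncation, which is presumably why Kudos asks authors to supply one.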

RSC were quick to point out that, in their opinion, this is not gaming. The process increases the visibility of their papers across the web and within authors' professional networks. As a tool, Kudos gives authors the chance to compete quickly and effectively against others who have been utilising these channels to their advantage for some years.

So far RSC have found that only a small group of authors are very engaged with Kudos. This may be indicative of a lack of awareness or training around these issues. It is possibly also a reflection of the varying nature of how communities of practice are fostered and communicate ideas with each other.

Interestingly, RSC recently experimented with one author on the idea of bringing old papers back to life using these social media channels. By sending them out into the global sphere, the author immediately saw this reflected in his altmetrics scores. He hopes this shows that his papers are not dead and that their use can be extended well beyond publication. His view is that article-level metrics of this kind should become a normalised measure and part of the portfolio of impact-assessment tools.

New life for papers?