Will F1000Research Win Over the Traditional Scholarly Publishing Model?

Kazuhiro Hayashi breaks down F1000’s role in scholarly publishing and how the landscape is going to change.

In order to understand the true significance of the University of Tsukuba and F1000Research partnering on the first open research publishing gateway (to publish in either English or Japanese), it is necessary to understand the history of scholarly publishing and get a clear view of the various challenges facing the industry today.

We asked Japan’s leading expert in academic information distribution technologies, Kazuhiro Hayashi, who is engaged in the development and implementation of a future-focused open science policy for Japan’s overall science and technology policy, to explain F1000’s role and his future vision of scholarly publishing.


 

17th century: from letters to journals

To explain what F1000Research is and what significance the platform has in the history of scholarly publication, I need to give you a brief introduction to the history of the academic journal.

The history of the modern academic journal dates back to the 17th century. One of the world’s oldest journals was Philosophical Transactions, which was first published in England in 1665. During this period, similar journals were created in France and Italy. Before that, books provided the only means for publishing one’s research, but to publish a book took a long time. The researchers of the time, in order to communicate their discoveries to others and leave a record claiming “This was my discovery!” before anyone else could take credit, wrote letters to each other. But, since a letter could be sent to only one person at a time, they had to send out many letters. The journal was born when someone had the idea of collecting these letters in one place and publishing them together. Even today, journal articles announcing breaking research-related news are referred to as “letters” or “communications.”

The next problem was monitoring the quality of the discoveries to be published in a journal. For that, someone had the idea of having other researchers in the same field check them before publication—and the peer review mechanism was born. This was the beginning of the academic journal culture of publishing studies only if the procedures and scientific validity meet certain standards and the information from the study is of high academic value. Given that information that is valuable and rare is a good commodity on which to base a business, before long, commercial publishers entered the game. Gradually, they assumed the role of the academic community and turned scholarly publishing into a business.

In other words, the history of scholarly publishing is the history of how technology has supported and optimized researchers’ interest in delivering their discoveries to more people as quickly and accurately as possible by increasing the value of published research and commercializing it. However, from the 17th through the 20th centuries, further innovation in the industry was limited to technological developments in papermaking, printing, and mailing.

The next real paradigm shift took place around 330 years after the inception of the journal, when the dissemination of information at universities via the internet began in the 1990s. We are now at the dawn of a transitional period for information distribution in which paper- and mail-based infrastructure is being replaced by completely new online platforms. Let’s take a closer look at what happened during the last 30 years to bring us to where we are today.

1990s-2000s: e-journals, preprints, and the open access era

It is difficult to pinpoint the start of the electronic journal, or e-journal. However, the prototype of today’s e-journals arguably appeared in 1995, when the Journal of Biological Chemistry was digitized and began providing full-text articles electronically in HTML. In the 10 years that followed, enormous progress was made in digitizing journals that had previously been published on paper. As digitization progressed, the physical constraints of paper and mail services were eliminated, and the internet created such economies of scale that everyone could access an enormous amount of research. Commercial publishers took notice and accelerated their sales of comprehensive package contracts covering all the titles for which they owned the copyright, a model referred to as the “Big Deal.” Today, researchers can access more than 3,000 journals at any time from any place.

However, the Big Deal created a new problem: subscription costs soared under the new package deals. Market pricing mechanisms do not work for research papers, because each paper is a scarce commodity with no substitute. Since commercial publishers are for-profit businesses, they naturally raise their prices to increase profitability. As a result, along with adding more e-journals to a package, they raised the prices university libraries paid year after year, until libraries could no longer afford them and began to terminate their contracts. Cases in which a library gave up part of a subscription, or canceled its contract entirely, increased even at well-known universities in Europe and the U.S.

This issue deepened a longstanding frustration in academia: even before the advent of the internet and electronic publishing, publishers’ restrictive copyright policies and pricing had robbed researchers of access to one another’s work and impeded their research. Then, in 1991, a new medium for disseminating information, the preprint server, appeared in the field of physics. The role of the preprint server is, simply put, to get a study out as quickly as possible by publishing it on the web while it is still a draft (a preprint). Anyone can share their findings at any time without incurring publishing costs, and solicit feedback from colleagues, while preserving their claim to priority for an idea or finding. The physics community that pioneered this approach simultaneously submitted the same work to peer-reviewed journals. Then, in 1995, a UK-based scientist, Stevan Harnad, suggested that preprints be used to boycott the publishing companies. In his famous “Subversive Proposal,” he called on researchers to rebel against commercial publishers, arguing that if all researchers posted their papers on their own servers, it could drive the publishers out of business and end their dominance.

At the same time, the view that scholarly work had become over-commercialized developed into an opposition movement. Critics felt it was wrong that research papers could be read only by those who paid a subscription fee to a private publishing company, given that research produced with the support of public funding is public property and should be shared with the public. Researchers convened an international conference in Budapest, which resulted in the 2002 announcement of the Budapest Open Access Initiative (BOAI). The so-called Budapest Statement declared that the research literature should be freely shared and available for anyone to use at no charge.

In the early 2000s, commercial publishers and academic societies opposed open access policies, believing they were neither commercially viable nor able to assure the quality of research. However, following the launch of BioMed Central in 2000, PLOS launched the open access mega-journal PLOS ONE in 2006, which earned an initial impact factor of around 4 and was also a commercial success. Open access publishers, including PLOS, had taken on the challenge of changing the business model to one in which an article processing charge (APC) is collected from the author. At the time, this success was unprecedented: it showed that open access journals could be commercially viable and could ensure the quality of the studies they published well enough to obtain high impact factors.

Seeing this success, other commercial publishers joined in and started launching open access journals in quick succession. They had already raised subscription fees to the point that libraries could no longer afford them. Moreover, while a traditional paper journal could publish only a limited number of articles, an open access journal faced no such limit. Collecting article processing charges (APCs) from that many more authors would ensure profitability; in other words, more open access journals meant more revenue for the publisher. Commercial publishers flocked to open access publishing, and by the early 2010s the OA/APC business model was well established. Then, in 2018, a consortium of 11 European national research funding agencies launched the Plan S initiative, which is expected to accelerate the conversion of journals to open access globally by mandating that all research they fund be promptly published on an open access platform beginning January 1, 2020 (since revised to 2021).

 

The 21st century: attempts to resolve the issues related to peer review

Although considerable progress was made in journal digitization and the development of open access journals in the 20 years between 1990 and 2010, these changes only replaced the paper-based publishing system from the 17th century with a digital one. New issues centered around peer review emerged, and addressing these would require reforming the publishing process.

The existing peer review system is considered an indispensable mechanism for ensuring the quality of studies before publication. However, there are four broad issues with the existing system that need to be addressed: (1) getting an article published takes too long; (2) there is potential for publishing bias in favor of articles that will “sell well”; (3) the peer review process takes place behind closed doors, so it lacks transparency; and (4) the system makes it difficult to recognize the contributions of volunteers who spend their time reviewing articles for the publisher.

In a sense, preprints have resolved the first and second issues. With preprints, completed studies can be promptly shared with others in the field. These include studies that, in the existing peer review system, might be rejected. However, now researchers are faced with the problem of being overwhelmed by the vast number of papers of varying quality, and not knowing which can be trusted.

Subsequently, Faculty of 1000 (F1000) emerged as a publishing platform aiming to resolve all four issues. The underlying idea was to take the conventional peer review mechanism one step further by proposing new methods for controlling the quality of published research. First, under a service called F1000 Experts, experts evaluated the quality of studies that had already been published and offered information that could add value to them. Then, as a further development, the F1000Research platform was launched, allowing preprints (articles in draft form) to be published promptly and then subjected to peer review. This reduced the time it took to publish a study, prevented publishing bias, and opened up the review itself: the names of the reviewers are displayed and their comments can be cited, which increases the transparency of the review process. In other words, F1000Research is attempting to change the way scholarly literature has been published for the 350 years since the 17th century.

The future: a world of researcher-YouTubers

Although the F1000Research initiative is both challenging and pioneering, we are only at the very beginning of a transitional period that will tell whether scholarly publishing can break free from its 17th-century traditions. In my opinion, we have a long way to go before this solution breaks away from the traditional journal-based system. My point may be easier to understand if we look at the television industry. Publishing companies and journals are in the same kind of position as television production companies. With the spread of the internet, internet TV and video streaming services have been introduced, but television is still, for the most part, a form of mass media. Internet TV has kept the mass media culture and simply replaced the underlying technology with the internet. On the other hand, YouTube’s momentous success as a user-driven, video-based social network built by digital natives has significantly changed ideas about video production, distribution, and entertainment. You could say that F1000 is still in a position like internet TV. No YouTube-like platform yet exists in scholarly publishing for digital natives to publish their research, and the researcher equivalent of the YouTuber has not yet been born. I therefore see F1000Research as a promising platform to watch: the question is how far it can involve its users in its operation and become an innovation in the same league as YouTube.

From a broader perspective, despite technological advances such as AI, big data, and blockchain, both the publishing system and the old laws governing information distribution, such as copyright, need to be updated. If our approach to technology changes but the social systems, including laws, do not change accordingly, true digital transformation will not take place. In other words, a lot of work remains to be done. In the future I envision, even the framework of academic publishing will have been done away with. It will be a world where researchers are paid, for example, for research data shared in real time with the people who need it, where researchers are appropriately evaluated, and where their reputations are based on the distribution of their research data. The belief that this will happen in around 50 years keeps me active in this field. Even if I am not around physically, I will be watching from the other side [laughs].

In any case, I think that F1000 is a pioneering company that will strike a good balance between the ideal and the practical in the creation of a new publishing culture. I also think it would be wonderful if, as a result of the University of Tsukuba’s endeavor with F1000, new ways to share research results and make them publicly available were created, and if those developments gave life to new types of researchers and carved out a new world for research.

 


KAZUHIRO HAYASHI

Hayashi is a senior researcher at the Science and Technology Foresight Center of the National Institute of Science and Technology Policy (NISTEP) for the Ministry of Education, Culture, Sports, Science and Technology. He was accepted into the doctoral program in chemistry at the University of Tokyo’s Graduate School of Science. At the Chemical Society of Japan (CSJ), starting in 1995, he was engaged in the digitization of its English-language journals and in the CSJ’s responses to the roll-out of DOIs in 2002 and of open access in 2005. Drawing on this knowledge and experience, he worked through organizations such as the Science Council of Japan and the International Scholarly Communication Initiative (SPARC Japan) on projects to make information originating in Japan more appealing for dissemination abroad. He is interested in the future of scholarly information distribution and in how the next generation of researchers should communicate. Since 2012 he has been engaged in policy science research at NISTEP. Currently, his international work toward the formulation and implementation of policies concerning open science has included acting as an expert committee member on open science for UNESCO, the OECD, and the G7 Science and Technology Ministers’ Meeting. In Japan, he has been involved in a broad range of activities, for example, acting as a specially appointed committee member for the Science Council of Japan and as the deputy chief examiner for the Cabinet Office’s Research Data Infrastructure Development and International Expansion Working Group. He is a founding member of the Research Data Utilization Forum and the Japan Open Science Summit.


This article is a part of ScienceTalks Magazine issue Welcome to the New Era of Open Publication.

Related posts

Breaking the Barriers of Modern Academic Publishing
Understanding a movement started by humanities and social sciences researchers to bring changes in the publishing industry

Putting Control Back into the Hands of Researchers
Interview with Kyosuke Nagata, President of the University of Tsukuba

Eliminating the Language Barrier in Research
How the University of Tsukuba is changing the way we share groundbreaking knowledge