Technical and social issues influencing the adoption of preprints in the life sciences
Authors:
Naomi C. Penfold, Jessica K. Polka
Authors' place of work:
ASAPbio, San Francisco, California, United States of America
Published in the journal:
Technical and social issues influencing the adoption of preprints in the life sciences. PLoS Genet 16(4): e1008565. doi:10.1371/journal.pgen.1008565
Category:
Review
doi:
https://doi.org/10.1371/journal.pgen.1008565
Summary
Preprints are gaining visibility in many fields. Thanks to the exponential growth in submissions to bioRxiv, an online server for preprints in biology, versions of manuscripts prior to the completion of journal-organized peer review are poised to become a standard component of the publishing experience in the life sciences. Here, we provide an overview of current challenges facing preprints, both technical and social, and a vision for their future development.
Keywords:
Biology and life sciences – Careers – Peer review – Science policy – Research funding – Scientists – Scientific publishing – Open access publishing
Unbundling the functions of publication
Science progresses only at the rate at which we can share information with one another. But as any author of a journal article can attest, formal mechanisms of scholarly communication do not always work efficiently and can be subject to biases [1–3]. Peer review takes time: not merely for the reviewer to compile a thorough assessment but also for the editor to find reviewers. In the swiftest case, a manuscript is accepted at the first journal, and the process to eventual publication may take approximately four months [4,5]. However, given that many researchers continue to be evaluated based on the reputation of the journals in which their work is published, authors are incentivised to “aim high” when they select a journal, and it can take several rounds of review (at a single or multiple journals) before the work is approved for publication. It is commonplace for a manuscript to have been submitted to at least two journals on its way to publication, and as a result, the overall peer review process can take years [6].
The sooner a piece of work can be read, evaluated, and built upon, the faster science moves. And when a greater diversity of thought is included in the process of science, higher-quality final products emerge. Yet, although our system of publication has superficially transitioned from physical print magazines to online websites, the mechanisms and processes of scientific communication are not much faster or more inclusive than they were in the 19th century.
Perhaps the underlying cause for this stasis is the fact that our system of evaluating scientific work—whether for deciding what to read or to whom to award grants and jobs—relies heavily on the reputation of journal titles, and in turn, journal article output is factored into university ranking calculations [7]. Experimenting with new forms of sharing science that are incompatible with publication in traditional venues therefore carries career risks. In addition, many open-science practices (posting lab notebooks, sharing data sets, or conducting replication studies) require significant extra effort for researchers, which is currently not universally prioritised. Therefore, researchers need efficient mechanisms of sharing research that align with current publishing practices while supporting a gradual evolution toward more transparent and efficient communication practices. One small step is to simply share manuscripts publicly at the time they are ready to send to a journal, i.e., by posting a preprint. Preprints are versions of manuscripts made public (often on a preprint server) before the conclusion of a formal (often journal-organized) editorial process. Using preprints to separate in-depth review from the initial act of sharing can increase efficiency while requiring minimal extra work for authors and presenting science in a format that is easily recognized by readers.
Here, we distill what we’ve learned from our work listening to concerns about, and investigating issues surrounding, preprints. We summarize the current state of support for preprinting in the life sciences, discuss extant needs and challenges, and put forth ideas for future developments.
Why now?
Posting preprints is standard practice in many fields in physics, mathematics, computer science, economics, and other disciplines. Preprints are only now becoming widespread in the life sciences (Table 1), despite a long history of sincere efforts to establish servers in biology by both public and private sectors dating back to the 1960s [6]. Why have they taken off in biology since 2015? We suspect that at least four factors have contributed.
First, in today’s digital world, the idea of composing a manuscript in real-time using collaborative editing tools only to not share it with the community seems increasingly anachronistic.
Second, bioRxiv was positioned effectively within the existing publishing paradigm from the start. Founded by veterans of the publishing industry, John Inglis and Richard Sever, bioRxiv quickly established partnerships with a number of journals. These journals not only agreed to consider manuscripts posted as preprints but also established a direct submission pipeline enabling authors to submit to both with one click. Furthermore, perhaps driven by a competitive publishing environment, editors began to invite submission of manuscripts from preprint servers (discussed below). Preprints now represent an opportunity for publishers, whereas previous efforts to share science in this way may have been seen as a commercial threat. Direct submission arrangements and anecdotes about manuscript recruitment offered researchers confidence that posting preprints would not endanger their chances of journal publication. In addition, the ownership of bioRxiv by Cold Spring Harbor Laboratory, a credible, nonprofit research institute, likely contributed to its resonance with the community of authors and readers.
Third, many funders have since provided active support and recognition for preprints. Although the NIH was involved in early preprint efforts, through the Information Exchange Groups of the 1960s and Harold Varmus’s 1999 E-Biomed proposal [6,37], only recently have many funders voiced support for preprints as a mechanism for applicants and grantees to demonstrate productivity. We discuss these policies in detail below.
Fourth, Twitter created a community that provided visibility to preprints and support to their authors [38]. All of the benefits of preprinting (including discussion, collaboration, visibility, and earlier disclosure) rely on active acknowledgment of preprints by the authors’ community. At the early stages of any movement, adopters will be relatively few and far between, limiting their ability to support one another. Twitter has allowed preprint enthusiasts to connect with one another across institutional boundaries, meaning that even a small number of early adopters can reap the benefits of increased exposure and feedback for their work by sharing preprints with one another.
Preprints in harmony with journals
In 1966, a cabal of journal editors “outlawed” Information Exchange Groups (the NIH’s photocopy and mail-based preprint exchange platform), fearing that preprints would damage their business model [6]. A representative of the American Association of Immunologists wrote that “Since the preprints are complete publications, there is a real danger that they will reduce the usefulness of existing journals in the field of Immunology and may ultimately supersede them” [39]. Indeed, reports that papers change little between their preprint version and the final published version have caused some to declare that preprints can be the end of the story [40]. Despite the irony that the article reporting this similarity added a section on bioRxiv before its publication in a journal, the more serious issue is that textual analysis may not accurately capture significant changes in meaning. And there is value in evaluation even if the manuscript stays exactly the same: peer review can provide validation as well as improvements.
Perhaps for these reasons, authors continue to use journals even in fields in which preprinting has long been common practice. For example, in physics, 73% of papers on the arXiv can be matched to an article that appears in a journal indexed by Web of Science [41]. Although bioRxiv is younger, the number is similar (67%, [42]), suggesting that neither archive is massively disrupting the journal business.
In fact, preprints are very much complementary to journals, and they offer several tangible benefits for editors and publishers. First, although there is no evidence that the relationship is causal, papers that have been preprinted garner more attention over time [43,44,45]. Preprints allow authors to receive feedback from a broader range of scientists than could be engaged in a typical peer review process. Although a small amount of this feedback appears in the commenting section [46], the majority is communicated elsewhere. For example, a survey of bioRxiv users found that over 40% of authors get feedback via social media, and private feedback from emails and other correspondence with colleagues is nearly as common [32]. In cases in which community feedback on a preprint is incorporated before or during revisions in a journal peer-review process (as we have done for this paper), the version of the paper that is ultimately accepted by the journal will have undergone more scrutiny, likely leading to a higher quality final product. Although the attention each individual paper garners may decline as more and more manuscripts are preprinted (note, for example, that median downloads per bioRxiv preprint in the first month peaked in 2016 [42]), we expect this form of feedback to continue, especially for highly interesting, time-critical, or controversial work that is in the greatest need of additional scrutiny (e.g., see Outbreak Science [47]).
Furthermore, preprints offer an efficient marketplace for papers [48]. Although many editors travel to conferences to invite submission of future manuscripts based on interesting presentations, preprint servers make the manuscripts themselves open to review by anyone in the world. This can enable editors to curate from a wider pool: all preprinted papers, rather than just those actively submitted to their journal. Therefore, it is no surprise that the practice of inviting journal submissions from preprint servers seems to be widespread [49]. PLOS Genetics has pioneered the formalization of this process with preprint editors [50], and Proceedings of the Royal Society B has adopted the practice as well [51]. Unfortunately, many such invitations may be moot because it is common practice for authors to post the preprint version concurrently with submission to a journal, a process that is facilitated by integrations in both journal and bioRxiv submission systems [27,52]. In order to allow this marketplace of submission invitations to function efficiently, authors can post their preprint a few weeks before journal submission and allow their work to recruit feedback, attention, and editorial invitations. Doing so could help save both authors’ and editors’ time along the way.
Finally, preprints relieve pressure on journals. Authors generally would like their papers to be published as soon as possible, leading some journals to promise shorter peer review turnaround times, perhaps at the expense of allowing reviewers to be as thorough as they would like to be [53]. If authors can instead share a preprint immediately, they are likely to feel more comfortable waiting a bit longer for high-quality, journal-organized peer review.
Journal policies explicitly permitting or even encouraging preprinting have removed much lingering fear of rejection due to prior publication conflicts. Even some long-standing holdouts, notably Cell Press, JACS, and the American Association for Cancer Research [54], have updated their policies to be friendlier to preprints. A full list of basic journal policies on preprint archiving can be found at SHERPA/RoMEO [55], more informal lists can be found at Wikipedia [56], and detailed information on preprint version, licensing, and media coverage policies can be found in Transpose [57].
Institutional and funder support
Preprints allow researchers to demonstrate their most recent work to prospective and current funders. It is becoming less acceptable to cite work that is “in submission” or “under review” in grant applications: when a manuscript is prepared, reviewers wish to see it and may request that the applicant cite a preprinted version [58]. Practically, preprints allow reviewers to judge applicants for funding or promotion by the rigor of their latest science.
In comparison to journal policies, university policies for the assessment of applications for hiring, promotion, and tenure seem slower to change [59], but there have been bright spots for preprints. For example, in late 2016, NYU Langone Medical Center added language to their promotion and tenure guides to include preprints as a potential research output, and in early 2018, UC Davis added a “preprints” category to their online faculty evaluation database [60]. UT Austin, The Rockefeller University, and UC Santa Cruz have all added language inviting job applicants for faculty positions to submit preprints as well [60]. Furthermore, preprints may hold value beyond what is codified in formal policies. A survey of hiring committee members conducted by “Future PI Slack” suggests that 10 of the 15 members surveyed find preprints useful in evaluating faculty candidates [61].
Perhaps the most proactive support for preprints has come from funders, who seem poised to actively encourage the use of preprints in the life sciences. In May of 2016, the Simons Foundation Autism Research Initiative (SFARI) announced it would change its grant award letter to “strongly encourage” investigators to post preprints and that such papers would be taken into consideration in funding decisions [62]. On September 1 of the same year, these concepts became integrated into the overall Simons Foundation policy, and other funders followed suit, including The Leona M. and Harry B. Helmsley Charitable Trust, EMBO Long-Term fellowships and Young Investigator program, Human Frontiers Science Program, MRC, Wellcome Trust, HHMI, Cancer Research UK, BBSRC, UKRI Future Leaders Fellowship program, CNRS, and the European Research Council [63].
One influential funder policy has been NIH’s guide notice NOT-OD-17-050, which clarifies the NIH’s position on preprints and other interim research products: “The NIH encourages investigators to use interim research products, such as preprints, to speed the dissemination and enhance the rigor of their work … Interim research products can be cited anywhere other research products are cited” [82]. A notable exception, however, is in the use of preprints in post-submission materials [64], which are intended to accommodate events outside the control of the investigators.
Some private funders have gone beyond encouraging preprints to requiring them. Barring privacy concerns, the Chan Zuckerberg Initiative states a commitment to posting preprints prior to peer review [65]. As part of Wellcome’s updated open access policy, researchers working in fields of public health relevance will be required to post preprints at the time of journal submission from 2020 [16].
As with all policies, their existence does not ensure they will be followed. Funders must also develop mechanisms to monitor grantee reaction and compliance. The emergence of technological infrastructure (e.g., links between preprints and published papers, metadata about funding sources, and submission and posting dates), as well as continued dialogue between researchers and funders, is key to enabling these policies.
Some have argued that preprints should be used by funders to achieve open access to the literature in lieu of mandates for open access journal versions, a proposal known as Plan U [66], but we believe this suggestion is not viable. Because peer review changes papers (often for the better), such a plan would create a two-tier system, with some readers having privileged, paywalled access to the more definitive journal versions of papers and the remainder left with access to other versions that might be outdated. Plan U could become an acceptable substitute for universal open access only if all journals and preprint servers permitted archiving postprints (i.e., versions of manuscripts accepted for journal publication) and/or all submitted versions of manuscripts. Currently, bioRxiv does not allow a manuscript to be posted after it has been accepted by a journal, and many journals prohibit the posting of preprint versions that incorporate comments from the peer review process [67].
Technical issues
At present, preprint servers lack the technological infrastructure that could help them to realize their full potential. Addressing such challenges could make a large impact on how preprints are used and discovered.
For example, readers who have previously read a preprint often wish to quickly find out how it has changed upon the posting or publication of a subsequent version. Currently, neither preprint servers nor journals systematically present a summary of the changes made. Some users already make version notes when posting a revised manuscript to bioRxiv; making this more standard practice might involve enabling authors to submit a short piece of text to journals as well, similar to a conflict of interest disclosure or author CRediT declaration. Once review at the journal is complete, it would be natural for journals to provide a link back to the preprint version, which would present a more complete picture of how a manuscript evolved over time. Some journals already provide this backwards link—including Biophysical Journal, Plant Direct, and PLOS titles [68–70]. Formal inclusion of preprint information in the XML representing the manuscript’s version history would help: this recommendation for the JATS schema is currently pending decision [71]. Alternatively, researchers may wish to include a link to their own preprint in the final version (as we do here), where the journal policy permits this. Preprints could also be better supported by reference managers with features that would allow users to link preprints to later versions (whether revised preprints or a final journal version) and receive updates when subsequent versions are available online.
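Where preprint-to-journal relationships are deposited with Crossref, they can already be retrieved programmatically, which is the kind of building block a reference manager could use to alert readers to later versions. The sketch below is a minimal illustration rather than a description of any existing tool: it assumes the public Crossref REST API at api.crossref.org and the "is-preprint-of" relation type, and the DOI shown is hypothetical.

```python
import requests

# Hypothetical preprint DOI, used only to illustrate the request shape.
PREPRINT_DOI = "10.1101/2019.12.12.123456"

# Crossref exposes deposited metadata for a DOI at /works/{doi}.
resp = requests.get(f"https://api.crossref.org/works/{PREPRINT_DOI}", timeout=30)
if resp.status_code == 404:
    print("DOI not found in Crossref.")
else:
    resp.raise_for_status()
    record = resp.json()["message"]
    # Preprint-to-journal links, where deposited, appear under "relation".
    links = record.get("relation", {}).get("is-preprint-of", [])
    if not links:
        print("No journal version recorded in the deposited metadata.")
    for link in links:
        print("Journal version:", link.get("id"), f"({link.get('id-type')})")
```

A reference manager could poll such records periodically and notify users when a journal version appears.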
Change is needed in search tools, too. For example, preprints could be linked from PubMed and PubMed Central. (Note that this is effectively being done for papers in F1000 Research and associated platforms such as Wellcome Open Research. Once these papers pass peer review, they appear on PubMed Central along with their date-stamped first version.) This helps to establish a record of what work was done when, irrespective of delays imposed by the peer review process, which is key to determining priority of discovery. Europe PMC already indexes preprints and has implemented links between the preprint and published version of the same piece of work, though improved metadata could facilitate further search and tool development [11,72].
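For readers and tool builders, these indexed preprints are already searchable through Europe PMC's public REST interface. The following is a minimal sketch, assuming the documented search endpoint and the SRC:PPR source code that Europe PMC uses for preprints; the query string and response field names are illustrative.

```python
import requests

# Europe PMC REST search endpoint (as documented at europepmc.org).
URL = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"

# "SRC:PPR" restricts results to the preprint portion of the index.
params = {
    "query": 'SRC:PPR AND TITLE:"gene expression"',
    "format": "json",
    "pageSize": 5,
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for hit in resp.json().get("resultList", {}).get("result", []):
    # Each result carries basic metadata such as a DOI and title.
    print(hit.get("doi"), "-", hit.get("title"))
```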
Beyond the basic metadata about a preprint, open access to the data detailing interactions with each preprint would enable innovation around how the latest science is discussed. For a recent effort to understand Twitter interactions with and downloads of preprints posted on bioRxiv, content metadata was derived by scraping the bioRxiv website [42]. In the absence of an official bioRxiv application programming interface (API), these authors and others developed their own tools (including an API, a command-line tool, and a Python wrapper) to source and interact with bioRxiv content data. Since the publication of that analysis, bioRxiv has released its own API [73].
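As an illustration of what this access enables, retrieving the posting history of a single preprint might look like the sketch below. It assumes the /details/{server}/{doi} endpoint shape and the field names described in the bioRxiv API documentation [73]; the DOI is hypothetical.

```python
import requests

# Hypothetical bioRxiv DOI, used only to illustrate the request shape.
DOI = "10.1101/2019.12.12.123456"

resp = requests.get(f"https://api.biorxiv.org/details/biorxiv/{DOI}", timeout=30)
resp.raise_for_status()

# The "collection" list is expected to hold one entry per posted version,
# including the posting date and, where applicable, the journal DOI.
for version in resp.json().get("collection", []):
    print(
        f"v{version.get('version')} posted {version.get('date')}; "
        f"published as: {version.get('published', 'NA')}"
    )
```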
Addressing the technical issues detailed above may help more people find and interact with preprints. As we will discuss in the next section, the low discoverability and perceived legitimacy of preprints are at the root of several more complex social problems.
Social issues
Today, preprinting is treated as standard practice—or at least supported to a considerable degree—in some life science communities, such as neuroscience, bioinformatics, evolutionary biology, and ecology ([42]; see also subject-specific initiatives like “Peer Community in” [74] and servers hosted at OSF Preprints [75]). Other subject areas have less experience and thus may have lower awareness of the actual benefits and issues. In addition to new servers [28,36,76], several new research categories have been added to bioRxiv in recent years—clinical trials, epidemiology, paleontology, pathology, and pharmacology and toxicology (note their absence in older literature [42,46] and that both clinical trials and epidemiology are now served by medRxiv). This freshness demands and enables considered discussion of important issues so that the most beneficial practices surrounding preprinting can be cemented as cultural norms. A recent consultation highlighted that researchers were often unable to cite case studies of the benefits of preprints [38], and so continued productive adoption may require increasing the number and visibility of shared real-life experiences with preprints, such as those collected by We Support Preprints [77].
Licensing
Although open access to scholarly literature has been discussed for decades, its original meaning has been diluted. The Budapest Open Access Initiative defines it as “free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself” [78]. Today, the majority of articles on PubMed Central, although “free” to read, are not actually open for reuse. Articles not in the PMC open access (OA) subset cannot be downloaded in bulk, limiting their use for text and data mining [79]. Even if that bulk download were available, their licenses do not permit reuse.
Because authors are directly in control of licenses on preprints, they have an opportunity to create a more open corpus of literature. However, most authors on bioRxiv are choosing restrictive licenses [80] amid widespread confusion about what they mean and a misconception that journals prohibit the use of certain licenses for preprints [81]. In reality, we are aware of no publishers that currently enforce such a prohibition. In contrast, an influential funder, the NIH, has recommended the use of CC BY [82]. More education and guidance for authors is needed, e.g., within the preprint submission process itself. Ideally, however, co-authors would have an informed discussion about the license to choose for their preprint before submission.
Permitted versions
The term “preprint” can describe many different versions of a manuscript, ranging from drafts shared for feedback well before journal submission to manuscripts ready to be accepted by a journal. However, journals differ in their policies regarding which versions of manuscripts under consideration may be posted, with some prohibiting the posting of revised versions after initial submission. These policies may be rationalized by a sense of journal ownership of the peer review process, but, in fact, they prevent scientists from sharing improvements drawn from diverse sources—their own additional experiments and analysis, feedback from colleagues with whom the manuscript was privately shared, comments on the preprint server itself, and input from social media and preprint-specific feedback platforms (including preLights, PREreview, biOverlay, and Peer Community in). Adding to the confusion, preprint servers differ in their own policies for manuscript deposit; in many disciplines (canonically, arXiv) preprint servers also host postprints. In the life sciences, PubMed Central, complemented by institutional repositories, fulfills this need, and bioRxiv hosts only preprints, not postprints. However, other platforms can host biology postprints, e.g., OSF Preprints.
Scooping
A common fear cited as a barrier to preprinting is “getting scooped.” Researchers may feel this has happened when a competing research group publishes highly related work without crediting (i.e., fairly citing and discussing) their own preprint. As a consequence, their work receives less attention and recognition, and if the work is still unpublished, this can mean publication in a “lower” journal. Scooping can of course occur in the absence of preprinting when work is discussed at conferences, submitted to journals, or included in funding applications, or simply by coincidence of two groups working in a related area. Fears of scooping may be particularly acute in fields with “a well-defined objective” such as structural biology, although the actual consequences may not be as severe as feared [83]. Preprints do in fact offer some protection against scooping by providing a public timestamp of a claim [84,85]. Nevertheless, the nuances of the timeline of priority claims are likely to be overlooked by casual readers. Approximately 2% of respondents in bioRxiv’s survey report having suffered a loss in ability to claim priority, or to publish in the journal of their choice, as a result of having published a preprint [32].
It stands to reason that scooping fears are most acute when the stakes are high and careers are on the line. However, fears about scooping—and the secrecy that accompanies them—cannot be neatly divided by generations because it’s rare for a group of co-authors to be homogeneous in years of experience.
Fear of scooping impacts not only researchers’ willingness to share preprints at all but also whether they are willing to share auxiliary materials that are normally shared as a condition of journal publication. For example, communities have yet to come to consensus on whether authors should be obligated to share reagents or strains after posting a preprint. In a future world where preprinting is universally regarded as a respected disclosure, ethical standards of disclosure should match those associated with journal articles.
Preprints and the media
One of the major arguments against the use of preprints is concern about exposing the public to unverified information, especially in fields in which public interest is high, such as medicine [86]. For example, one of the Altmetric top 100 papers of 2016 was a bioRxiv preprint linking cell phone radiation to cancer in rats. Some media outlets reported on the paper without mention of its preprint status [87], whereas others provided more measured coverage, including a critique of previous media reports [88].
As preprints rise in popularity, it is important that all readers and especially journalists have access to information about what a preprint server is and the screening checks (or lack thereof) that it employs. Many preprint servers (including bioRxiv, medRxiv, and PeerJ Preprints) display a disclaimer explaining the nature of preprints and the fact that they have not yet been “certified” by peer review.
The reaction of a community via public commenting can also be enormously helpful. Although the majority of feedback on preprints comes via mechanisms that are harder to tie to the preprint itself (or perhaps not publicly visible at all [32]), public commenting sections enable readers to gauge community reactions at a glance. The presence of links to related research documents, such as data and code, is also an important signal of trust for researchers [89]. Although the general public may not recognize these indicators, scientists reading preprints can use them to publicly raise concerns that will benefit all readers.
Moving forward, it would be helpful to provide all readers with information that outlines the level of scrutiny and/or acceptance by experts that an individual research output has received and for journalists and other public commentators to critique accordingly.
Curation and evaluation
As the production of scientific outputs continues to accelerate, both as a result of a growing number of researchers and their increasing willingness to share, we will need new ways of dealing with information overload. Although a glut of publications may feel like a 21st century problem, thinkers since Seneca have lamented the overabundance of information, and scholars have progressively developed tools to help organize and filter it [90].
Currently, readers report finding preprints by searching for keywords (note that multiple preprint servers are indexed on Europe PMC, OSF Preprints, and Google Scholar). They also report being alerted to interesting work on Twitter. The first strategy is directed by subject area but not interest, and the second by interest but not necessarily subject area. Ultimately, we will need more efficient ways to combine both search criteria in a single stream, in much the way that journal title is presently used (rightly or wrongly) to help parse search results in PubMed. Rxivist is one such tool that marries current interest and subject area [42], and we are collecting more curation projects at reimaginereview.asapbio.org. We believe that this emerging space will become an essential component of the preprint ecosystem.
Curation of interesting or highly respected preprints can also improve their usefulness in evaluating scientists for jobs and grants. Although journal name (and Impact Factor) are flawed proxies for judging the quality of a work [91], they save reviewer time by quickly communicating information about a paper’s selection process. Such proxies are not essential in the late stages of an evaluation process when candidates have been whittled down to a short list and reading their full outputs is a manageable task. However, the process of shortlisting candidates requires more time-efficient indicators of research quality than reading the content itself. Shortly after publication, such indicators may include the level of authors’ transparency and openness, endorsements from peers, and assessments of creativity. In the longer term, established reproducibility or replicability and impact on science or society can also be assessed [92]. Preprints offer the opportunity to evaluate researchers based on their most recent work, but candidates may need to accompany them with indicators that distill community reactions in the short-term, such as downloads, citation counts, constructive preprint comments, and other endorsements. Despite existing limitations, multiple reports suggest preprints are already helping early career researchers to secure their next research position [61,77,93]. Improved practices for filtering, curating, and signaling interest in preprints can further promote this phenomenon.
The future of preprints
Although the rate of preprints appearing on bioRxiv is increasing exponentially, they still comprise less than 3% of the monthly output of PubMed [94]. Because the majority of the potential growth in the use of preprints lies ahead, research communities must ensure that such use is productive to both science and scientists.
Who’s at the table?
With reduced gatekeeping mechanisms, preprints could be a mechanism for sharing and consuming the latest science irrespective of social hierarchies. We must ensure that preprint infrastructures and social mechanisms develop with issues of diversity, equity, and decolonization of scholarship in mind [95,96]. Who can contribute to the preprinted literature? Who benefits from posting a preprint? Who can read, consume, and use information in preprints? As preprinting continues to grow in biology, we must bake these questions into every discussion.
The growing adoption of preprints in biology is being largely driven by researchers in North America and Europe: of the top 100 institutional affiliations ranked by number of preprints posted to bioRxiv until December 2018, only 6 are located outside these regions [42], whereas the 10 institutions affiliated with the most preprints are in the United States and United Kingdom alone [32]. Researchers who feel comfortable posting a preprint are likely to be those who feel less threatened by the “scooping” concerns identified above, which may be affected by perceived and actual competition for recognition, funding, and career positions. On the other hand, preprints may be seen as a cost-effective way to disseminate work particularly in more resource-constrained environments [97]—the advantages and disadvantages of preprints for scientists operating outside the European and US funding context warrant further investigation.
Reflecting on the “scooping” concerns listed above, we should consider how preprints could offer appropriate recognition and support for creators of openly shared work. Indeed, some researchers report only being rewarded with funding and jobs when they are authors of (high-impact) journal articles and not for reuse of their open data sets [98]. Therefore, it can be difficult to argue that the researchers producing the primary data sets should share these openly, let alone rapidly with a preprint. This issue appears less acute for the development of new tools and methods—in that case, researchers report valuing the immediate usage, testing, and feedback that preprinting these resources provides.
Once work is shared openly, it is important to address how widely it is seen. Twitter is a major driver of attention on preprints, and social connections between preprint authors and readers raise visibility in the absence of dissemination through journals. Thus, the visibility of preprints is strongly influenced by the authors’ existing network “connectedness” and therefore is vulnerable to the same underrepresentation issues we face elsewhere in science. There have been several initiatives to increase the visibility of underrepresented scientists (including VanguardSTEM and 500 Women Scientists [99,100]); following suit, sBotLite is a new Twitter bot that retweets preprints posted by female first authors in the hope of raising their visibility [101]. Ensuring that the dissemination of preprints does not mimic or perpetuate diversity issues in science, technology, engineering, and mathematics (STEM) requires continued investment in initiatives to counteract and mitigate existing attention biases.
Beyond the article
Some have expressed concern about the roughly 35% of preprints that do not go on to be published in a journal, believing that these preprints must be of low quality [102]. Alternatively, these outputs could reflect work never destined for a journal that would have otherwise not been shared or work that the authors have chosen not to submit to a journal. Such products include negative results, preliminary findings, methods and protocols, and short reports from projects that could not be completed (e.g., because funds or a training period ran out). All of these products are valuable, and all could in principle be posted on a preprint server. bioRxiv, however, does not allow “theses, dissertations, student projects, recipes and simple protocols,” nor review or policy articles [27]. It does have specialized sections for contradictory and confirmatory work, though they are seldom used. As of the time of writing, the Contradictory and Confirmatory Results sections together make up less than 3% of the articles on bioRxiv [103].
These low usage rates suggest that preprints alone are not likely to be a solution to publication bias; our current incentive system does not sufficiently reward investments of energy spent writing up contradictory or confirmatory findings in the format of a journal article. Some of this effort, e.g., carefully assembling a methods section, is necessary to reproduce the work and must not be compromised. But other tasks, like putting the work in context with an introduction or interpreting the findings in a discussion, are less useful to specialized readers, who are the likely audience for contradictory or confirmatory findings anyway. In fact, those readers do not need the element of a narrative (often constructed post hoc) that ties together figures in a traditional paper. In these cases, a single figure (or even a micropublication, defined for these purposes as a statement with attribution [104]) would suffice.
There is presently an expectation that all products appearing on preprint servers are more or less complete articles. This helps to promote an image of the preprint server as a destination for high-quality work and helps to facilitate some very positive behaviors, such as the solicitation of submissions by journal editors. However, this norm reinforces a culture in which research is shared relatively late in the process and also feeds some behaviors that are less desirable, such as counting the number of papers on a CV as a measure of productivity without assessing their contents. Although this practice makes little sense, it is a real concern, as evidenced by the fact that the Medical Research Council worded its preprint policy to discourage researchers from “salami slicing” their preprints into many smaller units for the purpose of gaming the system by gaining a higher publication count [105]. It is not useful to science for researchers to split one story into multiple parts purely to game the evaluation system; however, given the deeply complex and technical interdisciplinary work that is now often combined into a single 1,500-word article, there is clear value in ensuring each finding is comprehensively described. If posting single figures or smaller increments of work were to become standard practice, all research results could be communicated faster and with adequate methodological description to ensure reproducibility. Those ultimately destined for a journal could be assembled into an article when the authors felt ready. Another benefit of micropublications is that they enable peer review on a more atomic level. In an environment in which papers result from the collaboration of many different specialized experts, there may be situations in which no two or three reviewers have sufficient expertise to cover every figure panel.
Despite the apparent benefits of micropublications and preprints, both technical and social innovation is required to address open questions. Namely, how can science be shared in varying levels of detail, complexity, and review status over time, from first observation of a result to acceptance of a generalized finding into broader understanding? Which research outputs (data, code, methods) are useful to embed in a narrative article? For which of these outputs is subsequent filtration and curation valuable? Ultimately, where is it most useful to invest resources in coordinated peer review, journal production processes, and dissemination of findings to nonspecialist communities? Regardless of when or how preprints fit into this picture, we should strive to ensure that research integrity is rewarded, discovery is accelerated, and the publication process is more inclusive and equitable.
References
1. Tregenza T. Gender bias in the refereeing process? Trends Ecol Evol. 2002 Aug 1;17(8):349–50.
2. Shen YA, Webster JM, Shoda Y, Fine I. Persistent Underrepresentation of Women’s Science in High Profile Journals. bioRxiv. 2018 Mar 8;275362.
3. Murray D, Siler K, Larivière V, Chan WM, Collings AM, Raymond J, et al. Gender and international diversity improves equity in peer review. bioRxiv. 2019 Apr 11;400515.
4. Royle S. Waiting to happen II: Publication lag times [Internet]. quantixed. 2015. Available from: http://web.archive.org/web/20190412012435/http://quantixed.org/2015/03/16/waiting-to-happen-ii-publication-lag-times/. [cited 2019 May 31].
5. Himmelstein D. Publication delays at PLOS and 3,475 other journals. Satoshi Village [Internet]. 2015 Jun 29. Available from: http://web.archive.org/web/20180503071358/http://blog.dhimmel.com/plos-and-publishing-delays/. [cited 2019 May 31].
6. Cobb M. The prehistory of biology preprints: A forgotten experiment from the 1960s. PLoS Biol. 2017 Nov 16;15(11):e2003995. doi: 10.1371/journal.pbio.2003995 29145518
7. World University Rankings 2019: methodology [Internet]. Times Higher Education (THE). 2018. Available from: http://web.archive.org/web/20191104165124/https://www.timeshighereducation.com/world-university-rankings/world-university-rankings-2019-methodology. [cited 2019 Nov 25].
8. Lin J. Preprints growth rate ten times higher than journal articles [Internet]. Crossref. Available from: https://web.archive.org/web/20190522083623/https://www.crossref.org/blog/preprints-growth-rate-ten-times-higher-than-journal-articles/. [cited 2019 Nov 28].
9. Kleinert S, Horton R. Preprints with The Lancet: joining online research discussion platforms. The Lancet. 2018 Jun 23;391(10139):2482–3.
10. Mallapaty S. African scientists launch their own preprint repository. Nature [Internet]. 2018 Jun 25. Available from: https://web.archive.org/web/20190312102218/https://www.nature.com/articles/d41586-018-05543-w. [cited 2019 Nov 28].
11. Levchenko M. Preprints in Europe PMC: reducing friction for discoverability [Internet]. Europe PMC Blog. 2018. Available from: https://web.archive.org/web/20190821213831/http://blog.europepmc.org/2018/07/preprints.html. [cited 2019 Aug 21].
12. Joerg Heber on Twitter: “Am excited that as of today we are linking to preprints posted to @biorxivpreprint from the published article itself. This also applies to previously published papers. https://t.co/1CPAMRWABh https://t.co/YjFCtfPp7g”/Twitter [Internet]. Twitter. Available from: https://web.archive.org/web/20191128193458/https://twitter.com/joergheber/status/1020105070875045888. [cited 2019 Nov 28].
13. J. Am. Chem. Soc. on Twitter: "The editors’ decision is in! JACS will now consider submissions of manuscripts previously posted as preprints on @ChemRxiv. Details to follow… " / Twitter [Internet]. Twitter. Available from: https://web.archive.org/web/20180820152244/https://twitter.com/J_A_C_S/status/1031300824889208833. [cited 2019 Nov 28].
14. European Research Council. Main Changes Expected in the ERC Work Programme 2019 [Internet]. Available from: https://web.archive.org/web/20180810203015/https://erc.europa.eu/sites/default/files/content/pages/pdf/ERC-2019-Work-Programme-main-changes.pdf. [cited 2019 Nov 26].
15. PKP and SciELO announce development of open source Preprint Server system | SciELO in Perspective [Internet]. 2018. Available from: https://web.archive.org/web/20180921153453/https://blog.scielo.org/en/2018/09/21/pkp-and-scielo-announce-development-of-open-source-preprint-server-system/. [cited 2019 Nov 28].
16. Wellcome updates open access policy to align with cOAlition S [Internet]. Wellcome. 2019. Available from: https://wellcome.ac.uk/news/wellcome-updates-open-access-policy-align-coalition-s. [cited 2019 Aug 21].
17. ICMJE | Recommendations [Internet]. Available from: http://web.archive.org/web/20191128174016/http://www.icmje.org/news-and-editorials/icmje-recommendations_annotated_dec18.pdf. [cited 2019 Nov 28].
18. Israel Science Foundation | Research Gateways | F1000Research [Internet]. Available from: http://web.archive.org/web/20191128193840/https://f1000research.com/isf. [cited 2019 Nov 28].
19. Nakagawa S. EcoEvoRxiv launched! [Internet]. Transparency in Ecology and Evolution. 2019. Available from: http://web.archive.org/web/20190115134549/http://www.ecoevotransparency.org/2019/01/14/ecoevorxiv-launched/. [cited 2019 Nov 28].
20. John Inglis on Twitter: “Full text HTML begins rolling out across the 42,000 articles on @biorxivpreprint this week, starting with the earliest. Outcome of huge collaborative effort from @Novatechset, @highwirepress, and bioRxiv team, funded by @cziscience. Future articles will be FT 2d after posting.” / Twitter [Internet]. Twitter. Available from: http://web.archive.org/web/20190529151025/https://twitter.com/JohnRInglis/status/1092123015385616392. [cited 2019 Nov 28].
21. AMRC on Twitter: “AMRC #OpenResearch is officially launching today! It will help participating charities maximise the value of their research investment by providing their researchers with the opportunity to publish any & all aspects of their work rapidly & cost effectively https://t.co/keguIt7Ohz https://t.co/gddRKMdlSS”/Twitter [Internet]. Twitter. Available from: http://web.archive.org/web/20191128194153/https://twitter.com/AMRC/status/1097782846083588097. [cited 2019 Nov 28].
22. Beilstein Archives [Internet]. Available from: http://web.archive.org/web/20190428121154/https://www.beilstein-archives.org/xiv/. [cited 2019 Nov 28].
23. Crystal M. It’s Our Preprint Anniversary! [Internet]. The Official PLOS Blog. 2019 Available from: http://web.archive.org/web/20191111140059/https://blogs.plos.org/plos/2019/04/its-our-preprint-anniversary/. [cited 2019 Nov 28].
24. Demain P. New Features Alert! Improvements to Adding and Grouping Works [Internet]. ORCID Blog. 2019. Available from: http://web.archive.org/web/20191128194544/https://orcid.org/blog/2019/04/29/new-features-alert-improvements-adding-and-grouping-works. [cited 2019 Nov 28].
25. Springer Nature journals unify their policy to encourage preprint sharing. Nature. 2019 May 15;569:307.
26. Participating Journals & Platforms [Internet]. Research Square. Available from: http://web.archive.org/web/20191008201413/https://www.researchsquare.com/journals. [cited 2019 Nov 28].
27. Frequently Asked Questions (FAQ) [Internet]. bioRxiv. Available from: https://web.archive.org/web/20190821203507/https://www.biorxiv.org/about/FAQ. [cited 2019 Aug 21].
28. medRxiv [Internet]. The Yoda Project. Available from: https://web.archive.org/web/20190821214815/https://yoda.yale.edu/medrxiv. [cited 2019 Aug 21].
29. Research outputs find a home at IndiaRxiv–IndiaRxiv [Internet]. Available from: http://web.archive.org/web/20190905143532/http://indiarxiv.in/research-outputs-find-a-home-at-indiarxiv/. [cited 2019 Nov 28].
30. Hoyt J. PeerJ Preprints to stop accepting new preprints Sep 30th 2019 [Internet]. PeerJ Blog. 2019. Available from: http://web.archive.org/web/20191112120834/https://peerj.com/blog/post/115284881747/peerj-preprints-to-stop-accepting-new-preprints-sep-30-2019/. [cited 2019 Nov 28].
31. Transparent review in preprints [Internet]. Cold Spring Harbor Laboratory. 2019. Available from: http://web.archive.org/web/20191014223638/https://www.cshl.edu/transparent-review-in-preprints/. [cited 2019 Nov 28].
32. Sever R, Roeder T, Hindle S, Sussman L, Black K-J, Argentine J, et al. bioRxiv: the preprint server for biology. bioRxiv. 2019 Nov 6;833400.
33. Research Square on Twitter: “This Monday we’re celebrating 9,000 #preprints on #ResearchSquare! Browse the latest #research in your field & comment on emerging #science before it’s published https://t.co/kb1Zaa9Hpy”/Twitter [Internet]. Twitter. Available from: http://web.archive.org/web/20191128195329/https/:/twitter.com/researchsquare/status/1198979654712995840?s=20. [cited 2019 Nov 28].
34. Research Square on Twitter: “Today we’re adding 13 more @BioMedCentral journals to #InReview including @MicrobiomeJ & Environmental Health. Opt in when you submit to 70+ journals including the entire @BMC_series. View the full list of participating journals here: https://t.co/KUHjw81cVA @SpringerNature” [Internet]. Available from: https://web.archive.org/web/20191031141820/https:/twitter.com/researchsquare/status/1189904859157422081. [cited 2019 Nov 25].
35. News from around the web [Internet]. ASAPbio. Available from: https://web.archive.org/web/20190821215354/https://asapbio.org/news-from-around-the-web. [cited 2019 Aug 21].
36. Tennant J, Bauin S, James S, Kant J. The evolving preprint landscape: Introductory report for the Knowledge Exchange working group on preprints. 2018 May 17. Available from: https://osf.io/preprints/metaarxiv/796tu/. [cited 2019 May 31].
37. Kling R, Spector LB, Fortuna J. The real stakes of virtual publishing: The transformation of E-Biomed into PubMed central. J Am Soc Inf Sci Technol. 2004;55(2):127–48.
38. Chiarelli A, Johnson R, Pinfield S, Richens E. Practices, drivers and impediments in the use of preprints (Phase 1 report) [Internet]. 2019 May 1. Available from: https://zenodo.org/record/2654832#.XPEaXNNKg_U. [cited 2019 May 31].
39. Dray S. Information Exchange Group No. 5. Science. 1966 Aug 12;153(3737):694–5.
40. Klein M, Broadwell P, Farb SE, Grappone T. Comparing Published Scientific Journal Articles to Their Pre-Print Versions—Extended Version. Int J Digit Libr [Internet]. 2018 Feb 5. Available from: http://arxiv.org/abs/1803.09701. [cited 2019 May 31].
41. Larivière V, Sugimoto CR, Macaluso B, Milojević S, Cronin B, Thelwall M. arXiv E-prints and the journal of record: An analysis of roles and relationships. J Assoc Inf Sci Technol. 2014;65(6):1157–69.
42. Abdill RJ, Blekhman R. Tracking the popularity and outcomes of all bioRxiv preprints. Pewsey E, Rodgers P, Greene CS, editors. eLife. 2019 Apr 24;8:e45133. doi: 10.7554/eLife.45133 31017570
43. Fraser N, Momeni F, Mayr P, Peters I. The effect of bioRxiv preprints on citations and altmetrics. bioRxiv. 2019 Jun 22;673665.
44. Serghiou S, Ioannidis JPA. Altmetric Scores, Citations, and Publication of Studies Posted as Preprints. JAMA. 2018 Jan 23;319(4):402–4. doi: 10.1001/jama.2017.21168 29362788
45. Fu DY, Hughey JJ. Releasing a preprint is associated with more attention and citations for the peer-reviewed article. Rodgers P, Amaral O, editors. eLife. 2019 Dec 6;8:e52646. doi: 10.7554/eLife.52646 31808742
46. Inglis J, Sever R. bioRxiv: a progress report [Internet]. ASAPbio. 2016. Available from: https://web.archive.org/web/20190821214557/https://asapbio.org/biorxiv. [cited 2019 Aug 21].
47. Rapid Reviews for Rapid Outbreak Response [Internet]. Outbreak Science Blog. 2019. Available from: http://web.archive.org/web/20191128174326/https://blog.outbreakscience.org/rapid-reviews-for-rapid-outbreak-response/. [cited 2019 Nov 28].
48. Berlin S. If the papers don’t come to the journal. EMBO Rep. 2018 Apr 1;19(4):e45911. doi: 10.15252/embr.201845911 29472243
49. Slavov N. Why I love preprints [Internet]. Slavov Lab. 2017. Available from: https://web.archive.org/web/20180828220418/web.northeastern.edu/slavovlab/blog/2017/09/28/biomedical-preprints-benefits/. [cited 2019 Aug 21].
50. Barsh GS, Bergman CM, Brown CD, Singh ND, Copenhaver GP. Bringing PLOS Genetics Editors to Preprint Servers. PLOS Genet. 2016 Dec 1;12(12):e1006448. doi: 10.1371/journal.pgen.1006448 27906975
51. Barrett SCH. Proceedings B 2017: the year in review. Proc R Soc B Biol Sci. 2018 Jan 10;285(1870):20172553.
52. Advancing the sharing of research results for the life sciences [Internet]. bioRxiv. Available from: https://web.archive.org/web/20190821203542/https://www.biorxiv.org/about-biorxiv. [cited 2019 Aug 21].
53. Anderson K. The Tincture of Time -Should Journals Return to Slower Publishing Practices? [Internet]. The Scholarly Kitchen. 2017. Available from: http://web.archive.org/web/20190823114127/https://scholarlykitchen.sspnet.org/2017/03/28/the-tincture-of-time-should-journals-return-to-slower-publishing-practices/. [cited 2019 May 31].
54. Greene C. Why we preprint. [Internet]. Casey Greene. 2015. Available from: http://web.archive.org/web/20181116081712/https://medium.com/@greenescientist/why-we-preprint-fb3bfbcdf4ff. [cited 2019 May 31].
55. SHERPA/RoMEO—Search—Publisher copyright policies & self-archiving [Internet]. Available from: https://web.archive.org/web/20190821204015/ http://www.sherpa.ac.uk/romeo/search.php. [cited 2019 Aug 21].
56. List of academic journals by preprint policy—Wikipedia [Internet]. Available from: https://web.archive.org/save/https://en.wikipedia.org/wiki/List_of_academic_journals_by_preprint_policy. [cited 2019 Aug 21].
57. Transpose [Internet]. Available from: http://web.archive.org/web/20190821204315/https://transpose-publishing.github.io/#/. [cited 2019 Aug 21].
58. Bishop D. Tweet [Internet]. Twitter. Available from: https://web.archive.org/web/20190821204445/https://twitter.com/deevybee/status/1127186891533639681. [cited 2019 Aug 21].
59. McKiernan EC, Schimanski LA, Nieves CM, Matthias L, Niles MT, Alperin JP. Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations [Internet]. PeerJ Inc.; 2019 Apr. Report No.: e27638v2. Available from: https://peerj.com/preprints/27638. [cited 2019 May 31].
60. University policies and statements on hiring, promotion, and journal license negotiation [Internet]. ASAPbio. Available from: https://web.archive.org/web/20190821210111/https://asapbio.org/university-policies. [cited 2019 Aug 21].
61. Fernandes JD, Sarabipour S, Smith CT, Niemi NM, Jadavji NM, Kozik AJ, et al. Insights from a survey-based analysis of the academic job market. bioRxiv. 2019 Oct 9;796466.
62. Spiro J. SFARI supports preprints for the life sciences [Internet]. SFARI. 2016. Available from: https://web.archive.org/web/20190821212320/https://www.sfari.org/2016/05/20/sfari-supports-preprints-for-the-life-sciences/. [cited 2019 Aug 21].
63. Funder policies [Internet]. ASAPbio. Available from: https://web.archive.org/web/20190821212524/https://asapbio.org/funder-policies. [cited 2019 Aug 21].
64. After My Application is Submitted, Can I Include a Copy or Citation of a Preprint as Post-submission Materials? [Internet]. NIH Extramural Nexus. 2018. Available from: http://web.archive.org/web/20190821212758/https://nexus.od.nih.gov/all/2018/03/02/post-submission-materials-can-i-include-a-copy-or-citation-of-a-preprint/. [cited 2019 Aug 21].
65. Science Initiative Privacy Principles [Internet]. Chan Zuckerberg Initiative. 2018. Available from: http://web.archive.org/web/20190821212941/https://chanzuckerberg.com/privacy/science-privacy-principles/. [cited 2019 Aug 21].
66. Sever R, Eisen M, Inglis J. Plan U: Universal access to scientific and medical research via funder preprint mandates. PLoS Biol. 2019 Jun 4;17(6):e3000273. doi: 10.1371/journal.pbio.3000273 31163026
67. Reichmann S, Ross-Hellauer T, Hindle S, McDowell G, Lin J, Penfold N, et al. Editorial policies of many highly-cited journals are hidden or unclear [Internet]. 2019 May 5. Available from: https://zenodo.org/record/3237242. [cited 2019 Nov 25].
68. Loew LM, Staehle B. 2017 Ushers in New Editorial Board Members and More. Biophys J. 2017 Jan 10;112(1):E01–2. doi: 10.1016/j.bpj.2016.12.014 28076821
69. Plant Direct Journal on Twitter: “@JohnRInglis @PLOS @biorxivpreprint we have been doing it since we launched last year. . . . see for ex. https://t.co/S145EODDOU”/Twitter [Internet]. Twitter. Available from: https://web.archive.org/web/20191128202008/https://twitter.com/PlantDirectJ/status/1020134027616047104. [cited 2019 Nov 28].
70. Preprints [Internet]. PLOS One. Available from: https://web.archive.org/web/20190925025506/https://journals.plos.org/plosone/s/preprints. [cited 2019 Nov 28].
71. Article publication and history dates–JATS4R [Internet]. Available from: https://jats4r.org/article-publication-and-history-dates. [cited 2019 Nov 28].
72. Malički M, Sarol MJ, Alperin JP. Analyzing preprints: The challenges of working with SHARE metadata [Internet]. Scholarly Communications Lab | ScholCommLab. 2019. Available from: http://web.archive.org/web/20190906133051/https://www.scholcommlab.ca/2019/09/04/preprints-challenges-part-one/. [cited 2019 Nov 25].
73. bioRxiv API [Internet]. Available from: http://web.archive.org/web/20191111115439/http://api.biorxiv.org/. [cited 2019 Nov 26].
74. Peer Community In [Internet]. Available from: http://web.archive.org/web/20190923105228/https://peercommunityin.org/. [cited 2019 Aug 21].
75. OSF Preprints [Internet]. Available from: http://web.archive.org/web/20191113184523/https://osf.io/preprints/. [cited 2019 Aug 21].
76. Narock TW, Goldstein E. Quantifying the growth of preprint services hosted by the Center for Open Science [Internet]. Open Science Framework; 2019 Apr. Available from: https://osf.io/5fk6c. [cited 2019 May 31].
77. We support preprints… –Let’s Accelerate Scientific Publishing In The Life Sciences! [Internet]. Available from: https://web.archive.org/web/20190821215234/https://wesupportpreprints.wordpress.com/. [cited 2019 Aug 21].
78. Read the Budapest Open Access Initiative [Internet]. Budapest Open Access Initiative. 2002. Available from: https://web.archive.org/web/20190821215658/https://www.budapestopenaccessinitiative.org/read. [cited 2019 Aug 21].
79. Open Access Subset [Internet]. PubMed Central. Available from: https://web.archive.org/save/https://www.ncbi.nlm.nih.gov/pmc/tools/openftlist/. [cited 2019 Aug 21].
80. Himmelstein D. The licensing of bioRxiv preprints. Satoshi Village [Internet]. 2016 Dec 5. Available from: https://blog.dhimmel.com/biorxiv-licenses/. [cited 2019 May 31].
81. Preprint licensing survey–ASAPbio [Internet]. Available from: https://web.archive.org/web/20190821220137/https://asapbio.org/licensing-survey. [cited 2019 Aug 21].
82. NOT-OD-17-050: Reporting Preprints and Other Interim Research Products [Internet]. NIH. 2017. Available from: https://web.archive.org/web/20190821212644/https://grants.nih.gov/grants/guide/notice-files/NOT-OD-17-050.html. [cited 2019 Aug 21].
83. Hill R, Stein C. Scooped! Estimating Rewards for Priority in Science.
84. Tennant JP, Crane H, Crick T, Davila J, Enkhbayar A, Havemann J, et al. Ten Hot Topics around Scholarly Publishing. Publications. 2019 Jun;7(2):34.
85. Bourne PE, Polka JK, Vale RD, Kiley R. Ten simple rules to consider regarding preprint submission. PLoS Comput Biol. 2017 May 4;13(5):e1005473. doi: 10.1371/journal.pcbi.1005473 28472041
86. Sheldon T. Preprints could promote confusion and distortion. Available from: https://www.nature.com/articles/d41586-018-05789-4. [cited 2019 Nov 25].
87. Harkinson J. “Game-changing” study links cellphone radiation to cancer [Internet]. Mother Jones. Available from: http://web.archive.org/web/20191029201950/https://www.motherjones.com/environment/2016/05/federal-study-links-cell-phone-radiation-cancer/. [cited 2019 Nov 25].
88. Stockton EGE Nick. You Need More Than Rat Tumors to Prove Phones Cause Cancer. Wired [Internet]. 2016 May 28. Available from: http://web.archive.org/web/20190905030719/https://www.wired.com/2016/05/need-rat-tumors-prove-phones-cause-cancer/. [cited 2019 Nov 25].
89. Soderberg CK, Errington T, Nosek BA. Credibility of Preprints Survey Presentations. 2019 Oct 2. Available from: https://osf.io/rwne8/. [cited 2019 Nov 25].
90. Blair A. Too much to know: managing scholarly information before the modern age [Internet]. Yale University Press; 2010. Available from: https://www.worldcat.org/title/too-much-to-know-managing-scholarly-information-before-the-modern-age/oclc/601347978. [cited 2019 Aug 21].
91. Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ. 1997 Feb 15;314(7079):498–502. doi: 10.1136/bmj.314.7079.497 9056804
92. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018 Mar 29;16(3):e2004089. doi: 10.1371/journal.pbio.2004089 29596415
93. Sarabipour S, Debat HJ, Emmott E, Burgess SJ, Schwessinger B, Hensel Z. On the value of preprints: An early career researcher perspective. PLoS Biol. 2019 Feb 21;17(2):e3000151. doi: 10.1371/journal.pbio.3000151 30789895
94. Penfold NC, Polka J. Preprints in biology as a fraction of the biomedical literature [Internet]. Zenodo; 2019. Available from: https://zenodo.org/record/3256298. [cited 2019 Nov 25].
95. Albornoz D, Chan L. Power and Inequality in Open Science Discourses. IRIS—Rev Informação Mem E Tecnol. 2018 Nov 12;4(1):70–9.
96. Okune A. Decolonizing scholarly data and publishing infrastructures [Internet]. Africa at LSE. 2019. Available from: https://web.archive.org/web/20190821221035/https://blaogs.lse.ac.uk/africaatlse/2019/05/29/decolonizing-scholarly-data-and-publishing-infrastructures/. [cited 2019 May 31].
97. Debat H, Babini D. Plan S in Latin America: A precautionary note [Internet]. PeerJ Inc.; 2019 Jul. Report No.: e27834v2. Available from: https://peerj.com/preprints/27834. [cited 2019 Nov 28].
98. Chambers C, Morey C, Open Science Working Group, School of Psychology/CUBRIC, Cardiff University. 2017 Survey on Open Research Practices [Internet]. Available from: https://mfr.osf.io/render?url=https://osf.io/dmfke/?action=download%26mode=render. [cited 2019 Aug 22].
99. VanguardSTEM [Internet]. Available from: https://web.archive.org/web/20190821222639/https://www.vanguardstem.com/. [cited 2019 Aug 21].
100. 500 Women Scientists [Internet]. Available from: https://web.archive.org/web/20190821222728/https://500womenscientists.org/. [cited 2019 Aug 21].
101. sBotLite (@sbotlite) [Internet]. Twitter. Available from: https://web.archive.org/web/20190821222840/https://twitter.com/sbotlite. [cited 2019 Aug 21].
102. Anderson K. Comment on Two New Initiatives at eLife To Start the Eisen Era [Internet]. 2019. Available from: https://web.archive.org/web/20190821222256/https://scholarlykitchen.sspnet.org/2019/08/15/two-new-initiatives-at-elife-to-start-the-eisen-era/#comment-83759. [cited 2019 Aug 21].
103. Malički M, Sarol MJ, Alperin JP. Analyzing Preprints: The challenges of working with metadata from bioRxiv [Internet]. Scholarly Communications Lab | ScholCommLab. 2019. Available from: http://web.archive.org/web/20191128195811/https://www.scholcommlab.ca/2019/10/10/preprints-challenges-part-three/. [cited 2019 Nov 25].
104. Clark T, Ciccarese P, Goble C. Micropublications: a semantic model for claims, evidence, arguments and annotations in biomedical communications. J Biomed Semant [Internet]. 2014 Jul 4;5(28). Available from: https://jbiomedsem.biomedcentral.com/articles/10.1186/2041-1480-5-28. [cited 2019 Aug 21].
105. The MRC supports preprints—News and features [Internet]. Medical Research Council. 2017. Available from: https://web.archive.org/save/https://mrc.ukri.org/news/browse/the-mrc-supports-preprints/. [cited 2019 Aug 21].