Creating convergence: Coming together to tackle systemic challenges

Note: This post is the second in a two-part series. Read the first post, exploring the interconnected challenges of the current system, here.

Researchers, funders, institutions and other stakeholders have the power to reshape scholarly communications to better reflect the values and workflows of research, and drive more impactful, rapid, reproducible, and cost-effective science. But only if we work together, across the entire breadth of the system. 

Coming together

For the past quarter-century, scholarly communications stakeholders have worked separately to address the interconnected systemic problems of research integrity, transparency, speed, and cost through open science. So far, it hasn't worked.

The problem isn’t the toolkit; it’s the application. We’ve been working in isolation, one platform, policy, or pilot program at a time. System-wide change requires holistic solutions and large-scale cooperative action.

To reshape research sharing and communication, we have to collaborate across the stakeholder groups, including funders, institutions, researchers, and various types of content service providers. Together, we can leverage the tools of open science to shape a new approach to the communication of research in all its forms, backed by a reward system that corresponds to the values of science and society. In turn, we can empower researchers to practice high-quality, trustworthy, efficient, and collaborative science without sacrificing career advancement. Stratos specializes in forging mutually beneficial partnerships across stakeholders, and implementing collaborative project work that yields measurable and lasting results.

The levers of change

Open science encompasses a broad range of research production, sharing, and communications best practices designed to support research integrity, efficiency, reproducibility, and innovation. Tools like data availability, methods and code sharing, preprints, reporting standards, persistent identifiers, lab notebooks, incremental publishing, and more, if coordinated, could fundamentally transform the effectiveness of research communication. Regardless of the particular behavior we hope to encourage, there are three major levers we can use to effect change.

I. Policy

Policy has the potential to be hugely impactful. But complexity, inconsistency, and individuality can limit effectiveness.

A plethora of similarly intentioned but slightly different open science policies has evolved across the scholarly ecosystem. To take only one example, many funders now have data availability policies. Some require grantees to use a data repository; others have specific licensing requirements. Some screen for policy adherence; others don't. Similar variability exists at the publisher level. Some journals require a data availability statement but allow authors to share data "upon request." Others require data to be uploaded as Supporting Information or linked via a repository DOI. According to DataSeer's Open Science Metrics, only 30-40% of authors comply with data policy requirements.

And data availability is just one of many open practices. The policy landscape is so complex that entire databases are dedicated to parsing requirements. It’s no wonder that researchers find open science confusing, and struggle to comply — even with the best of intentions.

If we can be more collaborative and consistent in our mandates, we'll see greater success and better outcomes. ASAP (Aligning Science Across Parkinson's) provides one example. The ASAP initiative launched in 2019 as one of the most ambitious large multi-team transdisciplinary projects in the U.S., introducing policy mandates alongside the infrastructure and processes necessary for an open, collaborative research community. In its first five years, ASAP has granted more than $290M USD to hundreds of scientists across 35 teams representing 14 countries and 80+ institutions to conduct basic research on Parkinson's disease (PD).

As the initiative has evolved, ASAP has generated useful materials in the open sphere, including the ASAP Blueprint for Collaborative Open Science; the preprint "From Policy to Practice: Tracking an Open Science Funding Initiative"; and the interview series Protocol Particulars, in which researchers share tips and best practices for utilizing their protocols.

II. Infrastructure & training

Asking people to change their practices without removing barriers to those changes doesn’t work. As Brian Nosek’s strategy for culture change stipulates, one of the most effective ways to shape behavior is through tools and systems. There are plenty of examples of open science infrastructure already in place: integrations that allow for automatic preprint posting as part of article submission, automation that scans submitted manuscripts for policy compliance, indexing and archiving practices that proliferate content across multiple libraries and platforms. The trouble is, those infrastructures are adjunct to an outdated model based on pre-internet print publishing norms. 
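The compliance-scanning automation mentioned above can be illustrated with a toy sketch. Everything here is illustrative assumption: the phrases, the function name, and the report fields are invented for this example, and a production screener would use far richer analysis than keyword matching.

```python
import re

# Hypothetical phrases commonly seen in data availability statements.
# A real submission-system screener would be much more sophisticated.
AVAILABILITY_PATTERNS = [
    r"data availability statement",
    r"data (are|is) available",
    r"deposited (in|at|to)",
    r"doi\.org/",
]
UPON_REQUEST = re.compile(r"(up)?on (reasonable )?request", re.IGNORECASE)

def screen_manuscript(text: str) -> dict:
    """Return a rough compliance report for one manuscript's text."""
    lowered = text.lower()
    has_statement = any(re.search(p, lowered) for p in AVAILABILITY_PATTERNS)
    return {
        "has_availability_statement": has_statement,
        # "Available upon request" with no repository link often signals
        # weaker compliance than a deposited, DOI-linked dataset.
        "upon_request_only": has_statement
            and bool(UPON_REQUEST.search(text))
            and "doi.org/" not in lowered,
    }
```

Even a crude check like this shows why consistent policy language matters: automation can only screen for compliance when the expected statements follow predictable patterns.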

Again, a communal, large-scale re-envisioning is necessary to drive the kind of impactful change we want to see. One proposal is “research stacks” — an alternative vision for research communication that embraces the tools and technologies researchers use in their daily work. In this model, the PDF article is replaced by a research package or “stack” containing all the outputs of a single investigation from study design and protocols, to data and code, to a narrative summary and reviews, all in one purpose-built environment. 
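As a thought experiment, the "stack" idea can be modeled as a simple container data structure. Every field name below is hypothetical, not drawn from any actual research-stack implementation:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "research stack": one container holding all
# outputs of a single investigation, rather than a PDF with scattered
# supplements. Field names are illustrative, not a standard.
@dataclass
class ResearchStack:
    title: str
    study_design: str
    protocols: list = field(default_factory=list)   # protocol links or files
    datasets: list = field(default_factory=list)    # repository DOIs
    code: list = field(default_factory=list)        # code repository URLs
    narrative: str = ""                             # human-readable summary
    reviews: list = field(default_factory=list)     # attached peer reviews

    def is_complete(self) -> bool:
        """True when the outputs needed to reproduce the work are attached."""
        return bool(self.datasets and self.code and self.narrative)
```

The point of the sketch is that completeness becomes checkable: a stack can be machine-verified as reproducibility-ready in a way a standalone PDF cannot.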

Another model seeks to make research communication more modular and composable. The Continuous Science Foundation held a meeting in Banff in May 2025 where this idea of composable science was proposed. Similarly, calls for a new research communication paradigm were made at a workshop held at the HHMI Janelia Research Campus in December 2024, where participants outlined a more complete and iterative 'container' for research that evolves into a linked set of revisions, reviews, and annotations over time. The summary is still open for public comment.

In the nearer term, metadata is a vital but oft-overlooked infrastructure element. No matter how much information is publicly available, its value lies in users having the ability to discover, access, interpret, and reuse it.

Consistent metadata is the key. Again taking data availability as an example, most data sharing policies result in isolated datasets scattered across warehouses and repositories without the tools necessary for discovery and reuse: effectively, data graveyards. Broadly applied, normalized metadata standards, by contrast, can create collective, stewarded data sources in which disparate data types and formats are co-localized and even merged for maximum discoverability, reuse, and cross-referencing.
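To make the idea of normalized metadata concrete, here is a minimal, hypothetical sketch: two imaginary repositories describe datasets with different field names, and a shared mapping brings both onto one common schema. None of the field names or mappings come from a real standard.

```python
# A toy illustration, not any real metadata standard: two repositories
# describe the same kind of dataset with different field names.
# Normalizing onto one shared schema is what makes cross-repository
# discovery and merging possible.
COMMON_FIELDS = ("title", "creator", "published", "license")

def normalize(record: dict, mapping: dict) -> dict:
    """Map a repository-specific record onto the shared schema."""
    out = {field: record.get(source) for field, source in mapping.items()}
    # Missing fields stay explicit (None) rather than silently dropped.
    for field in COMMON_FIELDS:
        out.setdefault(field, None)
    return out

# Hypothetical field mappings for two imaginary repositories.
REPO_A = {"title": "dataset_title", "creator": "author",
          "published": "pub_date", "license": "licence"}
REPO_B = {"title": "name", "creator": "contact",
          "published": "released", "license": "rights"}
```

Once every record is expressed in the shared schema, datasets from different sources can be indexed, searched, and merged together instead of sitting in isolated silos.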

Several models can be found in the neuroscience community. Neurodata Without Borders (NWB) offers a way to package very different data formats into a single, ingestible container, making the datasets connected to an experiment interoperable and easy to exchange. The Brain Imaging Data Structure (BIDS) is a standard way of organizing related files and metadata. BIDS offers a starter kit with a simple explanation of how to work with it, and a BIDS validator to automatically check datasets for adherence to the specification.
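As a rough illustration of the kind of check a validator performs, the sketch below tests whether an anatomical MRI filename follows the BIDS naming pattern. This is a greatly simplified assumption-laden toy; the real BIDS validator checks the full specification, including directory layout and sidecar metadata.

```python
import re

# Simplified sketch of one BIDS naming rule for anatomical images:
# sub-<label>[_ses-<label>]_<suffix>.nii[.gz], where the suffix is a
# recognized modality such as T1w, T2w, or FLAIR. The real validator
# enforces far more of the specification than this.
ANAT_PATTERN = re.compile(
    r"^sub-[0-9A-Za-z]+(_ses-[0-9A-Za-z]+)?_(T1w|T2w|FLAIR)\.nii(\.gz)?$"
)

def looks_like_bids_anat(filename: str) -> bool:
    """Return True if the filename matches the simplified BIDS pattern."""
    return bool(ANAT_PATTERN.match(filename))
```

Machine-checkable naming rules like this are exactly what turns a convention into shared infrastructure: any tool can verify a dataset's structure without human inspection.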

Training goes hand-in-hand with new processes and technologies. There are more materials, training programs, and support for open science and reproducibility practices now than ever. Stanford University's Program on Research Rigor and Reproducibility (SPORR), an initiative at the Stanford University School of Medicine funded by an NIH grant, was developed as a model for other Stanford schools and departments. It provides support services for education and training, monitoring and accountability, and implementation of open scholarship practices, including incentives and rewards. And NASA's Open-Source Science Initiative (OSSI) is a comprehensive program of activities to promote policy adjustments, support open-source software, and enable cyberinfrastructure. NASA has developed Open Science 101, an openly licensed curriculum covering open science principles and best practices.

III. Incentivization

Research communication workflows were shaped, and continue to be reinforced, by the entrenched reward system of science. Career advancement is allocated based on publication in high-impact journals, which incentivizes competition, secrecy, and the stockpiling of results. That stifles research progress, makes researchers reluctant to admit mistakes, and can even lead to selective reporting or outright fraud.

The National Academies' Roundtable on Aligning Incentives for Open Science met in June 2024 and concluded that, while journal publishing is no longer the best mechanism for research sharing and communication, we have not yet provided a better alternative. The logical candidate is a preprint that includes the underlying data and other research outputs needed to reproduce the work, and that, once posted, allows for a wide range of mechanisms for AI validation and human peer review.

The Roundtable has established two collective-action initiatives to catalyze change in the ecosystem. One examines the current business models and funding structures for research sharing and communication, such as subscriptions, transformative deals, and APCs, and considers how those funds might be better applied to a new paradigm. The second initiative is convening innovators, research communication experts, technologists, and funders to build convergence behind an alternative to the current system of journal publishing: a new research paradigm based on data-complete preprints with options for validation and review.

We believe it is only through such convergence of ideas and collective action that we will achieve a true shift away from an expensive, print-based system toward a dynamic paradigm that accelerates the benefits of our collective investment in science.

Progress in science thrives on clarity, collaboration, and the free exchange of ideas. Right now, researchers who voluntarily embrace open practices are fighting an uphill battle, going above and beyond for little reward, sometimes even against their own best interests. While some enthusiastic altruists with established careers and solid funding have the privilege to do that, large-scale adoption depends on normalizing and rewarding a culture of openness.

We need to ask ourselves: how do we recognize, celebrate, and reward the kinds of behaviors that contribute to scientific progress? How do we free researchers from the restrictive demands of high-impact publishing and give them the freedom to practice science according to its — and their — own standards?