Companies offering digital therapeutics frequently advertise that the effectiveness of their product is backed by scientific evidence. Such evidence is critical: it operates as a key sales tool and helps companies identify areas in which to improve the product. Conversely, unsupported claims about the product can deceive consumers and erode trust with the intended audience.
However, just because a company publishes scientific evidence of a product’s effectiveness does not mean that the evidence is credible (see our data sheet for guidance on how to critically evaluate evidence in the digital health industry). The behind-the-scenes decision-making in the scientific process makes it all too easy to strategically report that a study supports a specific claim about a product. To evaluate whether any evidence is truly credible, these behind-the-scenes decisions need to be made visible through the adoption of Open Science practices.
The Reproducibility Crisis and Open Science
Open Science is a set of principles and methods that promote transparent, reproducible, and accessible science1,2. It developed in response to the “reproducibility crisis”, in which an alarming proportion of published scientific findings have failed to be reproduced by new studies using the same methodologies. For instance, a landmark study estimated that fewer than half of the findings reported in three premier psychology journals could be replicated3. Such results have forced scientists into a brutally honest reflection on the scientific process to identify why findings are failing to replicate.
One dominant answer to this question is that science is a human endeavor. As humans, scientists face innate biases and pressures that influence how studies are conducted and reported2. Reporting that a study obtained favorable results can be rewarded with publication, increased status, job security, professional accolades, grant funding, and more. Research conducted by commercial entities faces even greater pressures, as scientific evidence can be critical to investor support and the success of the company by showing that the product in question actually ‘works’. Accordingly, a single study’s findings can make or break an entire company.
When these pressures and motivations meet the multiverse of different ways a study can be conducted, its data analysed, and its results reported, invalid or irreproducible findings can be disseminated. For example, it has been estimated that the data from a single neuroimaging experiment can be plausibly analysed in ~7,000 different ways4. If 6,990 of those analyses produce unfavorable results, a team could selectively report only the 10 favorable ones. Given the overwhelming pattern of unfavorable results, those 10 selectively reported findings could well be false positives. The selective publication of these 10 favorable results while hiding the 6,990 unfavorable ones would make invalid findings appear credible and bias an entire research area by promoting theories and work that take such results at face value. As scientists who are trusted sources of knowledge, we have a responsibility to report our scientific process and findings (positive, null, or negative) in a way that is visible and verifiable to the public, so that the true credibility of the reported results can be assessed. This is where Open Science comes in.
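To make the arithmetic concrete, here is a minimal, purely illustrative simulation (not from the article; all knobs and numbers are hypothetical choices). It generates experiments where the true effect is zero, tries a small “multiverse” of defensible analysis variants on each, and counts how often at least one variant comes out “significant”:

```python
# Illustrative sketch: how analytic flexibility plus selective reporting
# inflates false positives. Requires numpy and scipy.
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)  # fixed seed so the sketch reruns identically
n_per_group, n_experiments, alpha = 50, 1000, 0.05

# Three arbitrary analysis "knobs": outlier trimming, transform, subgroup.
cutoffs = [None, 2.0, 2.5]
transforms = [lambda x: x, np.tanh]
subsets = [slice(None), slice(0, 25), slice(25, None)]

def p_value(t, c, cutoff, transform, subset):
    t, c = t[subset], c[subset]
    if cutoff is not None:  # drop observations beyond the z-score cutoff
        t = t[np.abs(stats.zscore(t)) < cutoff]
        c = c[np.abs(stats.zscore(c)) < cutoff]
    return stats.ttest_ind(transform(t), transform(c)).pvalue

single_hits = multiverse_hits = 0
for _ in range(n_experiments):
    # Null data: the "product" has no effect at all.
    treat = rng.normal(0, 1, n_per_group)
    ctrl = rng.normal(0, 1, n_per_group)
    pvals = [p_value(treat, ctrl, co, tr, su)
             for co, tr, su in itertools.product(cutoffs, transforms, subsets)]
    single_hits += pvals[0] < alpha        # the one pre-specified analysis
    multiverse_hits += min(pvals) < alpha  # best of 18 variants, cherry-picked

print(f"Pre-specified analysis 'significant' in {single_hits / n_experiments:.0%} "
      f"of null experiments (expected ~{alpha:.0%})")
print(f"At least one of 18 variants 'significant' in "
      f"{multiverse_hits / n_experiments:.0%} of null experiments")
```

Even with only 18 variants, the rate of null experiments yielding at least one reportable “significant” result typically sits well above the nominal 5%; at the ~7,000 variants Carp estimated for a single neuroimaging experiment, the scope for cherry-picking is dramatically larger.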
Open Science Practices
The core mission of Open Science is to increase transparency in science by providing publicly available, verifiable documentation of the entire scientific process. To do this, Open Science encourages scientists to, where possible, engage in the following key practices5,6:
- Pre-registration: Publicly publishing a timestamped plan for a study before the data have been collected or examined (e.g., on a public repository such as osf.io). This serves as a public record of the study plan before the results are known or reported, making any deviations from the original plan in the final study report transparent to the public.
- Providing open access data, materials, and code: Publicly storing all study materials, data, and analysis code so that others can verify the reported study methods and findings.
- Providing resources for reproducible analyses: Supplying documentation and any resources needed to reproduce the exact study findings, alongside the open access data, materials, and analysis code (a minimal sketch of such a reproducible analysis script follows this list).
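As a sketch of what the last practice can look like in code, here is a hypothetical, self-contained example (all file, column, and path names are invented for illustration): a single shareable script with a pinned seed that runs the pre-registered analysis and writes out the results together with environment details.

```python
# Minimal reproducible-analysis sketch. A real release would pair this with a
# README and an exact dependency lockfile (e.g., requirements.txt).
import json
import os
import platform
import numpy as np
import pandas as pd
from scipy import stats

np.random.seed(2023)  # pin the seed so any stochastic step reruns identically

# In a real release this would load the de-identified data exactly as shared
# in the public repository, e.g. pd.read_csv("data/deidentified_outcomes.csv").
# Here we simulate a stand-in dataset so the sketch runs on its own.
n = 200
data = pd.DataFrame({
    "arm": np.repeat(["intervention", "control"], n // 2),
    "symptom_change": np.random.normal(0, 1, n),
})

# The single pre-registered primary analysis: compare symptom change by arm.
treat = data.loc[data["arm"] == "intervention", "symptom_change"]
ctrl = data.loc[data["arm"] == "control", "symptom_change"]
t_stat, p_val = stats.ttest_ind(treat, ctrl)

# Record results alongside environment details so others can verify the run.
results = {
    "t_statistic": round(float(t_stat), 3),
    "p_value": round(float(p_val), 4),
    "n_intervention": int((data["arm"] == "intervention").sum()),
    "n_control": int((data["arm"] == "control").sum()),
    "python": platform.python_version(),
    "numpy": np.__version__,
    "pandas": pd.__version__,
}
os.makedirs("results", exist_ok=True)
with open("results/primary_analysis.json", "w") as f:
    json.dump(results, f, indent=2)
print(results)
```

The point is not the specific analysis but the packaging: anyone with the shared data and this script can rerun the exact pipeline and verify that the reported numbers fall out of it.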
Open Science and Digital Therapeutic Companies
Such practices have slowly been integrated into university research over the past decade, and there is a clear need for the same integration in commercial research1. Digital therapeutic companies can have a direct influence on consumers’ health and therefore have a responsibility to ensure that they offer products backed by credible scientific evidence. Evidence claims made by digital therapeutic companies should be accompanied by visible and verifiable research processes, which can be accomplished through Open Science practices. First, and at a minimum, initial study plans and any deviations from them should be publicly documented, through public pre-registration of the initial study plan and subsequent acknowledgement of that plan, with an explanation of any deviations, in the final study report. Second, when ethically possible, de-identified study data should be publicly stored along with study materials and data analysis code so that the public and other researchers can independently access and verify any documented claims. Third, instructions on how to use the publicly stored data and analysis code to reproduce and verify claims should be provided, so that the public has both the access and the knowledge needed to investigate those claims. These practices will not only enhance the credibility of evidence claims; being open and honest about the scientific process behind such claims will also build public trust in the company and strengthen its brand reputation.
If digital therapeutic companies truly want to provide products backed by the most credible evidence and to empower the public to verify their claims, then they should begin adopting Open Science practices. Providing the highest standard of evidence-based digital therapeutics is key to SilverCloud Health’s mission, and SilverCloud Science sees the adoption of the Open Science practices detailed above as central to it. We encourage other commercial research teams to do the same, and we plan to release further collateral with more information about how others can implement Open Science tools, including a guide to completing pre-registration and the details of our own journey as a commercial research team adopting these principles. Ultimately, we are optimistic about the potential for Open Science practices to cultivate more transparent and trustworthy evidence generation. In turn, this should improve both the standards of care and outcomes for our end-users, and the standards and reputation of the digital health industry as a whole.
References
1. Evans, T. R., Branney, P., Clements, A., & Hatton, E. (2021). Improving evidence-based practice through preregistration of applied research: Barriers and recommendations. Accountability in Research, 1-21.
2. Munafò, M. R., Nosek, B. A., Bishop, D. V., Button, K. S., Chambers, C. D., Du Sert, N. P., ... & Ioannidis, J. P. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 1-9.
3. Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.
4. Carp, J. (2012). On the plurality of (methodological) worlds: estimating the analytic flexibility of fMRI experiments. Frontiers in Neuroscience, 6, 149.
5. Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., ... & Schulte-Mecklenbeck, M. (2019). Seven easy steps to open science. Zeitschrift für Psychologie.
6. Vazire, S. (2018). Implications of the credibility revolution for productivity, creativity, and progress. Perspectives on Psychological Science, 13(4), 411-417.