Research drives innovation in everything from data analytics to product design. But what if we cannot trust that the latest research is accurate? Two key features of good research are replicability, which means researchers can test a hypothesis against artifacts similar to those of the original study, and reproducibility, which means researchers can achieve the same results with the same tools and data. A few years ago, however, a landmark study attempted to replicate 100 psychology experiments and found that fewer than 40% of results matched the original findings. By those numbers, a considerable share of research results may not be as robust as we thought. Culverhouse’s Dr. Pratyush Sharma and collaborators Susanne Adler (Ludwig-Maximilians-University, Germany) and Dr. Lăcrămioara Radomir (Babeș-Bolyai University, Romania), in their September Journal of Business Research article “Toward Open Science in PLS-SEM: Assessing the State of the Art and Future Perspectives,” discuss a way forward: open science, an approach that emphasizes transparency and collaboration.
Many academic journals favor original, one-off research and do not reward researchers for making materials and data available for independent reuse or for conducting replication studies that build confidence in findings. Open science, by contrast, encourages researchers to make their methods clear and understandable and their tools and datasets accessible to other researchers. Studies can then be replicated more easily, boosting confidence in claims and making research findings more trustworthy.
Sharma and his colleagues focused on a popular statistical technique: Partial Least Squares Structural Equation Modeling, or PLS-SEM. To assess the extent to which open science practices are implemented in PLS-SEM-based studies, the authors conducted a comprehensive literature review of leading marketing journals. Unsurprisingly, the results showed little evidence of open science adoption and few replication studies. Also lacking were instances of preregistration, an open science practice in which researchers register their research questions, hypotheses, and methodology before analyzing data. To address this, the authors designed and provided a PLS-SEM-specific preregistration template that researchers can use to foster transparency in their analyses, thereby increasing confidence in their findings.
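To make the idea of a preregistration template concrete, here is a minimal sketch of what a machine-readable preregistration record for a PLS-SEM study might capture. The field names and example values are illustrative assumptions, not the authors' actual template; the point is that every element of the analysis is committed to before the data are examined.

```python
# Hypothetical sketch of a preregistration record for a PLS-SEM study.
# Field names are illustrative assumptions, not the authors' template.

PREREG_FIELDS = [
    "research_questions",  # questions stated before any data analysis
    "hypotheses",          # directional predictions to be tested
    "constructs",          # latent variables and their measured indicators
    "structural_model",    # hypothesized paths between constructs
    "sampling_plan",       # target population and sample-size rationale
    "analysis_plan",       # planned PLS-SEM settings (e.g., bootstrapping)
]

def is_complete(record: dict) -> bool:
    """Return True if every template field is filled in before analysis."""
    return all(record.get(field) for field in PREREG_FIELDS)

# Illustrative entry: two hypothetical constructs, A and B.
example = {
    "research_questions": ["Does construct A influence construct B?"],
    "hypotheses": ["H1: A has a positive effect on B"],
    "constructs": {"A": ["a1", "a2"], "B": ["b1", "b2"]},
    "structural_model": [("A", "B")],
    "sampling_plan": "target sample of consumers, size justified in advance",
    "analysis_plan": "path weighting scheme; bootstrap significance tests",
}
```

Registering a record like this with a public registry before data collection is what allows later readers to distinguish confirmatory tests from exploratory analysis.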
While preregistration and material sharing are currently voluntary, several journals are beginning to encourage or even require them. Researchers should therefore stay abreast of these coming changes rather than be caught off guard.
“Error correction is an important aspect of the scientific enterprise and open science practices can help achieve that goal more efficiently,” Sharma said.