While health services leaders rely heavily on information gathered via environmental scans (ESs) to guide strategic decision-making, formal guidance on how to conduct these scans is notably absent. The purpose of this study was to determine the level of agreement on essential components of a definition and a methodological framework for ESs. The goals were to (1) advance our working definition to a concept definition for ESs and (2) develop a methodological framework to guide health service researchers conducting ESs.
We used a real-time, modified Delphi survey, conducted in a virtual platform setting, to seek perspectives on statements related to ESs from individuals recruited for their verifiable experience in designing or conducting ESs in health services delivery research. Surveylet, an online survey tool, was used to facilitate asynchronous data collection and to determine the level of agreement on the statements, with an a priori threshold of 75% set for agreement on each statement.
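As a minimal illustration of the agreement rule described above, per-statement agreement can be checked as the proportion of panellists endorsing a statement against the 75% threshold. The sketch below is a hypothetical reconstruction for clarity only, not the Surveylet implementation, and the ratings shown are invented.

```python
# Illustrative sketch only: checking per-statement agreement against the
# a priori 75% threshold. This is NOT the Surveylet implementation;
# the ratings used below are hypothetical.

AGREEMENT_THRESHOLD = 0.75  # a priori threshold of 75% per statement


def agreement_met(ratings: list[bool], threshold: float = AGREEMENT_THRESHOLD) -> bool:
    """Return True if the proportion of panellists agreeing meets the threshold."""
    if not ratings:
        return False
    return sum(ratings) / len(ratings) >= threshold


# Hypothetical example: 16 of 21 panellists agree with a statement (~76%),
# so the statement would meet the 75% threshold.
example_ratings = [True] * 16 + [False] * 5
print(agreement_met(example_ratings))  # True
```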
Twenty-one panellists provided opinions on 59 statements related to a proposed ES definition and on 69 statements specific to components of a methodological framework for ESs.
Panellists from four countries participated in the survey, with 2 to ≥11 years of experience with ESs and 1 to ≥7 completed ESs. Agreement was achieved for 28 of the 59 statements related to the ES definition and for 51 of the 69 statements related to a methodological framework.
The agreement on many elements deemed essential for a definition of ESs supports the development of a proposed concept definition of ESs in health services delivery research. In addition, the agreement on components deemed necessary for a methodological framework will inform the future development of such a framework to guide stakeholders in the planning and implementation of ESs. These results provide a starting point for a common understanding of ESs in the field of health services delivery research.
Scoping reviews, mapping reviews and evidence and gap maps (collectively known as ‘big picture reviews’) in health continue to gain popularity within the evidence ecosystem. These big picture reviews are beneficial for policy-makers, guideline developers and researchers in the field of health for understanding the available evidence, its characteristics and concepts, and the research gaps, an understanding often needed to support the development of policies, guidelines and practice. However, these reviews often face criticism for poor and inconsistent methodological conduct and reporting. There is a need to understand which areas of these reviews require further methodological clarification and exploration. The aim of this project is to develop a research agenda for scoping reviews, mapping reviews and evidence and gap maps in health by identifying and prioritising specific research questions related to methodological uncertainties.
A modified e-Delphi process will be adopted. Participants (anticipated N=100) will include patients, clinicians, the public, researchers and others invested in creating a strategic research agenda for these reviews. This Delphi will be completed in four consecutive stages, including a survey to collect the methodological uncertainties for each of the big picture reviews, the development of research questions based on that survey, and two further surveys and four workshops to prioritise those research questions.
This study was approved by the University of Adelaide Human Research Ethics Committee (H-2024-188). The results will be communicated through open-access peer-reviewed publications and conferences. Videos and infographics will be developed and placed on the JBI (previously Joanna Briggs Institute) Scoping Review Network webpage.