This study aimed to understand the knowledge possessed by informal medicine vendors regarding antibiotics and antibiotic resistance, identify the perceptions held by informal medicine vendors about antibiotics and their uses, and examine the practices employed by informal medicine vendors in the sale and distribution of antibiotics.
Exploratory qualitative study using semi-structured interviews and direct observations.
Markets and shops across 11 villages in the Nanoro health district, Burkina Faso.
23 informal medicine vendors, aged between 25 and 55 years and with 8–30 years of experience, were recruited through snowball sampling in the Nanoro health district of Burkina Faso.
Informal medicine vendors exhibited a limited understanding of antibiotics, often confusing them with other treatments and referring to them using local terminologies based on perceived use and effectiveness. Antibiotics were perceived as universal remedies, supported by therapeutic belief, empirical reasoning and community solidarity, with empirical diagnosis, approximate dosing and informal preparation techniques passed on through imitation. These findings emerged across themes including perceptions, symbolic attributes and sales practices.
Informal medicine vendors in rural Burkina Faso demonstrated limited understanding of antibiotics and antimicrobial resistance, with practices shaped by local beliefs and empirical experience. These findings underscore the need for context-sensitive interventions that include tailored education and regulatory engagement to improve antibiotic stewardship and mitigate the spread of resistance.
Cerebral palsy (CP) is the most common motor disability in children, with higher prevalence in low-income and middle-income countries (LMICs) compared with high-income countries (HICs). Children with CP (CwCP) often face significant challenges in achieving toileting independence due to motor, sensory and cognitive impairments. Parents play a pivotal role in managing these challenges, often encountering significant emotional, physical and social burdens. Despite the importance of toileting for autonomy and dignity, limited evidence exists on tailored toilet training programmes for CwCP, especially in LMICs. Understanding parental perspectives is essential to addressing these gaps and informing family-centred interventions.
This scoping review aims to explore parents' perspectives on toileting management for CwCP, focusing on strategies, challenges and unmet needs, to inform future research and the development of supportive interventions. This scoping review will be conducted in accordance with the guidelines of the Joanna Briggs Institute and summarised using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews framework. English-language full-text articles published between January 2014 and December 2024 that address parental perspectives, challenges and strategies related to toileting in CwCP below 18 years of age will be included. Systematic searches will be conducted through PubMed, EMBASE, CINAHL, Scopus, Web of Science and grey literature. The data will be extracted and analysed thematically using Microsoft Excel.
The present protocol has been registered in the Open Science Framework (DOI: https://doi.org/10.17605/OSF.IO/73YQZ). Ethical approval is not required, as this review uses secondary data from published studies and does not involve direct participant recruitment. The findings will synthesise themes related to parents’ strategies, challenges and expectations in toileting management for CwCP. They will help address existing literature gaps and inform the development of practical, evidence-informed toileting education programmes for parents.
This study aims to identify the inequalities encountered during the nursing doctoral process and propose potential solutions.
This study was conducted as qualitative and descriptive research.
The study was conducted between January and February 2025 with 18 students who were pursuing doctoral education in nursing and working as nurses in Türkiye. Data were collected through face-to-face interviews using an introductory information form and a semi-structured interview guide. The data were analysed using an inductive content analysis method.
Four main themes emerged from the study: 'Challenges: Dancing at Two Weddings at the Same Time', 'Academic Competition and Inequalities', 'Glass Ceilings in Hospitals' and 'Solution Strategies for Inequalities'.
The study highlighted the inequalities experienced by students who were simultaneously continuing their doctoral education while working as nurses in the same clinical setting.
Faculty members providing doctoral education could organise meetings to help students overcome the challenges they face during their educational process. Nurse managers, on the other hand, should organise regular meetings where nurses can share the difficulties they encounter and offer suggestions for improving their units.
The Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist was used for reporting the study.
No patient or public contribution.
To explore whether routine electronic healthcare records can be used to identify triggers for initiating advance care planning (ACP) and the optimal time window to initiate ACP. We aimed to assess the prevalence of triggers for initiating ACP as defined for use in routine data, whether their presence is associated with death, and what their position is relative to a previously identified ‘optimal time window for ACP’.
Nested case-control study within a large dynamic population cohort dataset.
Primary care population-based, anonymised data extracted from GP centres in the South Holland province, The Netherlands.
We selected records of individuals aged ≥65 registered with their general practice from 1 Jan 2014 to 1 Jan 2017. Cases were individuals who died between 1 Jan 2017 and 1 Jan 2020. Controls were individuals who remained alive. Cases were matched by age to controls in a 1:4 ratio.
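The 1:4 age-matching step described above can be sketched in a few lines. This is a generic illustration, not the study's actual procedure: the record keys (`id`, `age`) and the random sampling of eligible controls are assumptions.

```python
import random

def match_controls(cases, controls, ratio=4, seed=0):
    """Match each case to up to `ratio` controls of the same age.

    Illustrative sketch only: `cases`/`controls` are lists of dicts with
    hypothetical keys 'id' and 'age'; controls are drawn without replacement.
    """
    rng = random.Random(seed)
    pool = {}
    for c in controls:
        pool.setdefault(c["age"], []).append(c)
    matched = {}
    for case in cases:
        candidates = pool.get(case["age"], [])
        picked = rng.sample(candidates, min(ratio, len(candidates)))
        for p in picked:
            candidates.remove(p)  # sample controls without replacement
        matched[case["id"]] = [p["id"] for p in picked]
    return matched
```

In practice such matching is often done with dedicated statistical software; the sketch only shows the 1:4 logic.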
Outcomes include prevalence of triggers for ACP in the records of deceased and living individuals; association of the triggers’ presence with death; timing of the identified triggers in deceased individuals relative to the ‘optimal time window for ACP’.
We included 17 098 records, 4139 from deceased individuals (mean age 81) and 12 959 from living individuals (mean age 79). Triggers most strongly associated with death were consultations concerning malignancy (OR 8.35, 95% CI 7.42 to 9.41), hospital admissions (OR 7.32, 95% CI 6.75 to 7.94), emergency department referrals (OR 7.11, 95% CI 6.52 to 7.75), registered home visits (OR 5.97, 95% CI 5.51 to 6.47), consultations concerning heart failure (OR 5.25, 95% CI 4.59 to 5.99), dementia (OR 4.75, 95% CI 3.99 to 6.56), opioid prescriptions (OR 4.58, 95% CI 4.25 to 4.93), consultations concerning general decline/feeling old (OR 4.15, 95% CI 3.72 to 4.64) and skin ulcers/pressure sores (OR 4.04, 95% CI 3.55 to 4.61). Those closest to the median of the optimal time window for ACP were consultations regarding dyspnoea, general decline/feeling old, heart failure, skin ulcers/pressure sores and fever, opioid prescriptions, emergency department referrals, registered home visits and hospital admissions.
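An odds ratio with its Wald 95% confidence interval, as reported above, comes from a 2×2 exposure-by-outcome table. A minimal sketch with purely illustrative counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table.

    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    Counts are illustrative; real analyses handle zero cells separately.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20 exposed cases, 10 unexposed cases,
# 5 exposed controls, 25 unexposed controls
print(odds_ratio_ci(20, 10, 5, 25)[0])  # 10.0
```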
Clinical triggers for initiating ACP in general practice can be recognised within the routine electronic health records and they align well with the ‘window of opportunity’ to initiate ACP.
by Denise Skiadopoulos, Alaina Bandstra, Valerie Kattouf, Corina van de Pol, Vivek Labhishetty, Sumeer Singh
Objective: Heterophoria is routinely measured during a comprehensive ocular examination. The aim of the current study is to compare the inter-examiner repeatability of the Neurolens measurement device (nMD), a commercially available instrument that objectively assesses phoria, to the inter-examiner repeatability of the prism alternating cover test and the von Graefe method.
Methods: 91 young adults aged between 18 and 60 years were enrolled. Two experienced optometrists assessed phoria on each subject using three methods: the von Graefe method (VG), prism alternating cover test (PCT) and nMD. VG and PCT were performed at distance (6 m) and near (40 cm). The nMD measurements were obtained using virtual distance (6 m) and near (50 cm) targets. All the tests were performed in a single session by both examiners in a randomized order.
Results: All study participants were students, staff, and faculty of the Illinois College of Optometry. Of the 91 participants recruited, 65 were female. All participants completed the study with no missing data. The repeatability analysis showed nMD (distance: 0.69 ± 0.77 PD; near: 1.00 ± 0.98 PD) to have the smallest mean absolute difference at both distance and near compared to VG (distance: 3.28 ± 3.18 PD; near: 4.48 ± 3.99 PD) and PCT (distance: 1.50 ± 2.36 PD; near: 4.05 ± 3.69 PD). Bland–Altman analysis showed that the phoria measurements from nMD exhibited significantly less variability when compared with VG and PCT.
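The two repeatability metrics used above, mean absolute inter-examiner difference and Bland–Altman 95% limits of agreement, can be sketched generically. This is a standard formulation, not necessarily the study's exact analysis pipeline:

```python
from statistics import mean, stdev

def repeatability(ex1, ex2):
    """Mean absolute difference and Bland-Altman 95% limits of agreement
    for paired measurements from two examiners (generic sketch)."""
    diffs = [a - b for a, b in zip(ex1, ex2)]
    mad = mean(abs(d) for d in diffs)  # mean absolute difference
    bias = mean(diffs)                 # mean difference (systematic bias)
    half_width = 1.96 * stdev(diffs)   # half-width of the limits
    return mad, (bias - half_width, bias + half_width)
```

Smaller mean absolute differences and narrower limits, as reported for nMD, indicate better inter-examiner agreement.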
Conclusions: The Neurolens measurement device (nMD) has the highest inter-examiner repeatability when compared to the traditional VG and PCT methods. Given that its measurements are objective and more repeatable than those of the two traditional methods, this device has the potential to be a useful addition to clinical practice.
The Munich Security Conference 2024 highlighted the complex connections between climate change and global security risks. Engaging students in fighting climate change is a stepping stone to achieving the Sustainable Development Goals.
To investigate the effect of a video-based climate change program on revitalising eco-cognizance, emotional response, and self-efficacy among nursing students in rural communities.
A randomised controlled trial research design was adopted.
A total of 140 nursing students completed a survey related to the Climate Change Perceptions, the Climate Change Anxiety Scale, and the Environmental Self-Efficacy Scale. The study group engaged in the video-based climate change program, while the comparison group received flyers related to climate change across the globe.
The intervention group significantly improved climate change perception and environmental self-efficacy compared to the control group, with large effect sizes. In addition, significantly lower levels of cognitive impairment due to climate change anxiety were recorded in the intervention group than in the control group.
Our intervention improved nursing students' climate change literacy, pro-environmental attitudes and environmental self-efficacy, and reduced their climate change anxiety. Future research may target a variety of university majors and use RCTs nested in a mixed-method design to capture the student experience with climate change before and after the RCT.
This study demonstrated that a comprehensive educational program significantly improved climate literacy, pro-environmental attitudes, and environmental self-efficacy among undergraduate nursing students while reducing climate anxiety. The findings of this study offer valuable insights for enhancing student nurses' ability to translate their scientific understanding into informed decision-making regarding issues like climate change.
Drastic natural disasters, including extreme temperatures, flooding, wildfires and snow and sandstorms, significantly affect populations, including nursing students. Early screening and management of climate change anxiety among university students is recommended as a buffer against upcoming mental health issues. Student counselling services are urged to consider the effect of climate change as a mental health parameter that significantly affects students' psychological and, consequently, academic life and progress. A video-based climate change program (VBCCP) is beneficial for equipping students with climate change literacy. The revitalization of the participant's overall eco-emotional response, pro-environmental behaviour and cognizance signalled the potential of VBCCP as a simulation teaching tool that might be integrated into nursing curriculums. Additionally, VBCCP is a cost-effective strategy that complies with International Nursing Association for Clinical Simulation and Learning (INACSL) requirements. The VBCCP can be delivered in the conventional classroom environment or through the digital platform without incurring additional costs and in alignment with the definition of simulation provided by the Agency for Healthcare Research and Quality.
No public or patient contributions.
RCT registration: NCT06223412, registered on 23 January 2024.
by Oumarou I. Wone Adama, Iman Frédéric Youa, Alexandra Bitty-Anderson, Arnold Junior Sadio, Rogatien Comlan Atoun, Yao Rodion Konu, Hezouwe Tchade, Martin Kouame Tchankoni, Kokou Herbert Gounon, Kparakate Bouboune Kota-Mamah, Abissouwessim Egbare Tchade, Godonou Amivi Mawussi, Fiali Ayawa Lack, Fifonsi Adjidossi Gbeasor-Komlavi, Anoumou Claver Dagnra, Didier Koumavi Ekouevi
Introduction: In Togo, the syndromic approach is used for the diagnosis and management of sexually transmitted infections (STIs). The aim of this study was to evaluate the syndromic approach for the diagnosis of STIs among female sex workers (FSW) in Lomé, Togo.
Methods: A cross-sectional study was carried out from September to October 2023 among FSW in Lomé (Togo). FSW aged 18 years and above were included. A gynecological examination was performed for syndromic diagnosis, and the Xpert® CT/NG assay was used to screen vaginal swabs for Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (NG). The performance (predictive values) of the syndromic approach to STI diagnosis was evaluated using the Xpert® CT/NG test as the gold standard.
Results: A total of 357 FSW were recruited. The median age of FSW was 32 years (IQR 26–40 years) and 8.2% had attained a higher level of education. The prevalence of syndromic STI among FSW was 33.3%. Vaginal swabs were positive for CT (8.4%) and NG (8.7%), with a prevalence of bacterial STIs (CT and/or NG) of 14.3%. The syndromic approach to STI diagnosis demonstrated a positive predictive value of 24.3%.
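The predictive values reported above are derived by cross-tabulating the syndromic classification against the gold-standard test. A generic sketch with hypothetical counts (the abstract reports only the resulting PPV of 24.3%):

```python
def predictive_values(tp, fp, fn, tn):
    """Diagnostic performance of a screening approach against a gold standard.

    tp/fp = screen-positive with/without gold-standard disease,
    fn/tn = screen-negative with/without gold-standard disease.
    Counts here are hypothetical, for illustration only.
    """
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return ppv, npv, sensitivity, specificity

# Hypothetical 2x2 table: 30 true positives, 70 false positives,
# 20 false negatives, 80 true negatives
print(predictive_values(30, 70, 20, 80)[0])  # 0.3
```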
Conclusion: The prevalence of STIs is relatively high among FSW in Lomé. According to this study, the diagnosis of STIs using the syndromic approach has limited relevance. National STI screening and management policies urgently need to be rethought, incorporating recent technological advances.
by Adela Klezlova, Petr Bulir, Alexandr Stepanov, Andrea Sidova, Magdalena Netukova, Jana Vranova, Katarina Urbaniova, Martina Grajciarova, Lenka Vankova, Zbynek Tonar, Pavel Studeny
Purpose: The purpose of the study is to evaluate the effectiveness, surgical perioperative and postoperative complications, and histopathological findings after implantation of the new nanofiber glaucoma drainage implant (GDI), and to optimize the surgical technique.
Method: Implantation of the GDI, a unique nanofiber drainage device fabricated from polyvinylidene fluoride (PVDF) using the well-established electrospinning technology on the Nanospider™ platform, was first optimized in vitro on cadaver porcine bulbs before the initial in vivo implantations. PVDF was selected due to its favorable properties, including biocompatibility, anti-adhesive behavior, and mechanical stability, which are particularly advantageous in minimizing fibroblast colonization and fibrotic encapsulation. The Nanospider™ technology allows for reproducible, large-scale fabrication of nanofiber materials with controlled fiber morphology, which ensures uniformity and precision of implant dimensions. An in vivo study on 28 normotensive eyes from 14 laboratory New Zealand White rabbits was conducted. There were two groups of animals: the study group (14 eyes) and the control group (14 contralateral eyes). The study group underwent implantation of the new nanofiber GDI; the control group did not undergo any surgical procedure. Intraocular pressure (IOP) was measured preoperatively and at regular times postoperatively (Tono-Pen AVIA®). Perioperative and immediate postoperative complications were monitored. Histological quantification was performed using unbiased sampling and stereological methods to assess leukocyte infiltration, type I and type III collagen fractions, and both absolute and relative levels of inflammation.
Results: Based on the previous results and in vitro surgical experience, the implant was narrowed to 2.0 mm, a thickness of 100 µm was chosen, and the implant was fixed with two scleral stitches to maintain its position. No serious perioperative complications occurred during the in vivo experiments. There was one extrusion of the glaucoma implant noted after surgery, likely due to insufficient conjunctival fixation. This animal was excluded from both the study and the control groups. No serious instances of intraocular hypotension were observed after surgery. All animals tolerated the surgical procedure well, and the postoperative period was without any serious issues. In the study group, the average preoperative IOP was 13.6 mmHg (±4.1, n = 13). The average postoperative IOP on the first day, one, two, and three weeks, and one month after surgery decreased to 8.8 mmHg (±3.3, n = 13), 9.8 mmHg (±2.0, n = 13), 10.3 mmHg (±3.6, n = 13), 10.2 mmHg (±2.6, n = 13), and 9.7 mmHg (±2.0, n = 13), respectively. In the control group of contralateral eyes, the average preoperative IOP was 11.42 mmHg (±4.2, n = 13). The average postoperative IOP was 11.8 mmHg (±5.4, n = 13), 14.2 mmHg (±4.6, n = 13), 14.5 mmHg (±3.4, n = 13), 14.0 mmHg (±3.8, n = 13), and 14.2 mmHg (±2.4, n = 13), respectively, at the same follow-ups. In the study group, the IOP was statistically significantly lower by 29% at the end of the follow-up compared to the preoperative measurements (p = 0.009). Eyes with the implant showed greater leukocyte infiltration and less type I collagen compared to the group without implants. The ratio of type I to type III collagen was lower in the implant group, indicating delayed maturation and weaker connective tissue during early healing.
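The reported 29% reduction is consistent with the study-group means above, assuming it is the relative change in mean IOP from baseline to one month:

```python
def percent_reduction(pre, post):
    """Relative reduction from the pre- to the post-operative mean, in %."""
    return 100 * (pre - post) / pre

# Study-group means reported above: 13.6 mmHg preoperatively vs 9.7 mmHg at one month
print(round(percent_reduction(13.6, 9.7)))  # 29
```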
Conclusion: For easier implantation, minor technical adjustments such as implant narrowing and scleral fixation of the GDI were developed and tested in in vitro experiments. In vivo implantation of the unique nanofiber GDI appeared safe and technically well-suited for our study. No serious perioperative or postoperative complications were observed. There was one scleral extrusion of the device, which was, in our opinion, caused by insufficient conjunctival fixation. A statistically significant IOP reduction was achieved at the end of the follow-up in the study group with implanted GDIs. Further studies on the effectiveness of the implant with longer monitoring periods, together with other surgical options such as combined cataract surgery and nanofiber GDI, are needed.
by Edlin Garcia Colato, Aijia Yuan, Sagar Samtani, Bernice A. Pescosolido
Given the prevalence of depression among young adults, particularly those aged 18–25, this study aims to address a critical need in higher education institutions for proactive, private, automated mental health self-awareness. This study protocol outlines how a mobile phone application will leverage sensor signal and survey data to develop an automated screening tool for depressive behaviors. By analyzing sensor-based behavioral data through deep learning techniques, the proposed study seeks to identify students exhibiting depressive symptoms and their specific behaviors. Approximately 1,000 first-year undergraduate students (age 18 and above) will be recruited from two public US universities, one in the Midwest and one in the Southwest. For the midwestern university, there will be 11 surveys (baseline, nine follow-ups, and an endline) collected throughout a single academic year (2024–2025). However, at the southwestern university, only nine surveys will be administered during a semester. Simultaneously, sensor-based behavioral data on behaviors such as physical activity, social interactions, and sleep will be continuously collected passively. The main analysis will focus on understanding the relationships between human behaviors captured by sensor-based behavioral data and self-reported mental health surveys. Machine learning and deep learning algorithms will be used to uncover key behavioral patterns most indicative of mental disorders such as depression.

Subarachnoid haemorrhage (SAH) is relatively frequent, accounting for 5% of strokes, and affects a young population. Arterial vasospasm is a frequent complication of SAH, with an estimated incidence as high as 70%. Vasospasm is responsible for cerebral ischaemia, which in turn is potentially responsible for severe morbidity (neurological deficit, neuropsychiatric disorders), poor quality of life (institutionalisation, inability to return to work) and increased mortality.
Treatment with intravenous milrinone, an arterial vasodilator, has been proposed, but no randomised controlled study exists. We hypothesised that an intravenous infusion of milrinone would improve the neurological recovery of patients with vasospasm following aneurysmal SAH at 3 months.
The MiVAR (Milrinone Infusion for VAsospasm treatment in subarachnoid hemoRrhage) study is an investigator-initiated, phase III multicentre, randomised placebo-controlled, double-blinded, superiority trial evaluating the effect of intravenous milrinone versus placebo (saline) in patients with cerebral vasospasm following aneurysmal SAH. Patients will be included within 6 hours of the confirmation of vasospasm diagnosis by a CT angiography and randomised to receive either milrinone (initial bolus of 0.1 mg/kg over 30 min—max 10 mg—followed by a continuous infusion at a 1 µg/kg/min rate for at least 48 hours) or placebo. Milrinone (or placebo) could be increased to 1.5 µg/kg/min. The dose is adapted according to the clinical and/or transcranial Doppler response. 360 patients are expected to be included. The primary endpoint is the proportion of patients with a good neurological outcome at 3 months, defined as a modified Rankin score ≤2, obtained through a centralised standardised telephone interview (conducted by a single trained team). The study started in August 2020, and the expected final follow-up is the last quarter of 2025. Analyses of the intention-to-treat and per-protocol populations are planned.
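As a rough numeric illustration of the dosing rules above (arithmetic only, not clinical guidance), the weight-based bolus cap and the starting infusion rate can be computed as:

```python
def milrinone_doses(weight_kg, rate_ug_kg_min=1.0):
    """Illustrative arithmetic for the protocol's dosing rules:
    a 0.1 mg/kg bolus capped at 10 mg, then a continuous infusion at
    rate_ug_kg_min (1 ug/kg/min per protocol, escalatable to 1.5),
    expressed here in mg/h."""
    bolus_mg = min(0.1 * weight_kg, 10.0)
    infusion_mg_per_h = rate_ug_kg_min * weight_kg * 60 / 1000
    return bolus_mg, infusion_mg_per_h

# An 80 kg patient: 8 mg bolus, 4.8 mg/h at the starting rate
print(milrinone_doses(80))  # (8.0, 4.8)
```

Note how the 10 mg cap binds only above 100 kg (0.1 mg/kg × 100 kg = 10 mg).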
The MiVAR trial protocol has been approved by an ethics committee (Comité de Protection des Personnes Ouest V), by the Agence Nationale de Sécurité du Médicament (ANSM, Number 160 828A-21, approval date 26 December 2019) and by the ‘Commission Nationale Informatique et Liberté’ (CNIL, decision DR-2020-076, approval date 21 February 2020). The study will be conducted according to the principles of the Declaration of Helsinki and the Good Clinical Practice guidelines. The results will be disseminated through presentation at scientific conferences and publication in peer-reviewed journals. The MiVAR study will be the first multicentre randomised study to evaluate the efficacy of intravenous milrinone in improving the neurological outcomes at 3 months in patients with vasospasm following aneurysmal SAH.
NCT04362527, EudractCT number 2019-002145-37.
This study explores the potential of a generative artificial intelligence tool (ChatGPT) as clinical support for nurses. Specifically, we aim to assess whether ChatGPT can demonstrate clinical decision-making equivalent to that of expert nurses and novice nursing students. This will be evaluated by comparing ChatGPT responses to clinical scenarios to those of nurses on different levels of experience.
This is a cross-sectional study.
Emergency room registered nurses (i.e. experts; n = 30) and nursing students (i.e. novices; n = 38) were recruited during March–April 2023. Clinical decision-making was measured using three validated clinical scenarios involving an initial assessment and reevaluation. Clinical decision-making aspects assessed were the accuracy of initial assessments, the appropriateness of recommended tests and resource use and the capacity to reevaluate decisions. Performance was also compared by timing response generations and word counts. Expert nurses and novice students completed online questionnaires (via Qualtrics), while ChatGPT responses were obtained from OpenAI.
Concerning aspects of clinical decision-making and compared to novices and experts: (1) ChatGPT exhibited indecisiveness in initial assessments; (2) ChatGPT tended to suggest unnecessary diagnostic tests; (3) when new information required re-evaluation, ChatGPT responses demonstrated inaccurate understanding and inappropriate modifications. In terms of performance, the mean number of words in ChatGPT answers was 27–41 times greater than that of both experts and novices, and responses were provided approximately four times faster than those of novices and twice as fast as those of expert nurses. ChatGPT responses maintained logical structure and clarity.
A generative AI tool demonstrated indecisiveness and a tendency towards over-triage compared to human clinicians.
The study shows that it is important to approach the implementation of ChatGPT as a nurse's digital assistant with caution. More study is needed to optimize the model's training and algorithms to provide accurate healthcare support that aids clinical decision-making.
This study adhered to relevant EQUATOR guidelines for reporting observational studies.
Patients were not directly involved in the conduct of this study.
To describe the co-creation of the ‘Desired Dementia Care Towards End of Life’ (DEDICATED) approach to improve person-centred palliative care for individuals with dementia and to describe the experiences of healthcare professionals during the approach's implementation.
A needs assessment, comprising both qualitative and quantitative studies, informed palliative care needs of healthcare professionals, family caregivers and individuals with dementia. The approach was co-created with healthcare and education professionals, guided by the findings. Then, healthcare professionals were trained to implement the approach in their organizations. From April to June 2022, semi-structured interviews with actively engaged professionals were analysed using Conventional Content Analysis.
The needs assessment yielded six key themes: (1) raising palliative care awareness, (2) familiarization with a person with dementia, (3) communication about future care preferences, (4) managing pain and responsive behaviour, (5) enhancing interprofessional collaboration in advance care planning and (6) improving interprofessional collaboration during transitions to nursing homes. Interviews with 17 healthcare professionals revealed that active involvement in co-creating or providing feedback facilitated implementation. Overall, the DEDICATED approach was perceived as a valuable toolkit for optimizing palliative care for people with dementia and their loved ones.
Co-creating the DEDICATED approach with healthcare professionals facilitated implementation in daily practice. The approach was considered helpful in enhancing person-centred palliative dementia care.
This study underscores the importance of active involvement of healthcare professionals in the research and development of new interventions or tools for palliative care, which can influence the successful implementation, dissemination and sustained usage of the developed tools.
The developed approach can improve person-centred palliative care for individuals with dementia, ultimately improving their quality of life and that of their loved ones.
This study used the Consolidated Criteria for Reporting Qualitative Research.
No patient or public contribution.
Systematic literature reviews (SLRs) are essential for synthesising research evidence and guiding informed decision-making. However, SLRs require significant resources and substantial efforts in terms of workload. The introduction of artificial intelligence (AI) tools can reduce this workload. This study aims to investigate the preferences in SLR screening, focusing on trade-offs related to tool attributes.
A discrete choice experiment (DCE) was performed in which participants completed 13 or 14 choice tasks featuring AI tools with varying attributes.
Data were collected via an online survey, where participants provided background on their education and experience.
Professionals who have published SLRs registered on Pubmed, or who were affiliated with a recent Health Economics and Outcomes Research conference were included as participants.
The use of a hypothetical AI tool in SLRs with different attributes was considered by the participants. Key attributes for AI tools were identified through a literature review and expert consultations. These attributes included the AI tool’s role in screening, required user proficiency, sensitivity, workload reduction and the investment needed for training. Primary outcome measures: The participants’ adoption of the AI tool, that is, the likelihood of preferring the AI tool in the choice experiment, considering different configurations of attribute levels, as captured through the DCE choice tasks. Statistical analysis was performed using conditional multinomial logit. An additional analysis was performed by including the demographic characteristics (such as education, experience with SLR publication and familiarity with AI) as interaction variables.
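Under the conditional (multinomial) logit used in the analysis, the probability of choosing an alternative is the softmax of its attribute-based utility. A minimal sketch; the utility values fed in would come from the fitted attribute coefficients, and any numbers used here are purely illustrative:

```python
import math

def choice_probabilities(utilities):
    """Conditional-logit choice probabilities: P(j) = exp(V_j) / sum_k exp(V_k).

    `utilities` holds the deterministic utilities V_j = x_j . beta of the
    alternatives in one choice task (illustrative values, not fitted ones).
    """
    m = max(utilities)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Two identical alternatives are chosen with equal probability
print(choice_probabilities([0.0, 0.0]))  # [0.5, 0.5]
```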
The study received responses from 187 participants with diverse experience in performing SLRs and AI use. Familiarity with AI was generally low, with 55.6% of participants being (very) unfamiliar with AI. In contrast, intermediate proficiency in AI tools was positively associated with adoption (p=0.030). Similarly, workload reduction was also strongly linked to adoption.
The findings suggest that workload reduction is not the only consideration for SLR reviewers when using AI tools. The key to AI adoption in SLRs is creating reliable, workload-reducing tools that assist rather than replace human reviewers, with moderate proficiency requirements and high sensitivity.
To evaluate the cost-effectiveness of anti-vascular endothelial growth factor (VEGF) treatments for neovascular age-related macular degeneration (nAMD) using a value-based model that considers drug durability, dosing regimens and real-world administration strategies, including safe vial fractionation.
Model-based pharmacoeconomic analysis using data from randomised clinical trials and network meta-analyses. Analysis conducted from the payer perspective using cost data from the Spanish National Health System.
A model-based analysis compared five anti-VEGF agents—innovator and biosimilar ranibizumab, aflibercept 2 mg, brolucizumab and faricimab—across three dosing regimens: fixed, Pro Re Nata and Treat-and-Extend (TAE). Administration formats included single-use vials, prefilled syringes and vial fractionation (VF), with or without dead-space-free (DSF) syringes to minimise waste. The primary outcome was cost per optimal responder, defined as a patient gaining ≥15 Early Treatment Diabetic Retinopathy Study (ETDRS) letters, with and without adverse events. Cost-effectiveness was evaluated using Number Needed to Treat (NNT), Net Efficacy Adjusted for Risk-NNT (adjusted for safety) and incremental cost-effectiveness ratios. Secondary outcomes included the number of treated patients and optimal responders achievable within a fixed €1 000 000 budget.
The most cost-effective strategy was aflibercept 2 mg under a TAE regimen using DSF VF, with a total cost of €6214 per patient and a cost per optimal responder of €27 155. Under a fixed budget of €1 000 000, this approach allowed treatment of 160 patients, yielding 36 optimal responders. Faricimab with DSF VF ranked second, with a total cost of €5847 and a cost per optimal responder of €28 652, treating 171 patients and achieving 34 responders. In contrast, single-use vials without VF led to substantially higher total costs (eg, €11 305 for aflibercept TAE) and lower treatment capacity (eg, 88 patients treated).
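The fixed-budget figures above follow from dividing the budget by the per-patient and per-responder costs and rounding down (currency assumed to be euros, given the Spanish National Health System cost data); the paper's full model may compute responders differently, but this simple sketch reproduces the reported counts:

```python
def budget_capacity(budget, cost_per_patient, cost_per_responder):
    """Patients treatable and optimal responders achievable under a fixed
    budget, by integer division of the budget by the reported unit costs."""
    return budget // cost_per_patient, budget // cost_per_responder

# Aflibercept 2 mg TAE with DSF vial fractionation
print(budget_capacity(1_000_000, 6214, 27_155))  # (160, 36)
# Faricimab with DSF vial fractionation
print(budget_capacity(1_000_000, 5847, 28_652))  # (171, 34)
```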
This model demonstrates that combining durable agents, extended dosing intervals and optimised delivery strategies (eg, prefilled syringes and DSF VF) can substantially improve the cost-effectiveness and sustainability of anti-VEGF therapy in public health systems.
by Andrea C. Aplasca, Peter B. Johantgen, Christopher Madden, Kilmer Soares, Randall E. Junge, Vanessa L. Hale, Mark Flint
Amphibian skin is integral to normal physiological processes in the body and supports both innate and adaptive immunity against pathogens. The amphibian skin microbiota comprises a complex assemblage of microbes and is shaped by internal host characteristics and external influences. Skin disease is a significant source of morbidity and mortality in amphibians, and increasing research has shown that the amphibian skin microbiota is an important component of host health. The Eastern hellbender (Cryptobranchus alleganiensis alleganiensis) is a giant salamander declining in many parts of its range, and captive-rearing programs are important to hellbender recovery efforts. Survival rates of juvenile hellbenders in captive-rearing programs are highly variable, and mortality rates are overall poorly understood. Deceased juvenile hellbenders often present with low body condition and skin abnormalities. To investigate potential links between the skin microbiota and body condition, we collected skin swab samples from 116 juvenile hellbenders and water samples from two holding tanks in a captive-rearing program. We used 16S rRNA gene sequencing to characterize the skin and water microbiota and observed significant differences in the skin microbiota by weight class and tank. The skin microbiota of hellbenders housed in tanks in close proximity was generally more similar than that of hellbenders housed farther apart. A single taxon, Parcubacteria, was differentially abundant by weight class only and was observed in higher abundance in low-weight hellbenders. These results suggest a specific association between this taxon and low-weight hellbenders. Additional research is needed to investigate how husbandry factors and potential pathogenic organisms, such as Parcubacteria, impact the skin microbiota of hellbenders and ultimately morbidity and mortality in the species.
by Mireia Solé Pi, Luz A. Espino, Péter Szenczi, Marcos Rosetti, Oxána Bánszegi
A long-standing question in the study of quantity discrimination is which stimulus properties control choice. While some species have been found to discriminate based on the total amount of the stimuli, without using numerical information, others rely on numerosity rather than any continuous magnitude. Here, we tested cats, dogs, and humans using a simple two-way spontaneous choice paradigm (food for the cats and dogs, images for the humans) to see whether numerosity or total surface area has a greater influence on their decision. We found that cats showed a preference for the larger amount of food when the ratio between the stimuli was 0.5, but not when it was 0.67; dogs did not differentiate between stimuli presenting the two options (smaller vs. larger amount of food) regardless of the ratio between them, but humans did so almost perfectly. When faced with two stimuli of the same area but different shapes, dogs and humans exhibited a preference for certain shapes, particularly the circle, while cats’ choices seemed to be at chance level. Furthermore, cats’ and dogs’ reaction times were equal across conditions, while humans were quicker when choosing between stimuli of the same shape but different surface areas, and even more so when asked to choose between two differently sized circles. The results suggest that there is no universal rule for processing quantity; rather, quantity estimation seems to be tied to the ecological context of each species. Future work should focus on testing quantity estimation in different contexts and with different sources of motivation.
by Catarina Simões, Diana S. Vasconcelos, Raquel Xavier, Xavier Santos, Catarina Rato, D. James Harris
Fire has long been recognized as an important ecological and evolutionary force in plant communities, but its influence on vertebrate community ecology, particularly regarding predator-prey interactions, remains understudied. This study reveals the impact of wildfires on the diet of Podarcis lusitanicus, a lizard species inhabiting a fire-prone region in the Iberian Peninsula. To explore diet variability associated with different local burn histories, we evaluated P. lusitanicus diet across three types of sites in Northern Portugal: those that had not burned since 2016, those burned in 2016, and those more recently burned in 2022. Podarcis lusitanicus is a generalist arthropod predator with dietary flexibility. Given the turnover of arthropod species after fire, variations in diet caused by different fire histories were expected, especially between unburned and recently burned sites. Using DNA metabarcoding of faecal samples, our study revealed that while prey richness remained unaffected by wildfire regime, significant shifts occurred in diet composition between more recently burned and unburned areas. Specifically, we found that differences in diet composition between these two fire regimes were due to the presence of Tapinoma ants and jumping spiders (Salticus scenicus): these prey were present in the diets of lizards occupying unburned areas but absent in areas burned in 2022. Interestingly, diets in unburned areas and areas burned in 2016 showed no significant differences, highlighting the lizards’ ecological flexibility and the habitat’s resilience over time. The ant species T. topitotum was dominant in both burned areas, suggesting that this species may be fire tolerant. In addition, families such as Cicadellidae and Noctuidae were more associated with more recently burned areas.
The use of DNA metabarcoding in this study was essential to provide a more detailed and accurate view of predator-prey interactions in ecosystems susceptible to fire, and therefore a better understanding of changes in prey consumption in this fire-adapted ecosystem.

There are substantial barriers to initiating advance care planning (ACP) for persons with chronic-progressive disease in primary care settings. Some challenges may be disease-specific, such as communicating in case of cognitive impairment. This study assessed and compared the initiation of ACP in primary care with persons with dementia, Parkinson’s disease, cancer, organ failure and stroke.
Longitudinal study linking data from a database of Dutch general practices’ electronic health records with national administrative databases managed by Statistics Netherlands.
Data from general practice records of 199 034 community-dwelling persons with chronic-progressive disease diagnosed between 2008 and 2016.
Incidence rate ratio (IRR) of recorded ACP conversations per 1000 person-years in persons with a diagnosis of dementia, Parkinson’s disease, organ failure, cancer or stroke, compared with persons without the particular diagnosis. Poisson regression and competing risk analyses were performed, adjusted for age, gender, migration background, living situation, frailty index and income, also within disease subsamples.
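The study's IRRs come from adjusted Poisson regression; as a simplified illustration of the measure itself, a crude IRR with a Wald 95% confidence interval can be computed from event counts and person-years. The counts below are hypothetical, not study data:

```python
import math

def incidence_rate_ratio(events_a, py_a, events_b, py_b, z=1.96):
    """Crude IRR of group A vs group B with a Wald confidence interval."""
    irr = (events_a / py_a) / (events_b / py_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(IRR)
    ci = (math.exp(math.log(irr) - z * se_log),
          math.exp(math.log(irr) + z * se_log))
    return irr, ci

# Hypothetical: 70 first ACP conversations in 1000 person-years
# vs 100 in 1000 person-years in the comparison group
irr, ci = incidence_rate_ratio(70, 1000, 100, 1000)  # IRR = 0.70
```

The adjusted regression reported in the study additionally conditions on age, gender and the other covariates listed above, which a crude ratio cannot do.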
In adjusted analyses, persons with organ failure had the lowest rate of a first ACP conversation (IRR 0.70 (95% CI 0.68 to 0.73)) and persons with cancer the highest (IRR 1.75 (95% CI 1.68 to 1.83)). Within the subsamples of persons with organ failure, dementia or stroke, a comorbid diagnosis of cancer increased the probability of ACP. Conversely, for those with organ failure or cancer, comorbid dementia decreased the probability of ACP.
Considering the complexity of initiating ACP for persons with organ failure or dementia, general practitioners should prioritise offering it to them and their family caregivers. Policy initiatives should stimulate the implementation of ACP for people with chronic-progressive disease.
To explore challenges parents of children with cancer encounter while providing complex medical care at home.
Design: Cross-sectional convergent mixed-methods study. Instruments: Questionnaire and open interviews that mirrored and complemented each other.
Parents (n = 32), with no prior medical training, were expected to remain constantly vigilant as they monitored and managed rapidly changing situations. Regardless of time from diagnosis, they detected a mean of 3.3 ± 1.4 (range 0–6) symptoms, reported administering up to 22 daily medications, including cytotoxics, narcotics and injections, and dealt with many related challenges. Parents described needing responsive communication channels, especially during off-hours emergencies such as bleeding and infection.
Findings highlight the constantly shifting demands when managing a child with cancer at home. Educational programmes that address parental needs throughout treatment, tailored to protocol changes and individual circumstances, should be expanded and further developed.
Parents need continual education regarding home management throughout their children's illness and treatment.
This study addresses challenges parents of children with cancer encounter while providing complex medical care at home. The findings demonstrated that parents, responsible for administering numerous medications via various routes and managing symptoms and side effects, did not feel confident performing these tasks regardless of time from diagnosis. Nurses should adapt ongoing parental education regarding complex medical tasks, symptoms, side effects, and emergency detection and management for children with cancer at home. The study adhered to the Mixed Methods Appraisal Tool (MMAT) and the STROBE reporting guidelines.
Parents of children with cancer participated in the design and questionnaire validation.
by Muluken Chanie Agimas, Mekuriaw Nibret Aweke, Berhanu Mengistu, Lemlem Daniel Baffa, Elsa Awoke Fentie, Ever Siyoum Shewarega, Aysheshim Kassahun Belew, Esmael Ali Muhammad
Introduction: Malaria is a global public health problem, particularly in sub-Saharan Africa, which accounts for about 90% of malaria deaths worldwide. To reduce the impact and complications associated with delayed treatment of malaria among children under five, comprehensive evidence about the magnitude and determinants of delayed treatment is needed, but no regional-level studies exist in the Horn of Africa to inform decision-makers.
Objective: To assess the prevalence and associated factors of delay in seeking malaria treatment among under-five children in the Horn of Africa.
Methods: Published and unpublished papers were searched on Google, Google Scholar, PubMed/Medline, EMBASE, SCOPUS and the reference lists of included articles. The search strategy combined key terms from the title using Medical Subject Headings (MeSH). The Joanna Briggs Institute critical appraisal checklist was used to assess the quality of articles. A sensitivity analysis was conducted to evaluate the influence of individual studies. The visual funnel plot and Egger’s and Begg’s statistics under the random-effects model were used to evaluate publication bias and small-study effects. The I2 statistic was used to quantify heterogeneity between the included studies.
Results: The pooled prevalence of delayed treatment for malaria among under-five children in the Horn of Africa was 48% (95% CI: 34%–63%). History of child death (OR = 2.5, 95% CI: 1.73–3.59), distance >3000 meters (OR = 2.59, 95% CI: 2.03–3.3), drug side effect (OR = 2.94, 95% CI: 1.86–4.67), formal education (OR = 0.69, 95% CI: 0.49–0.96), middle income (OR = 0.42, 95% CI: 0.28–0.63), expensiveness (OR = 4.39, 95% CI: 2.49–7.76) and affordable cost (OR = 2.13, 95% CI: 1.41–3.2) for transport were factors associated with malaria treatment delay among children.
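The random-effects pooling and I² heterogeneity statistic described in the methods can be sketched generically. This is a textbook DerSimonian-Laird implementation with illustrative inputs, not a reconstruction of the study's actual analysis:

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool per-study estimates (e.g. logit prevalences) under a random-effects model."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q, between-study variance tau^2, and I^2 heterogeneity
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Re-weight by total (within + between study) variance and pool
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2, i2
```

With identical study estimates, tau² and I² collapse to zero; heterogeneous estimates inflate tau², which widens the pooled confidence interval.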
Conclusion and recommendations: About one in two parents in the Horn of Africa delayed seeking malaria treatment for their children. High transport costs, long distances (greater than 3000 meters) to health facilities and concern about drug side effects were major factors contributing to this delay. In contrast, a middle income was found to be protective against treatment delay. These results highlight how crucial it is to improve both financial and physical access to healthcare services to minimize delays in treating malaria among the region’s children.