by Ana Caroline Bini de Lima, Vanessa Cristini Sebastião da Fé, Maria Simara Palermo Hernandes, Emily Caroline Pfeifer de Cristo, Ana Gabrieli dos Santos Fagundes Euzébio, Maria Vitória e Silva Sousa, Fabiana Ribeiro Caldara, Viviane Maria Oliveira dos Santos
This study aimed to evaluate the ability of social noncontact environmental enrichment to facilitate social buffering and to characterize the emotional experience of horses subjected to restraint in stocks by assessing physiological parameters and facial expressions. Pantaneiro horses (n = 11) were evaluated in a crossover design with two treatments: social noncontact enrichment during stock restraint and social isolation during stock restraint. Physiological parameters (heart rate, heart rate variability, respiratory rate, ocular temperature by infrared thermography, and auricular temperature by infrared thermometer) and facial expressions (EquiFACS) were assessed throughout the 24-minute restraint period. When horses were accompanied by a conspecific, heart rate, respiratory rate, and eye temperature were lower (p < …). The frequency of facial movements, including nostril dilator (AD38), inner brow raiser (AU101), upper eyelid raiser (AU5), eye white increase (AD1), ears forward (EAD101), and ears back (EAD104), was also lower (p < …).
by Zhilan Huang, Tingyi Xie, Mingwen Tang, Zhuni Chen, Dan Jia, Anqi Su, Zhujin Jin, Tuliang Liang, Wei Xie
Background: Pulmonary fibrosis is a severe chronic lung disease whose prevalence has been rising in recent years, representing one of the major respiratory health challenges globally in the 21st century. The burden of this disease on the elderly population is garnering growing attention, particularly as the global population ages. The Global Burden of Disease (GBD) study has provided valuable insights; however, systematic analyses focused on this condition remain limited. To date, few studies have specifically examined interstitial lung disease and pulmonary sarcoidosis (ILD&PS) among individuals aged 55 years and older. This study aims to conduct a comprehensive analysis of burden trends from 1990 to 2021 for those aged 55 and above and to project future trends up to 2035.
Methods: Our approach utilizes the estimation of four broad component measures: incidence, prevalence, deaths, and disability-adjusted life years (DALYs), using data on ILD&PS from the Global Burden of Disease (GBD) 2021 database. Joinpoint regression models were applied to calculate the average annual percentage change (AAPC) in order to analyze temporal trends in disease burden and to identify years with significant trend shifts. Analyses were further stratified by age, sex, region, country, and Sociodemographic Index (SDI). Additionally, a Bayesian age-period-cohort (BAPC) model was used to project future disease burden trends.
Results: Between 1990 and 2021, significant increases were observed in incidence, DALYs, and death rates for ILD&PS (AAPC incidence = 1.09, 95% CI: 1.04 to 1.15; AAPC DALYs = 1.10, 95% CI: 0.97 to 1.23; AAPC death = 1.65, 95% CI: 1.47 to 1.83). In 2021, the total number of incident cases reached 284,887 (95% UI 248,300–328,800), with the highest incidence rates observed in Andean Latin America. Across age- and sex-specific analyses, global burden trends were similar, though males consistently exhibited higher rates than females. The oldest age group (95+ years) had the highest incidence and DALYs rates among all age strata. Furthermore, incidence rates increased most markedly in high-SDI regions, showing a strong positive correlation between SDI and incidence. Bayesian age–period–cohort (BAPC) analyses indicated that while prevalence rates are projected to decline slightly, incidence rates are expected to continue rising. Both males and females showed a dip then rise in prevalence trends, but the increase was more pronounced among females. In 2035, the highest number of incident cases is projected to occur in the 65–69 age group, whereas the highest incidence rate is predicted in the 95+ age group.
Conclusions: A concerning upward trend in incidence, DALYs, and deaths related to ILD&PS was observed in the global population aged 55 years and older, particularly among females. To our knowledge, this is the first study to comprehensively analyze the burden of ILD&PS in this age group from 1990 to 2021. Our findings on epidemiological trends and their variations across geography, SDI, age, and sex can inform policy-makers in designing targeted strategies to mitigate the anticipated rise in disease burden.
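The joinpoint AAPC figures quoted above are length-weighted averages of segment-specific slopes of the log rate. A minimal sketch of that calculation, using hypothetical segment slopes rather than the study's fitted values:

```python
import math

def aapc(segments):
    """Average annual percent change from joinpoint segments.

    Each segment is (length_in_years, slope_of_log_rate). The AAPC is a
    length-weighted geometric average of the segment-specific trends:
    AAPC = (exp(sum(w_i * b_i) / sum(w_i)) - 1) * 100.
    """
    total = sum(w for w, _ in segments)
    weighted = sum(w * b for w, b in segments)
    return (math.exp(weighted / total) - 1.0) * 100.0

# Hypothetical fit: a 20-year segment rising 1.0%/year and an
# 11-year segment rising 1.3%/year (slopes on the log scale).
segs = [(20, math.log(1.01)), (11, math.log(1.013))]
print(round(aapc(segs), 2))
```

With a single segment the AAPC reduces to that segment's annual percent change, which is a quick sanity check on any implementation.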
by Yang Tong, Huang Qianzhen, Tan Bo, Hu Bin, Zhang Min
Background: Advancing the development of centers for disease control and prevention (CDCs) has become a priority within global public health governance. However, public health governance capacity varies significantly among CDCs across different countries and regions, and grassroots CDCs face particular disadvantages. Establishing stable, efficient collaborative development mechanisms among CDCs across diverse regions to maximize overall effectiveness and ensure sustainable development represents a critical public health science issue.
Objective: This study aims to provide scientific references and a theoretical foundation for the coordinated development of grassroots CDCs within the Chengdu–Chongqing Economic Circle (CCEC) and the construction of public health systems.
Methods: A questionnaire covering collaborative development needs indicators for grassroots CDCs, comprising 4 primary needs and 13 secondary needs, was developed through literature review, the Delphi expert consultation method, and the Kano model. Analysis focused on questionnaires collected from eight grassroots CDCs within the CCEC. The importance of needs was ranked using the better–worse coefficient and satisfaction sensitivity analysis.
Results: Analysis of the 110 valid questionnaires showed that for the must-be attribute, satisfaction sensitivity ranked as follows: performance compensation (0.883) > talent exchange and scientific research and innovation cooperation (0.824) > public health emergency rescue mechanism (emergency material reserve and cross-regional material mobilization; 0.817) > cross-regional case monitoring, investigation, and tracking (0.775). Regarding the one-dimensional attribute, the satisfaction sensitivity ranking was joint risk assessment and emergency command (0.937) > business archive co-construction and sharing mechanism (emergency response plan and technical scheme) (0.909) > regional co-construction and sharing between the university and the local area (0.832). For the attractive attribute, the satisfaction sensitivity ranking was regional monitoring and early-warning information management system (0.922) > community chronic disease prevention and service (0.804) > coordinated transfer and diversion diagnosis and treatment of patients with infectious diseases within the region (0.734). However, the collaborative release and interaction mechanism of social integrated media information, public health collaborative governance entities, and the construction of a cross-regional expert database constitute indifferent attributes.
Conclusions: This study provides preliminary scientific evidence for the precise allocation of public health resources and the establishment of localized collaborative development mechanisms. Simultaneously, the research methodology and analytical framework offer new theoretical references for similar studies in other regions globally.
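The better–worse coefficients used above to rank needs are standard Kano-model summaries computed from the counts of respondents who classified a need into each category. A minimal sketch under the usual definitions, with made-up category counts (A = attractive, O = one-dimensional, M = must-be, I = indifferent), not the study's data:

```python
def better_worse(a, o, m, i):
    """Kano better-worse coefficients from category counts.

    better (satisfaction increment)   =  (A + O) / (A + O + M + I)
    worse  (dissatisfaction decrement) = -(O + M) / (A + O + M + I)
    """
    total = a + o + m + i
    return (a + o) / total, -(o + m) / total

# Hypothetical classification of one need by 110 respondents.
better, worse = better_worse(a=30, o=40, m=25, i=15)
print(round(better, 3), round(worse, 3))
```

A need with a large positive `better` rewards fulfillment, while a large-magnitude negative `worse` signals strong dissatisfaction when the need is unmet; both feed into satisfaction sensitivity rankings like those reported.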
by Munawar Farooq, Uffaira Hafeez, Amir Ahmad, Susan Waller, Gabriel Andrade, Arif Alper Cevik, Syed Fahad Javaid
Background: Stress is a prevalent issue among university students and is linked to adverse academic and emotional outcomes. While research emphasizes the roles of resilience, personality traits, and psychosocial factors, most studies are drawn from North American and European contexts.
Objectives: This is the first study of its kind in the United Arab Emirates (UAE) exploring the relationship between perceived stress, resilience, and personality traits among university students, offering insights into region-specific influences on emotional well-being.
Methods: An online cross-sectional survey was conducted among 168 students from two colleges at the United Arab Emirates University (79% College of Medicine and Health Sciences, 21% College of Information Technology; 72% female). Data were analyzed using descriptive statistics and regression models in R version 4.2.0. Personality traits were assessed using the Ten-Item Personality Inventory, perceived stress was measured with the Perceived Stress Scale, and resilience was evaluated with the Brief Resilience Scale.
Results: The median perceived stress score was 22 (IQR: 17–28), and 30% reported high stress. Multivariable analysis showed that heavier academic workload, financial difficulties, lack of social support, lower physical activity, and poorer academic performance significantly predicted higher perceived stress, whereas resilience and emotional stability were protective.
Conclusion: University students’ perceived stress is closely associated with modifiable factors, including academic workload, social support, resilience, and physical activity. Targeted interventions, such as resilience training, promoting physical activity, optimizing academic schedules, and strengthening support services, are vital to reducing perceived stress and enhancing student well-being.
by José Manuel García-Moreno, Tyler Adams, Amber Beynon, Janine Vlaar Olthuis, Stephan U. Dombrowski, Richelle Witherspoon, Niels Wedderkopp, Jeffrey J. Hébert
Background: Rehabilitation and behavior change interventions are commonly used after lumbar surgery to improve recovery, but their effects on physical capacity and physical activity remain unclear. This study aimed to investigate the effectiveness of rehabilitation and behavior change interventions on physical capacity and physical activity behavior in patients following lumbar surgery for degenerative disease.
Methods: EMBASE, MEDLINE, PsycINFO, and CENTRAL were searched from inception to September 2025 and reference lists were hand-searched. Randomized controlled trials assessing rehabilitation or behavior change interventions on physical capacity or physical activity behavior in adults with lumbar degenerative disc disease who underwent lumbar surgery were included. Review author pairs independently extracted data and assessed included studies. Risk of bias was assessed with the Cochrane tool, and study quality with the Grading of Recommendations Assessment, Development and Evaluation classification. Results were pooled using random-effects models and reported as standardized mean differences (SMD) with 95% confidence intervals (CI).
Results: Exercise was more effective than minimal or usual care in improving trunk extension endurance in the immediate term (SMD, 1.54; 95% CI, 0.93–2.16). Supervised exercise outperformed self-directed exercise in improving trunk extension endurance in the immediate term (SMD, 1.28; 95% CI, 0.75–1.81). Psychologically informed rehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.26; 95% CI, 0.02–0.49), but not in the immediate term (SMD, 0.17; 95% CI, −0.14 to 0.49). Physical activity advice did not increase physical activity levels compared to minimal or usual care in the immediate term (SMD, 0.21; 95% CI, −0.13 to 0.55). Prehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.28; 95% CI, 0.03–0.53). Certainty of evidence ranged from low to moderate.
Conclusions: For adults with lumbar degenerative disease who underwent lumbar surgery, exercise, especially supervised programs, improved trunk extension endurance in the immediate term. Psychologically informed rehabilitation and prehabilitation increased physical activity levels in the intermediate term, while physical activity advice showed no benefit. Findings are limited by low certainty of evidence and high risk of bias.
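The pooled SMDs above come from random-effects models; a compact sketch of the DerSimonian–Laird estimator commonly used for such pooling (the effect sizes and variances below are illustrative, not the review's data):

```python
import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes.

    effects:   per-study standardized mean differences (SMDs)
    variances: their within-study variances
    Returns (pooled SMD, 95% CI lower, 95% CI upper).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Three hypothetical trials of trunk extension endurance.
smd, lo, hi = pool_random_effects([0.8, 1.6, 2.0], [0.10, 0.08, 0.15])
print(round(smd, 2), round(lo, 2), round(hi, 2))
```

When the studies are homogeneous (Q below its degrees of freedom), tau-squared is truncated at zero and the estimate collapses to the inverse-variance fixed-effect mean.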
by Emily Tufano, Kondaiah Palsa, Rebecka O. Serpa, Timothy B. Helmuth, Gabriela Remit-Berthet, Sara Mills-Huffnagle, Mathias Kant, Aurosman Sahu, James R. Connor
Iron is essential for normal physiological function, yet dysregulation of iron metabolism is increasingly recognized as a hallmark of cancers such as glioblastoma (GBM). Recent clinical evidence suggests that systemic iron deficiency anemia (IDA) negatively impacts GBM outcomes in a sex-dependent manner, but the mechanisms linking systemic iron availability to tumor iron metabolism remain poorly understood. Here, we interrogate the impact of systemic iron through dietary modulation (control, iron deficiency (ID), and high iron diets), stratified by sex, on tumor iron handling and GBM outcomes utilizing an immunocompetent (C57BL/6) GBM (GL261) mouse model. Subsequently, we analyzed clinical samples to evaluate translational value. In the preclinical study, we show that iron deficiency decreased survival in males but conferred a slight survival advantage in females, consistent with prior clinical trends. Among circulating iron markers, only ferritin light chain (FTL), but not ferritin heavy chain (FTH) or serum iron, positively correlated with survival in males but not females. In the brain, contralateral iron levels reflected dietary iron status in males but not females, further supporting sex-dependent regulation of local and circulating iron. Notably, tumor iron content remained unchanged in males but was significantly elevated in ID female tumors, complemented by increased transferrin receptor (TfR1) and FTH expression. In clinical GBM samples, we observed non-statistically significant but similar survival trends across varying iron and ferritin levels, suggesting potential translational relevance of our exploratory model. These findings demonstrate that systemic iron availability exerts a sex-specific effect on tumor iron handling, highlighting a critical relationship between systemic and tumor iron regulation in GBM.
by Edidiong Orok, Oluwaseun Olumoko, Inimuvie Ekada, Amos Oladunni
Inappropriate use of antimalarial medications can accelerate the development of antimicrobial resistance (AMR), undermining treatment efficacy and public health goals. Artemether-lumefantrine (A/L) is the first-line treatment for uncomplicated malaria in Nigeria, yet its misuse persists, particularly among young adults. This study assessed knowledge gaps in A/L use among university students in Southwestern Nigeria to identify opportunities for targeted intervention. A cross-sectional online survey was conducted among undergraduate students from three universities in Southwestern Nigeria. Respondents’ knowledge of A/L was categorized as good (≥70%), fair (50–69%), or poor (<50%).
by Yuzhong Feng, Jiazhen Cui, Xuan Huang, Yupeng Li, Haolong Dong, Xianghua Xiong, Gang Liu, Qingyang Wang, Huipeng Chen
Uricase-based drugs excel at treating refractory hyperuricemia and tumor lysis syndrome by directly degrading uric acid but are limited by immunogenicity. Here, we engineered RAW264.7 macrophages with ectopic co-expression of Aspergillus flavus uricase and murine urate anion transporter 1 (URAT1), forming a “transport-degradation” system: URAT1 actively transports uric acid into cells for intracellular degradation. Recombinant lentiviral vectors carrying the target genes were transduced into RAW264.7 cells, followed by puromycin selection. In vitro assays showed that the engineered macrophages nearly completely degraded uric acid (from 556.0 ± 37.0 μmol/L to 0.7 ± 0.6 μmol/L) at 72 h. URAT1 inhibition with benzbromarone abolished uric acid degradation in URAT1-expressing cells. In both acute dietary-induced and chronic genetic hyperuricemic mouse models, RAW-afUri-URAT1 exerted robust and sustained uric acid-lowering activity, maintaining serum uric acid at 77.14 ± 37.48 μmol/L on day 16 in yeast extract-gavaged mice and normalizing serum uric acid to 76.2 ± 15.9 μmol/L in liver uricase conditional knockout mice, both significantly superior to the rebound levels observed in mice treated with Rasburicase (143.19 ± 38.21 μmol/L and 142.4 ± 17.4 μmol/L, respectively; P < …).
by Changze Ou, Binbin Chen, Jun Deng, Huajun Long
Background: Histone deacetylases (HDACs) regulate neuroprotection; however, the molecular mechanisms and core targets of Trichostatin A (TSA), an HDAC inhibitor, in Alzheimer’s disease (AD) remain unclear, limiting clinical translation. This study aimed to decipher TSA’s AD-regulating network, screen core genes, and support AD early diagnosis and multi-target therapies.
Methods: TSA targets were computationally predicted. Five GEO AD datasets were analyzed for differential genes and core modules, and 130 machine learning algorithms were employed to identify core genes. Functional annotation, immune cell analysis, and single-cell expression profiling were conducted. Molecular docking and 100 ns molecular dynamics simulations verified TSA-protein interactions.
Results: A total of 949 potential TSA targets were identified, overlapping with AD differential genes and enriched in key pathways such as GABAergic synapse and tau phosphorylation. Eight machine learning-identified core genes (EFNA1, GABRB2, GABARAPL1, EGR1, CDK5, KCNC2, MET, GRIA2) exhibited a distinct AD expression pattern: synergistic downregulation of protective genes and unique upregulation of pathological EFNA1. These genes are implicated in neurotransmission, synaptic plasticity, tau clearance, and immune-neural crosstalk. Molecular dynamics simulations suggested TSA may not stably bind these candidates, implying that its regulation relies on epigenetic mechanisms via HDAC1–3/6 inhibition, potentially restoring gene network balance and disrupting neuroinflammation-neurodegeneration cycles. Complex regulatory modes and cell type-specific expression were also observed.
Conclusion: This study provides preliminary insights into TSA’s putative mechanisms in AD intervention, highlighting the eight candidate core genes’ potential diagnostic and therapeutic value as AD biomarkers, supporting TSA’s multi-target therapy. All findings are computationally derived and require experimental verification.
by Sandra S. Chaves, Valérie Bosch Castells, Ainara Mira-Iglesias, Joan Puig-Barberà, F. Xavier López-Labrador, Miguel Tortajada-Girbés, Mario Carballido-Fernández, Joan Mollar-Maseres, Germán Schwarz-Chávarri, Javier Díez-Domingo, Alejandro Orrico-Sánchez, Valencia Hospital Network for the Study of Influenza and other Respiratory Viruses (VAHNSI)
Background: Understanding the burden of acute viral respiratory infection-related hospitalizations is crucial for guiding research and development. Unlike influenza, respiratory syncytial virus (RSV), or severe acute respiratory syndrome coronavirus 2, no pharmaceutical interventions exist for other respiratory viruses; therefore, their impact remains poorly characterized. This study aimed to investigate the association of currently non-vaccine-preventable respiratory viruses, especially rhinovirus/enterovirus (RV/EV), with hospitalizations during the respiratory seasons.
Methods: Data from a prospective study that used multiplex polymerase chain reaction to conduct long-term surveillance of respiratory viruses in Valencia, Spain, were analyzed. Patients aged ≥50 years hospitalized due to respiratory illness from the 2014–15 to the 2019–20 season were included.
Results: Respiratory viruses were detected in 35.2% (3,755/10,675) of hospitalized patients with acute respiratory illness. Influenza and RSV accounted for 22.1% of hospitalizations, RV/EV for 7.6%, and other non-vaccine-preventable viruses for 5.4%. Adults ≥75 years had average seasonal hospitalization incidence rates more than twice those of adults aged 65–74 years and eight times those of adults aged 50–64 years. No significant differences in severity markers were observed between patients with and without a virus identified; however, those aged ≥75 years had a 2–3 times higher mortality rate compared with younger age groups.
Conclusions: The potential impact of respiratory viruses on hospitalization rates among older adults, particularly those aged ≥75 years, highlights the need for targeted interventions to reduce healthcare system burden. Enhanced diagnostic capabilities and the development of next-generation preventive strategies, including vaccines and therapeutics, could improve patient outcomes and strengthen the resilience of the healthcare system during respiratory virus seasons.
by Yuzhen Sun, Ziguang Zhou, Yu Mao, Niu Liu, Yanfeng Li, Weiyuan Fang
Background: Psoriasis, a chronic inflammatory skin disease affecting 2–3% of the global population, is driven by dysregulated immune responses. Despite advancements in biologic therapies, treatment challenges persist due to high recurrence rates. This study aimed to identify immune-related hub genes and elucidate their clinical implications in psoriasis pathogenesis and therapy.
Methods: Multiple microarray datasets from psoriasis patients (GSE30999, GSE106992, GSE14905, GSE78097, and GSE117468) were obtained to identify immune-related key genes by differential gene analysis and Weighted Gene Co-expression Network Analysis (WGCNA). Subsequently, immune-related hub genes were identified using the Least Absolute Shrinkage and Selection Operator (LASSO) algorithm and Protein-Protein Interaction (PPI) networks, with further validation through Gene Set Enrichment Analysis (GSEA) and Receiver Operating Characteristic (ROC) curves to assess exploratory within-sample discrimination. Pearson correlation analysis evaluated the relationship between hub genes, skin lesion severity, and treatment outcomes. The study also conducted immune infiltration analysis using the Cell-type Identification by Estimating Relative Subsets Of RNA Transcripts (CIBERSORT) algorithm and identified potential therapeutic targets with the Drug-Gene Interaction Database (DGIdb).
Results: Thirty-one immune-related key genes were identified, and six hub genes (CLEC7A, CXCL1, IRF1, S100A12, S100A8, S100A9) were validated as central players in immune signaling pathways. These genes exhibited within-sample discrimination (AUC > 0.9) and correlated with disease severity and biological therapy efficacy. Immune infiltration analysis revealed increased activated memory CD4+ T cells and M1 macrophages in lesional skin, which were strongly associated with hub gene expression. Additionally, drug-gene interaction analysis identified potential therapeutic agents targeting these genes.
Conclusion: This study identified six immune-related hub genes that were closely linked to the severity of psoriasis, the effectiveness of biological treatments, and the infiltration of activated memory CD4+ T cells and M1 macrophages. Our findings elucidate a novel immune-related hub gene network in psoriasis and provide potential targets for the development and application of biologics.
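The AUC criterion reported for the hub genes above has a simple nonparametric reading: the probability that a randomly chosen case sample scores higher than a randomly chosen control sample (the Mann-Whitney statistic). A sketch with made-up expression values, not the study's data:

```python
def auc(case_scores, control_scores):
    """Nonparametric AUC: fraction of (case, control) pairs in which
    the case scores higher, counting exact ties as half a win."""
    wins = 0.0
    for x in case_scores:
        for y in control_scores:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical hub-gene expression in lesional vs non-lesional samples.
print(auc([5.1, 6.3, 7.0, 6.8], [4.2, 5.0, 5.5, 4.8]))
```

An AUC of 0.5 means the marker carries no discriminating information, while values above 0.9, as reported for these genes, indicate near-complete separation of the two groups.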
by Lateef Oluwatoyin Busari, Zarat Oyindamola Iwalewa, Olabanji Ahmed Surakat, Adedapo Olufemi Adeogun, Akinlabi Mohammad Rufai, Kamilu Ayo Fasasi, Monsuru Adebayo Adeleke
Insecticide resistance in malaria vectors remains a global public health problem; however, little is known about resistance levels in Osun State, despite relatively high rates of malaria and distribution of insecticide-treated nets in the area. This study evaluates the resistance status of adult female Anopheles gambiae s.l. to pyrethroid (permethrin, deltamethrin, and alpha-cypermethrin) and organophosphate (pirimiphos-methyl) insecticides, together with detection of the knockdown resistance (kdr) gene, in six locations (Ido-Osun, Ipetumodu, Inisa, Ejigbo, Ijebu-Jesha and Ila) across the three senatorial districts of Osun State, Nigeria. Larval sampling was done weekly between 07:00 and 11:00 from January to December 2022. Collected larvae were reared to the adult stage in the Department of Animal and Environmental Biology laboratory of Osun State University, Osogbo, Nigeria, and then identified using morphological keys. Insecticide bioassays were conducted with permethrin (0.75%), deltamethrin (0.05%), alpha-cypermethrin (0.05%) and pirimiphos-methyl (0.25%) following the WHO procedure. The mosquitoes were subjected to molecular analysis to detect the kdr gene. Pirimiphos-methyl showed significantly higher knockdown at 60 minutes (KD60) and achieved 100% mortality (p < …) in Anopheles gambiae s.l. as compared with the pyrethroids tested. Therefore, there is a need to intensify insecticide resistance surveillance of Anopheles in Osun State, to plan indoor residual spraying with pirimiphos-methyl, and to explore the use of piperonyl butoxide (PBO) or dual-active-ingredient insecticide-treated nets (ITNs) to address the potential impacts of pyrethroid resistance.
by Yukari Hara, Aoi Nakagawa, Junko Omori
This study aimed to investigate changes in work values among Japanese nurses before and after pregnancy, their return to work while managing childcare responsibilities, and the multifaceted factors influencing these changes. A web-based survey of 199 female nurses assessed their work values before and after pregnancy, including retrospective questions and open-ended responses. Data were analyzed comprehensively using paired t-tests, generalized estimating equations, and inductive thematic analysis. Only prestige work values demonstrated a significant decrease after pregnancy, as revealed by paired t-tests. Generalized estimating equation analysis identified age at first child, social support from family and other sources, employment status, and educational background as the main factors influencing changes in work values. Qualitative findings indicated that this decline in prestige work values was due to a shift in nurses’ awareness that “their life became centered on their children and family,” alongside family roles and time constraints that limited career development. This study demonstrates that Japanese nurses experience significant changes in their work values during the transition from childbirth to work re-entry. These changes are intricately shaped by several factors, particularly individual life stage variables and social support from family and other sources. A nuanced understanding of these shifts in work values is essential for developing effective and individualized support systems to promote the retention and long-term career development of female nurses in Japan.
by Esther Ortega-Martin, Javier Alvarez-Galvez
Objective: To characterize the heterogeneity of Long COVID (LC) by identifying distinct patient profiles based on symptoms and quality of life (QoL), and to examine the sociodemographic and clinical predictors associated with these profiles.
Study design: A cross-sectional observational study was conducted.
Methods: We recruited 363 patients with LC in Spain via an online survey. Symptom patterns were identified through latent class analysis of 15 binary symptoms. QoL was assessed with the patient-derived LC-6D-QoL across six dimensions, and cluster analysis defined QoL subgroups. Logistic regression was applied to examine clinical and sociodemographic predictors of QoL profiles.
Results: Two symptom profiles emerged: a low-burden profile, dominated by fatigue and cognitive problems, and a high-burden profile with multisystem involvement. QoL clustered into three profiles (high, middle, and low QoL), with more than half of participants in the low QoL group. Symptom burden and employment status were the strongest predictors of poor QoL, whereas age, sex, education, and income showed limited associations. Social support was more frequently reported among participants with low QoL.
Conclusions: LC is characterized by distinct clinical and QoL profiles, with strong interactions between multisystem symptom burden and social determinants. Identifying patients at greatest risk of poor QoL can inform stratified interventions and integrated policies that combine medical care, psychosocial support, and workplace reintegration.
Extrapulmonary tuberculosis (EPTB) poses a significant diagnostic and economic challenge in HIV-endemic, low-resource settings due to its complex presentation and the limitations of current diagnostic tools. While accurate and timely diagnosis is critical for reducing morbidity, mortality and health system costs, economic evaluations of EPTB diagnostics remain sparse and fragmented. This protocol aims to map existing evidence on the economic evaluation of diagnostic innovations for EPTB in low-resource settings.
This scoping review protocol follows the Joanna Briggs Institute (JBI) methodological framework and is registered on the Open Science Framework. Peer-reviewed articles, grey literature and official reports published between 2000 and 2025 will be searched in PubMed, MEDLINE, Google Scholar, Scopus and Science Direct. The search strategy is structured using the Population, Intervention, Comparator, Outcome, Time, Study design and Setting (PICOTSS) framework, and will be peer-reviewed using the Peer Review of Electronic Search Strategies (PRESS) guideline. Study selection, data charting and extraction will be performed independently by two reviewers. Data will be charted iteratively, and the methodological quality of selected economic evaluations will be appraised using the Drummond checklist. Results will be synthesised in narrative summaries and tabular formats. Final reporting will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) reporting guideline.
For review of previously published data, ethical approval is not required. Findings will be disseminated in professional networks, stakeholder meetings and a peer-reviewed journal.
OSF Registration DOI 10.17605/OSF.IO/BTCPG
Staphylococcus aureus (S. aureus) bacteraemia is a common and severe infection. With mortality rates of 20–30% and long-term impairments in over a third of survivors, better treatments are urgently needed. Linezolid, a well-established treatment for pneumonia and complicated skin infections, has been shown in preclinical studies to strongly suppress S. aureus virulence factors critical to bacterial persistence and tissue damage. Hence, we aim to investigate whether the addition of linezolid to standard therapy in patients with S. aureus bacteraemia leads to an overall improvement in patient-relevant outcomes.
We will conduct a two-arm, parallel-group, multicentre, randomised controlled trial (Linezolid Plus Standard of Care) in 12 hospitals in Switzerland with blinded treating physicians, patients and outcome assessors. Hospitalised patients aged ≥18 years with S. aureus bacteraemia will be eligible. Patients will receive standard antibiotic treatment as prescribed by the treating physician. Within 72 hours of collection of the blood sample yielding the first positive blood culture, patients will be enrolled and randomised 1:1 to receive either adjunctive linezolid (600 mg orally two times per day for 5 days) or placebo. To determine patient-relevant outcomes, we implemented a comprehensive patient-representative consultation process. Consequently, we will use the desirability of outcome ranking (DOOR) established for S. aureus bacteraemia as the primary outcome at 90 days. The hierarchical composite DOOR outcome includes the following four components, ranked from most to least important: (1) survival, (2) return to level of function before S. aureus infection, (3) complications leading to treatment changes and serious adverse reactions; and (4) hospital length of stay. This approach will allow us to analyse the win ratio, that is, whether patients receiving linezolid have a better DOOR rank compared to patients in the placebo group. We calculated a target sample size of 606 patients providing 90% power at a two-sided significance level of 0.05.
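The win-ratio analysis of the hierarchical DOOR described above reduces to pairwise comparisons between arms: every treatment patient is compared with every control patient on the ranked outcome. A minimal sketch with hypothetical ranks (lower rank = more desirable outcome), not trial data:

```python
def win_ratio(treatment_ranks, control_ranks):
    """Win ratio from pairwise comparisons of DOOR ranks.

    Each treatment patient is compared with each control patient:
    a 'win' means the treatment patient has the more desirable
    (lower) rank, a 'loss' the opposite; equal ranks are ties and
    are excluded. Assumes at least one loss occurs.
    """
    wins = losses = 0
    for t in treatment_ranks:
        for c in control_ranks:
            if t < c:
                wins += 1
            elif t > c:
                losses += 1
    return wins / losses

# Hypothetical DOOR ranks (1 = best) for two small groups.
print(win_ratio([1, 1, 2, 3], [1, 2, 3, 4]))
```

A win ratio above 1 favours the treatment arm; the trial's hypothesis test asks whether linezolid patients achieve better DOOR ranks than placebo patients more often than the reverse.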
Ethical approval was received from the Ethics committee for Northern and Central Switzerland (BASEC number 2025-00655). Eligible patients will be informed about the study by the local study team and asked for written consent if they wish to participate. For patients unable to provide informed consent, an appropriate substitute (ie, a close relative or a physician not involved in the research project) may make decisions based on the presumed wishes and the best interest of the patient. The patient’s own consent will be obtained as soon as their condition permits. Results will be published in peer-reviewed journals and communicated in lay terms through various channels (social media, Swiss national portal HumRes).
To examine the risk of severe cardiovascular (CV) events in patients with chronic obstructive pulmonary disease (COPD) across different time periods following COPD exacerbations and the incidence rate of cardiopulmonary events in a real-world setting in China.
Retrospective cohort study.
Regional electronic health records database from Yinzhou District of Ningbo City, China.
A total of 14 713 patients aged ≥40 years with a first COPD diagnosis between 1 January 2014 and 1 March 2022.
The risk of severe CV events (ie, hospitalisation and a primary or secondary discharge code for acute coronary syndrome, heart failure decompensation, cerebral ischaemia, arrhythmia and CV-related death) during different exposed time periods following a COPD exacerbation, the incidence rate of overall cardiopulmonary events (ie, severe exacerbation of COPD, all-cause mortality, inpatient CV events, inpatient ischaemic stroke and inpatient tachyarrhythmia/atrial fibrillation) and the incidence rate stratified by COPD exacerbation history.
We included a total of 14 713 patients. During a median (IQR) follow-up of 2.8 (4.0) years, 20.1% experienced severe CV events. Compared with the unexposed period, the risk of severe CV events was highest in the first 10 days following a COPD exacerbation (adjusted HR 10.00, 95% CI 8.16 to 12.25). The risk of severe CV events decreased over time but remained significantly elevated up to 90 days post exacerbation. We found that 32.7% of patients with COPD experienced cardiopulmonary events, with a crude incidence rate of 9.38 (95% CI 9.09 to 9.69) per 100 person-years.
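A crude incidence rate of the kind reported above is simply the event count divided by total person-time, rescaled to 100 person-years. The sketch below is illustrative only: the counts are hypothetical placeholders (the abstract does not report total person-years), and the confidence interval uses a simple normal approximation on the event count rather than the study's actual method.

```python
import math

def incidence_rate_per_100py(events, person_years, z=1.96):
    """Crude incidence rate per 100 person-years with an
    approximate Wald 95% CI (normal approximation on the event
    count; for illustration, not the study's CI method)."""
    rate = events / person_years * 100
    se = math.sqrt(events) / person_years * 100
    return rate, rate - z * se, rate + z * se

# Hypothetical counts: 4810 first events over 51 300 person-years.
rate, lo, hi = incidence_rate_per_100py(4810, 51_300)
print(f"{rate:.2f} per 100 person-years (95% CI {lo:.2f} to {hi:.2f})")
```

With larger event counts the Wald interval narrows towards the point estimate, which is why a cohort of this size can report a CI as tight as 9.09 to 9.69.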
This study is the largest retrospective cohort study investigating CV and cardiopulmonary events among patients with COPD in China. Our findings highlight an elevated risk of CV events close to the time of COPD exacerbations and show that nearly one-third of patients with COPD experience cardiopulmonary events.
Visual impairment is reported to affect 40%–50% of children with cerebral palsy (CP). Vision difficulties in the context of rehabilitation are often under-recognised, under-treated and therefore under-studied, pointing to an urgent need for the development of evidence-based vision interventions for infants and toddlers with cerebral vision impairment (CVI). We present the protocol of a multisite pragmatic pilot randomised controlled trial (RCT) of feasibility, acceptability and preliminary efficacy of an early vision-awareness and parent-directed environmental enrichment programme for infants with or at risk of CP under 7 months corrected age (CA) with vision impairment.
The main objective is to determine the feasibility and acceptability of the Vision Intervention for Seeing Impaired Babies: Learning through Enrichment (VISIBLE) intervention. We will estimate the preliminary effects of the programme on infants’ visual functions and early development, as compared with standard community-based care (SCC).
A two-group RCT will be conducted. Infants aged 3–6 months at entry, with severe visual impairment and at high risk of CP, will be enrolled and randomised (n=16 per group) to receive the VISIBLE intervention or SCC. Randomisation will be completed through an independent automated process (Research Electronic Data Capture). The VISIBLE intervention will be delivered by a therapist through home visits (90–120 min) once every 2 weeks. Completion of 10 visits (80% of the intervention target dose) within 6 months is required for adherence to the VISIBLE trial. Outcomes will be assessed at 12 months CA. Visual function will be evaluated with the Infant Battery for Vision, and motor outcomes with the Peabody Developmental Motor Scales, Second Edition. Developmental quotients, infant quality of life, parent well-being and the parent-infant relationship will also be monitored through standardised tools.
The enrolling sites have historically demonstrated rapid and effective translation of successful evidence-based interventions into routine clinical practice, as well as the dissemination of the findings through local, national and international scientific meetings.
ACTRN12618000932268.
Poor communication between healthcare professionals is one of the main causes of medical errors. Many articles about interprofessional communication (IPC) do not define what communication is and often describe it only as a domain of competencies within interprofessional collaboration. Three communication paradigms coexist: the transmission model, the transactional model and the constitutive model. These models focus on different aspects of communication and are complementary. No review of IPC covering all healthcare professionals or all healthcare settings has been found.
A scoping review protocol was developed to map the research on IPC and the communication paradigms used by researchers, as well as to clarify the definition of this concept. We will follow the Joanna Briggs Institute methodology for scoping reviews and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) extension for scoping reviews. Eligibility criteria follow the Population, Concept, Context framework. Articles about health professionals, allied health professionals and social workers, and students in these fields, will be included. Articles evaluating IPC in healthcare, either quantitatively or qualitatively, will be included. Articles investigating IPC in any type of healthcare setting in any country will be considered. All types of published articles in scientific journals will be included. The databases searched were MEDLINE, CINAHL, APA PsycINFO, EMBASE and Web of Science. In October 2025, 22 798 citations were retrieved, of which 9722 duplicates were deleted. Two researchers will then independently assess the remaining 13 078 citations against the eligibility criteria. This step is scheduled for completion in May 2026. They will then chart the data using a standardised data extraction tool.
Formal ethical approval is not required, as primary data will not be collected in this study. Findings of the scoping review will be disseminated through professional networks, conference presentations and publication in a scientific journal.
Because the study is a scoping and not a systematic review, registration was not possible on PROSPERO. The study was registered on Open Science Framework: https://osf.io/dyh2a.
Poor participant retention in randomised clinical trials, resulting in missing outcome data, can impact the validity, reliability and generalisability of results. While participants’ views on general non-retention issues have been reported elsewhere, a qualitative evidence synthesis specifically focusing on trial processes (ie, outcome data collection) impacting retention has not been undertaken to date. This is an important research question to inform targeted interventions to support retention. This review aims to address this by systematically searching and synthesising the evidence on participant reasons for trial non-completion, linked to outcome data collection.
We conducted a qualitative evidence synthesis of qualitative studies and mixed methods studies with a qualitative component, in Embase, Ovid MEDLINE, PsycINFO, Cochrane Central Register of Controlled Trials (CENTRAL), Social Science Citation Index, Cumulative Index of Nursing & Allied Health Literature and Applied Social Sciences Index and Abstracts, up to February 2025. We used Thomas and Harden’s thematic synthesis approach. The Grading of Recommendations Assessment, Development and Evaluation-Confidence in the Evidence from Reviews of Qualitative framework was used to assess confidence in the review findings.
We identified 11 studies reporting qualitative data from 14 separate trials, with findings from 105 trial non-retainers. The studies were undertaken between 2007 and 2025.
There were three types of participant non-retention behaviours reported across the studies, where participants either: (1) missed at least one clinic visit; (2) did not complete a postal questionnaire; or (3) did not complete online data collection. We developed four analytical themes outlining participant-reported influences on trial non-retention, specifically related to trial processes (ie, data collection for outcome measures): fluctuating health, balancing trial burdens, navigating life as a trial participant and managing expectations of participation.
This review generates important insights into participants’ reasons for trial non-completion linked to outcome data collection. It highlights the need for further research into trial recruitment discussions that set clear, realistic expectations for potential participants, and into strategies that recognise and, where possible, address the influences on participants, in order to improve outcome data completeness and ultimately trial retention.