by Ana Caroline Bini de Lima, Vanessa Cristini Sebastião da Fé, Maria Simara Palermo Hernandes, Emily Caroline Pfeifer de Cristo, Ana Gabrieli dos Santos Fagundes Euzébio, Maria Vitória e Silva Sousa, Fabiana Ribeiro Caldara, Viviane Maria Oliveira dos Santos
This study aimed to evaluate the ability of social noncontact environmental enrichment to facilitate social buffering and to characterize the emotional experience of horses subjected to restraint in stocks by assessing physiological parameters and facial expressions. Pantaneiro horses (n = 11) were evaluated in a crossover design with two treatments: social noncontact enrichment during stock restraint and social isolation during stock restraint. Physiological parameters (heart rate, heart rate variability, respiratory rate, ocular temperature by infrared thermography, and auricular temperature by infrared thermometer) and facial expressions (EquiFACS) were assessed throughout the 24-minute restraint period. When horses were accompanied by a conspecific, heart rate, respiratory rate, and eye temperature were significantly lower. The frequency of facial expressions indicative of stress, including nostril dilator (AD38), inner brow raiser (AU101), upper eyelid raiser (AU5), eye white increase (AD1), ears forward (EAD101), and ears back (EAD104), was also significantly lower.
by Zhilan Huang, Tingyi Xie, Mingwen Tang, Zhuni Chen, Dan Jia, Anqi Su, Zhujin Jin, Tuliang Liang, Wei Xie
Background: Pulmonary fibrosis is a severe chronic lung disease whose prevalence has been rising in recent years, representing one of the major respiratory health challenges globally in the 21st century. The burden of this disease on the elderly population is garnering growing attention, particularly as the global population ages. The Global Burden of Disease (GBD) study has provided valuable insights; however, systematic analyses focused on this condition remain limited. To date, few studies have specifically examined interstitial lung disease and pulmonary sarcoidosis (ILD&PS) among individuals aged 55 years and older. This study aims to conduct a comprehensive analysis of burden trends from 1990 to 2021 for those aged 55 and above and to project future trends up to 2035.
Methods: Our approach estimates four broad component measures: incidence, prevalence, deaths, and disability-adjusted life years (DALYs), using data on ILD&PS from the GBD 2021 database. Joinpoint regression models were applied to calculate the average annual percentage change (AAPC), to analyze temporal trends in disease burden, and to identify years with significant trend shifts. Analyses were further stratified by age, sex, region, country, and Sociodemographic Index (SDI). Additionally, a Bayesian age-period-cohort (BAPC) model was used to project future disease burden trends.
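For context on the AAPC figures reported below: in joinpoint analysis, each segment's slope b on the log-rate scale corresponds to an annual percent change of (e^b - 1) x 100, and the AAPC back-transforms the length-weighted average of the segment slopes. A minimal Python sketch with a hypothetical single-segment fit, not the study's actual model:

```python
import math

def aapc(segments):
    """AAPC from fitted joinpoint segments.

    segments: list of (slope_on_log_rate_scale, n_years) tuples,
    one per joinpoint segment. Returns percent per year.
    """
    total_years = sum(n for _, n in segments)
    # length-weighted average slope, back-transformed to a percentage
    weighted_slope = sum(b * n for b, n in segments) / total_years
    return (math.exp(weighted_slope) - 1) * 100

# hypothetical single segment whose slope corresponds to +1.09%/year
print(round(aapc([(math.log(1.0109), 31)]), 2))  # 1.09
```

With multiple segments, steeper but shorter segments are down-weighted by their duration, which is what makes the AAPC a single summary of a piecewise trend.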
Results: Between 1990 and 2021, significant increases were observed in incidence, DALYs, and death rates for ILD&PS (AAPC incidence = 1.09, 95% CI: 1.04 to 1.15; AAPC DALYs = 1.10, 95% CI: 0.97 to 1.23; AAPC death = 1.65, 95% CI: 1.47 to 1.83). In 2021, the total number of incident cases reached 284,887 (95% UI 248,300–328,800), with the highest incidence rates observed in Andean Latin America. Across age- and sex-specific analyses, global burden trends were similar, though males consistently exhibited higher rates than females. The oldest age group (95+ years) had the highest incidence and DALY rates among all age strata. Furthermore, incidence rates increased most markedly in high-SDI regions, showing a strong positive correlation between SDI and incidence. BAPC analyses indicated that while prevalence rates are projected to decline slightly, incidence rates are expected to continue rising. Both males and females showed a dip-then-rise pattern in prevalence trends, with the increase more pronounced among females. In 2035, the highest number of incident cases is projected to occur in the 65–69 age group, whereas the highest incidence rate is predicted in the 95+ age group.
Conclusions: A concerning upward trend in incidence, DALYs, and deaths related to ILD&PS was observed in the global population aged 55 years and older, particularly among females. To our knowledge, this is the first study to comprehensively analyze the burden of ILD&PS in this age group from 1990 to 2021. Our findings on epidemiological trends and their variations across geography, SDI, age, and sex can inform policy-makers in designing targeted strategies to mitigate the anticipated rise in disease burden.
by Munawar Farooq, Uffaira Hafeez, Amir Ahmad, Susan Waller, Gabriel Andrade, Arif Alper Cevik, Syed Fahad Javaid
Background: Stress is a prevalent issue among university students and is linked to adverse academic and emotional outcomes. While research emphasizes the roles of resilience, personality traits, and psychosocial factors, most studies are drawn from North American and European contexts.
Objectives: This is the first study of its kind in the United Arab Emirates (UAE) exploring the relationship between perceived stress, resilience, and personality traits among university students, offering insights into region-specific influences on emotional well-being.
Methods: An online cross-sectional survey was conducted among 168 students from two colleges at the United Arab Emirates University (79% College of Medicine and Health Sciences, 21% College of Information Technology; 72% female). Data were analyzed using descriptive statistics and regression models in R version 4.2.0. Personality traits were assessed using the Ten-Item Personality Inventory, perceived stress was measured with the Perceived Stress Scale, and resilience was evaluated with the Brief Resilience Scale.
Results: The median perceived stress score was 22 (IQR: 17–28), and 30% reported high stress. Multivariable analysis showed that heavier academic workload, financial difficulties, lack of social support, lower physical activity, and poorer academic performance significantly predicted higher perceived stress, whereas resilience and emotional stability were protective.
Conclusion: University students’ perceived stress is closely associated with modifiable factors, including academic workload, social support, resilience, and physical activity. Targeted interventions, such as resilience training, promoting physical activity, optimizing academic schedules, and strengthening support services, are vital to reducing perceived stress and enhancing student well-being.
by José Manuel García-Moreno, Tyler Adams, Amber Beynon, Janine Vlaar Olthuis, Stephan U. Dombrowski, Richelle Witherspoon, Niels Wedderkopp, Jeffrey J. Hébert
Background: Rehabilitation and behavior change interventions are commonly used after lumbar surgery to improve recovery, but their effects on physical capacity and physical activity remain unclear. This study aimed to investigate the effectiveness of rehabilitation and behavior change interventions on physical capacity and physical activity behavior in patients following lumbar surgery for degenerative disease.
Methods: EMBASE, MEDLINE, PsycINFO, and CENTRAL were searched from inception to September 2025, and reference lists were hand-searched. Randomized controlled trials assessing rehabilitation or behavior change interventions on physical capacity or physical activity behavior in adults with lumbar degenerative disc disease who underwent lumbar surgery were included. Review author pairs independently extracted data and assessed the included studies. Risk of bias was assessed with the Cochrane tool, and study quality with the Grading of Recommendations Assessment, Development and Evaluation classification. Results were pooled using random-effects models and reported as standardized mean differences (SMD) with 95% confidence intervals (CI).
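The random-effects pooling described above is commonly implemented with the DerSimonian-Laird estimator; a minimal sketch, using illustrative effect sizes and variances rather than the review's data:

```python
import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes (e.g. SMDs)."""
    w = [1 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q measures observed heterogeneity around the fixed-effect mean
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    # method-of-moments estimate of between-study variance, floored at zero
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    w_star = [1 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# illustrative trial-level SMDs and variances (not the review's data)
smd, ci = pool_random_effects([1.2, 1.6, 1.8], [0.10, 0.15, 0.20])
```

When between-study heterogeneity (tau-squared) is zero, the estimator reduces to a fixed-effect inverse-variance average; otherwise weights are flattened toward equality, widening the confidence interval.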
Results: Exercise was more effective than minimal or usual care in improving trunk extension endurance in the immediate term (SMD, 1.54; 95% CI, 0.93–2.16). Supervised exercise outperformed self-directed exercise in improving trunk extension endurance in the immediate term (SMD, 1.28; 95% CI, 0.75–1.81). Psychologically informed rehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.26; 95% CI, 0.02–0.49), but not in the immediate term (SMD, 0.17; 95% CI, −0.14 to 0.49). Physical activity advice did not increase physical activity levels compared to minimal or usual care in the immediate term (SMD, 0.21; 95% CI, −0.13 to 0.55). Prehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.28; 95% CI, 0.03–0.53). Certainty of evidence ranged from low to moderate.
Conclusions: For adults with lumbar degenerative disease who underwent lumbar surgery, exercise, especially supervised programs, improved trunk extension endurance in the immediate term. Psychologically informed rehabilitation and prehabilitation increased physical activity levels in the intermediate term, while physical activity advice showed no benefit. Findings are limited by low certainty of evidence and high risk of bias.
by Emily Tufano, Kondaiah Palsa, Rebecka O. Serpa, Timothy B. Helmuth, Gabriela Remit-Berthet, Sara Mills-Huffnagle, Mathias Kant, Aurosman Sahu, James R. Connor
Iron is essential for normal physiological function, yet dysregulation of iron metabolism is increasingly recognized as a hallmark of cancers such as glioblastoma (GBM). Recent clinical evidence suggests that systemic iron deficiency anemia (IDA) negatively impacts GBM outcomes in a sex-dependent manner, but the mechanisms linking systemic iron availability to tumor iron metabolism remain poorly understood. Here, we interrogate the impact of systemic iron through dietary modulation (control, iron deficiency (ID), and high iron diets), stratified by sex, on tumor iron handling and GBM outcomes utilizing an immunocompetent (C57BL/6) GBM (GL261) mouse model. Subsequently, we analyzed clinical samples to evaluate translational value. In the preclinical study, we show that iron deficiency decreased survival in males but conferred a slight survival advantage in females, consistent with prior clinical trends. Among circulating iron markers, only ferritin light chain (FTL), but not ferritin heavy chain (FTH) or serum iron, positively correlated with survival in males but not females. In the brain, contralateral iron levels reflected dietary iron status in males but not females, further supporting sex-dependent regulation of local and circulating iron. Notably, tumor iron content remained unchanged in males but was significantly elevated in ID female tumors, accompanied by increased transferrin receptor (TfR1) and FTH expression. In clinical GBM samples, we observed non-statistically significant but similar survival trends across varying iron and ferritin levels, suggesting potential translational relevance of our exploratory model. These findings demonstrate that systemic iron availability exerts a sex-specific effect on tumor iron handling, highlighting a critical relationship between systemic and tumor iron regulation in GBM.
by Yuzhong Feng, Jiazhen Cui, Xuan Huang, Yupeng Li, Haolong Dong, Xianghua Xiong, Gang Liu, Qingyang Wang, Huipeng Chen
Uricase-based drugs excel at treating refractory hyperuricemia and tumor lysis syndrome by directly degrading uric acid but are limited by immunogenicity. Here, we engineered RAW264.7 macrophages with ectopic co-expression of Aspergillus flavus uricase and murine urate anion transporter 1 (URAT1), forming a “transport-degradation” system: URAT1 actively transports uric acid into cells for intracellular degradation. Recombinant lentiviral vectors carrying the target genes were transduced into RAW264.7 cells, followed by puromycin selection. In vitro assays showed that the engineered macrophages nearly completely degraded uric acid (from 556.0 ± 37.0 μmol/L to 0.7 ± 0.6 μmol/L) at 72 h. URAT1 inhibition with benzbromarone abolished uric acid degradation in URAT1-expressing cells. In both acute dietary-induced and chronic genetic hyperuricemic mouse models, RAW-afUri-URAT1 exerted robust and sustained uric acid-lowering activity, maintaining serum uric acid at 77.14 ± 37.48 μmol/L on day 16 in yeast extract-gavaged mice and normalizing serum uric acid to 76.2 ± 15.9 μmol/L in liver uricase conditional knockout mice, both significantly superior to the rebound levels observed in mice treated with Rasburicase (143.19 ± 38.21 μmol/L and 142.4 ± 17.4 μmol/L, respectively).
by Changze Ou, Binbin Chen, Jun Deng, Huajun Long
Background: Histone deacetylases (HDACs) regulate neuroprotection; however, the molecular mechanisms and core targets of Trichostatin A (TSA), an HDAC inhibitor, in Alzheimer’s disease (AD) remain unclear, limiting clinical translation. This study aimed to decipher TSA’s AD-regulating network, screen core genes, and support early AD diagnosis and multi-target therapies.
Methods: TSA targets were computationally predicted. Five GEO AD datasets were analyzed for differentially expressed genes and core modules, and 130 machine learning algorithms were employed to identify core genes. Functional annotation, immune cell analysis, and single-cell expression profiling were conducted. Molecular docking and 100 ns molecular dynamics simulations verified TSA-protein interactions.
Results: A total of 949 potential TSA targets were identified, overlapping with AD differentially expressed genes and enriched in key pathways such as GABAergic synapse and tau phosphorylation. Eight machine learning-identified core genes (EFNA1, GABRB2, GABARAPL1, EGR1, CDK5, KCNC2, MET, GRIA2) exhibited a distinct AD expression pattern: synergistic downregulation of protective genes and unique upregulation of pathological EFNA1. These genes are implicated in neurotransmission, synaptic plasticity, tau clearance, and immune-neural crosstalk. Molecular dynamics simulations suggested that TSA may not stably bind these candidates, implying that its regulation relies on epigenetic mechanisms via HDAC1–3/6 inhibition, potentially restoring gene network balance and disrupting neuroinflammation-neurodegeneration cycles. Complex regulatory modes and cell type-specific expression were also observed.
Conclusion: This study provides preliminary insights into TSA’s putative mechanisms in AD intervention, highlighting the eight candidate core genes’ potential diagnostic and therapeutic value as AD biomarkers and supporting TSA’s multi-target therapy. All findings are computationally derived and require experimental verification.
by Sandra S. Chaves, Valérie Bosch Castells, Ainara Mira-Iglesias, Joan Puig-Barberà, F. Xavier López-Labrador, Miguel Tortajada-Girbés, Mario Carballido-Fernández, Joan Mollar-Maseres, Germán Schwarz-Chávarri, Javier Díez-Domingo, Alejandro Orrico-Sánchez, Valencia Hospital Network for the Study of Influenza and other Respiratory Viruses (VAHNSI)
Background: Understanding the burden of acute viral respiratory infection-related hospitalizations is crucial for guiding research and development. Unlike influenza, respiratory syncytial virus (RSV), or severe acute respiratory syndrome coronavirus 2, no pharmaceutical interventions exist for other respiratory viruses; therefore, their impact remains poorly characterized. This study aimed to investigate the association of current non-vaccine-preventable respiratory viruses, especially rhinovirus/enterovirus (RV/EV), with hospitalizations during respiratory seasons.
Methods: Data from a prospective study that used multiplex polymerase chain reaction for long-term surveillance of respiratory viruses in Valencia, Spain, were analyzed. Patients aged ≥50 years hospitalized due to respiratory illness during the 2014–15 to 2019–20 seasons were included.
Results: Respiratory viruses were detected in 35.2% (3,755/10,675) of hospitalized patients with acute respiratory illness. Influenza and RSV accounted for 22.1% of hospitalizations, RV/EV for 7.6%, and other non-vaccine-preventable viruses for 5.4%. Adults aged ≥75 years had average seasonal hospitalization incidence rates more than twice those of adults aged 65–74 years and eight times those of adults aged 50–64 years. No significant differences in severity markers were observed between patients with and without a virus identified; however, those aged ≥75 years had a 2–3 times higher mortality rate than younger age groups.
Conclusions: The potential impact of respiratory viruses on hospitalization rates among older adults, particularly those aged ≥75 years, highlights the need for targeted interventions to reduce healthcare system burden. Enhanced diagnostic capabilities and the development of next-generation preventive strategies, including vaccines and therapeutics, could improve patient outcomes and strengthen the resilience of the healthcare system during respiratory virus seasons.
by Baraa E. Elawy, Chadi E. Soukkarieh, Abdul Q. Abbady, Shaza A. Allaham, Georges M. Deeb
To achieve pain relief without the tolerance and dependence risks associated with conventional opioids such as morphine, researchers designed AT121 as a potent, safer alternative. In this study, we evaluated the analgesic and neurochemical effects of AT121, a bifunctional partial agonist at the mu opioid and nociceptin/orphanin FQ peptide (NOP) receptors, compared with morphine, by measuring dopamine neurotransmitter concentrations in hippocampal neurons and action potentials in cortical neurons isolated from newborn BALB/c mice. These in vitro effects help us predict and assess AT121’s prospects in vivo. AT121, which activates Gi/Go protein pathways while blocking the β-arrestin pathway, significantly delayed action potential generation, prolonged spike duration, and reduced amplitude, without altering firing thresholds or inducing tolerance over a two-hour window. In contrast, morphine produced similar analgesic effects but with a higher risk of tolerance. Co-administration of AT121 and morphine improved these changes, whereas naloxone failed to reverse AT121’s effects, suggesting distinct receptor interactions. Dopamine quantification in hippocampal culture media revealed that morphine, alone or combined with AT121, markedly elevated extracellular dopamine, consistent with morphine’s reinforcing properties. Notably, AT121 alone led to significantly lower dopamine levels compared with control, indicating a reduced risk of triggering reward-related pathways. Together, these findings highlight AT121 as a promising candidate for both acute and chronic pain management, offering potent analgesia with a lower likelihood of tolerance and addiction following chronic opioid exposure.
Staphylococcus aureus (S. aureus) bacteraemia is a common and severe infection. With mortality rates ranging from 20% to 30% and long-term impairments in over a third of survivors, better treatments are urgently needed.
Linezolid, a well-established treatment for pneumonia and complicated skin infections, has been shown in preclinical studies to strongly suppress S. aureus virulence factors critical to bacterial persistence and tissue damage. Hence, we aim to investigate whether the addition of linezolid to standard therapy in patients with S. aureus bacteraemia leads to an overall improvement in patient-relevant outcomes.
We will conduct a two-arm, parallel-group, multicentre, randomised controlled trial (Linezolid Plus Standard of Care) in 12 hospitals in Switzerland with blinded treating physicians, patients and outcome assessors. Hospitalised patients aged ≥18 years with S. aureus bacteraemia will be eligible. Patients will receive standard antibiotic treatment as prescribed by the treating physician. Within 72 hours of collection of the blood sample yielding the first positive blood culture, patients will be enrolled and randomised 1:1 to receive either adjunctive linezolid (600 mg orally two times per day for 5 days) or placebo. To determine patient-relevant outcomes, we implemented a comprehensive patient-representative consultation process. Consequently, we will use the desirability of outcome ranking (DOOR) established for S. aureus bacteraemia as the primary outcome at 90 days. The hierarchical composite DOOR outcome includes the following four components, ranked from most to least important: (1) survival, (2) return to level of function before S. aureus infection, (3) complications leading to treatment changes and serious adverse reactions; and (4) hospital length of stay. This approach will allow us to analyse the win ratio, that is, whether patients receiving linezolid have a better DOOR rank compared to patients in the placebo group. We calculated a target sample size of 606 patients providing 90% power at a two-sided significance level of 0.05.
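The win-ratio analysis of the hierarchical DOOR outcome can be sketched as follows: each treatment patient is compared with each control patient on their DOOR rank (1 = best), and the win ratio is the number of wins divided by the number of losses. The ranks below are toy values, not trial data:

```python
def win_ratio(treatment_ranks, control_ranks):
    """Pairwise win ratio for an ordinal (DOOR-style) outcome; lower rank wins."""
    wins = losses = 0
    for t in treatment_ranks:
        for c in control_ranks:
            if t < c:
                wins += 1
            elif t > c:
                losses += 1
            # equal ranks are ties and count for neither side
    return wins / losses if losses else float("inf")

# toy example: the treatment arm tends toward better (lower) DOOR ranks
print(win_ratio([1, 1, 2, 3], [2, 3, 3, 4]))  # 12.0
```

A win ratio above 1 favours the treatment arm; the hierarchical DOOR ordering means survival differences dominate before functional recovery, complications, or length of stay are compared.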
Ethical approval was received from the Ethics Committee for Northern and Central Switzerland (BASEC number 2025-00655). Eligible patients will be informed about the study by the local study team and asked for written consent if they wish to participate. For patients unable to provide informed consent, an appropriate substitute (ie, a close relative or a physician not involved in the research project) may make decisions based on the presumed wishes and the best interest of the patient. The patient’s own consent will be obtained as soon as their condition permits. Results will be published in peer-reviewed journals and communicated in lay terms through various channels (social media, Swiss national portal HumRes).
Objectives: To examine the risk of severe cardiovascular (CV) events in patients with chronic obstructive pulmonary disease (COPD) across different time periods following COPD exacerbations and the incidence rate of cardiopulmonary events in a real-world setting in China.
Design: Retrospective cohort study.
Setting: Regional electronic health records database from Yinzhou District of Ningbo City, China.
Participants: A total of 14 713 patients aged ≥40 years with a first COPD diagnosis between 1 January 2014 and 1 March 2022.
Outcome measures: The risk of severe CV events (ie, hospitalisation and a primary or secondary discharge code for acute coronary syndrome, heart failure decompensation, cerebral ischaemia, arrhythmia and CV-related death) during different exposed time periods following a COPD exacerbation, the incidence rate of overall cardiopulmonary events (ie, severe exacerbation of COPD, all-cause mortality, inpatient CV events, inpatient ischaemic stroke and inpatient tachyarrhythmia/atrial fibrillation) and the incidence rate stratified by COPD exacerbation history.
Results: We included a total of 14 713 patients. During a median (IQR) follow-up of 2.8 (4.0) years, 20.1% experienced severe CV events. Compared with the unexposed period, the risk of severe CV events was highest in the first 10 days following a COPD exacerbation (adjusted HR 10.00, 95% CI 8.16 to 12.25). The risk of severe CV events decreased over time but remained significantly elevated up to 90 days post exacerbation. We found that 32.7% of patients with COPD experienced cardiopulmonary events, with a crude incidence rate of 9.38 (95% CI 9.09 to 9.69) per 100 person-years.
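For reference, a crude incidence rate per 100 person-years is simply the event count divided by person-years at risk, scaled by 100; a one-function sketch with toy numbers, not the study's counts:

```python
def rate_per_100py(events, person_years):
    """Crude incidence rate per 100 person-years."""
    return events / person_years * 100

# toy example: 47 events observed over 500 person-years of follow-up
print(rate_per_100py(47, 500))  # ~9.4 per 100 person-years
```

Person-years, rather than patient counts, form the denominator so that subjects with different follow-up durations contribute proportionally to the rate.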
Conclusions: This study is the largest retrospective cohort study investigating CV and cardiopulmonary events among patients with COPD in China. Our findings highlight an elevated risk of CV events closer to the time of COPD exacerbations and show that nearly one-third of patients with COPD experience cardiopulmonary events.
Visual impairment is reported to affect 40%–50% of children with cerebral palsy (CP). Vision difficulties in the context of rehabilitation are often under-recognised, under-treated and therefore under-studied, pointing to an urgent need for the development of evidence-based vision interventions for infants and toddlers with cerebral vision impairment (CVI). We present the protocol of a multisite pragmatic pilot randomised controlled trial (RCT) of feasibility, acceptability and preliminary efficacy of an early vision-awareness and parent-directed environmental enrichment programme for infants with or at risk of CP under 7 months corrected age (CA) with vision impairment.
The main objective is to determine the feasibility and acceptability of the Vision Intervention for Seeing Impaired Babies: Learning through Enrichment (VISIBLE) intervention. We will estimate the preliminary effects of the programme on infants’ visual functions and early development, as compared with standard community-based care (SCC).
A two-group RCT will be conducted. Infants at 3–6 months at entry, with severe visual impairment and at high risk of CP, will be enrolled and randomised (n=16 per group) to receive the VISIBLE intervention compared to SCC. Randomisation will be completed through an independent automated process (Research Electronic Data Capture). The VISIBLE intervention will be delivered by a therapist through home visits (90–120 min) once every 2 weeks. Completion of 10 visits (80% of the intervention target dose) within 6 months is required for adherence to the VISIBLE trial. Outcomes will be assessed at 12 months CA. Visual function will be evaluated with the Infant Battery for Vision, and motor outcomes with the Peabody Developmental Motor Scales, Second Edition. Developmental quotients, infant quality of life, parent well-being and the parent-infant relationship will also be monitored through standardised tools.
The enrolling sites have historically demonstrated rapid and effective translation of successful evidence-based interventions into routine clinical practice, as well as the dissemination of the findings through local, national and international scientific meetings.
Trial registration number: ACTRN12618000932268.
Poor participant retention in randomised clinical trials, resulting in missing outcome data, can impact the validity, reliability and generalisability of results. While participants’ views on general non-retention issues have been reported elsewhere, a qualitative evidence synthesis specifically focusing on trial processes (ie, outcome data collection) impacting retention has not been undertaken to date. This is an important research question to inform targeted interventions to support retention. This review aims to address this by systematically searching and synthesising the evidence on participant reasons for trial non-completion, linked to outcome data collection.
We conducted a qualitative evidence synthesis of qualitative studies and mixed methods studies with a qualitative component, in Embase, Ovid MEDLINE, PsycINFO, Cochrane Central Register of Controlled Trials (CENTRAL), Social Science Citation Index, Cumulative Index of Nursing & Allied Health Literature and Applied Social Sciences Index and Abstracts, up to February 2025. We used Thomas and Harden’s thematic synthesis approach. The Grading of Recommendations Assessment, Development and Evaluation-Confidence in the Evidence from Reviews of Qualitative framework was used to assess confidence in the review findings.
We identified 11 studies reporting qualitative data from 14 separate trials, with findings from 105 trial non-retainers. The studies were undertaken between 2007 and 2025.
There were three types of participant non-retention behaviours reported across the studies, where participants either: (1) missed at least one clinic visit; (2) did not complete a postal questionnaire or (3) did not complete online data collection. We developed four analytical themes outlining participant-reported influences on trial non-retention, specifically related to trial processes (ie, data collection for outcome measures): fluctuating health, balancing trial burdens, navigating life as a trial participant and managing expectations of participation.
This review generates important insights into participants’ reasons for trial non-completion linked to outcome data collection. The review highlights the need for further research into supporting trial recruitment discussions that provide clear, realistic expectations for potential trial participants, as well as strategies that recognise, and where possible, address some of the influences on participants to improve outcome data completeness and ultimately improve trial retention.
Despite its serious impact, anorexia nervosa (AN) remains one of the least understood mental illnesses, with significant gaps in effective treatment options. No medications have been deemed effective, and only 50% of individuals respond to conventional psychotherapies. Gastrointestinal (GI) bacteria have been found to be altered in individuals with AN. While fecal microbiota transplantation (FMT) has shown potential for alleviating anxiety and depression, its effects remain understudied in individuals with AN. This study aims to determine whether oral capsular FMT is acceptable to adolescents with AN and results in clinical improvement in weight and/or psychological symptoms.
This study will randomise 20 adolescents with AN, ages 12–17 years, to receive either FMT or placebo capsules. These 20 youth, as well as an additional 10 youth who decline trial enrolment, will participate in qualitative interviews. We will track recruitment rates and collect psychological and biological measures (blood, stool, urine and saliva) at multiple timepoints to assess how gut microbiota and their metabolites may influence the symptoms of AN. Interviews with participants and caregivers will explore their experiences and views on FMT as a treatment approach.
This study has received ethics approval by the Hamilton Integrated Research Ethics Board (#17493) and investigational drug approval by Health Canada (Dossier ID: c292423). Informed consent will be obtained by research staff from all participants. Findings will be disseminated through academic conferences, clinical forums and partnerships with advocacy organisations to reach clinicians, researchers and individuals with lived experience.
Deprescribing is important because inappropriate polypharmacy increases the risk of adverse drug events, treatment burden, reduced adherence and healthcare costs, while potentially compromising patient safety and quality of life. This study aimed to investigate the perceived barriers and enablers experienced by healthcare professionals (HCPs) in Indonesia regarding deprescribing in patients with type 2 diabetes (T2D) and polypharmacy.
Design: A qualitative study using focus group discussions (FGDs) and thematic analysis.
Four FGDs were conducted with general practitioners, specialists (internists) and pharmacists from healthcare facilities in West Java Province, Indonesia. Each group included 3–4 participants from the same discipline, with one mixed group that included one participant of each profession. In total, 13 participants were included in the study.
HCPs across disciplines recognised the goals of deprescribing as optimising treatment, reducing polypharmacy risks and preserving treatment outcomes. However, implementation was hindered by the lack of clear guidelines, hierarchical dynamics, limited training and resource constraints, particularly in rural and high-volume settings. Enablers included clinical competence, effective communication, access to comprehensive clinical data and interprofessional collaboration. Patient education level, family support and community engagement were also key, underscoring the need for system-level support and shared decision-making to achieve effective deprescribing.
Deprescribing in T2D with polypharmacy is shaped by clinical competence, interprofessional collaboration, patient engagement and system-level resources. Improving practice in Indonesia requires clear guidelines, targeted HCP training, stronger interprofessional communication, better access to patient data and active involvement of patients and families. These strategies could provide context-specific insights to guide practice and policy on deprescribing initiatives.
766/UN6.KEP/EC/2024
Objectives: To develop an empirically grounded, activity-based tariff framework for Hospital at Home (HaH) services using time-driven activity-based costing (TDABC) and microcosting to support transparent and equitable reimbursement for acute elderly care delivered at home.
Design: Microcosting study embedded within a randomised controlled trial (RCT) comparing HaH with conventional hospital admission in Denmark.
Setting: Three municipalities in the Central Denmark Region in collaboration with emergency department physicians at a regional hospital.
Participants: A consecutive subsample of 107 elderly acute patients enrolled in the RCT between June 2022 and February 2024. Resource use for HaH activities was measured prospectively using microcosting logs, time-motion observations and administrative records.
Empirically derived tariffs per HaH visit (first and subsequent) calculated using an eight-step TDABC framework incorporating process mapping, resource identification, capacity cost rates and time equations. Sensitivity analyses tested robustness to variation in key cost drivers.
The mean total tariff was 338.89 (95% CI 310.94 to 351.49) for first visits and 207.81 (95% CI 200.70 to 215.69) for subsequent visits, including treatment and transport components. Staff time was the principal cost driver, while equipment, overhead and travel reimbursement had smaller effects. The framework accommodates variation in staffing, geography and visit intensity and can be used to estimate total costs across diverse HaH pathways.
A transparent and reproducible tariff-development framework for HaH services was established using TDABC and microcosting. The model aligns reimbursement with actual resource use and care complexity and provides a transferable template for economic evaluation and operational planning.
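The core TDABC arithmetic described above (capacity cost rates combined with time equations to price a visit) can be sketched as follows. All resource names, annual costs, and minute values here are hypothetical illustrations, not figures from the study.

```python
# Minimal sketch of a time-driven activity-based costing (TDABC) tariff
# calculation for one Hospital at Home visit. All numbers are invented
# for illustration only.

def capacity_cost_rate(annual_cost: float, practical_minutes: float) -> float:
    """Cost per minute of capacity: resource cost / practical capacity supplied."""
    return annual_cost / practical_minutes

def visit_tariff(activities, rates) -> float:
    """Tariff = sum over mapped activities of (minutes used * cost rate/minute)."""
    return sum(minutes * rates[resource] for resource, minutes in activities)

# Hypothetical resources: a nurse's salaried time and a service vehicle.
rates = {
    "nurse": capacity_cost_rate(annual_cost=60_000.0, practical_minutes=90_000.0),
    "vehicle": capacity_cost_rate(annual_cost=9_000.0, practical_minutes=90_000.0),
}

# Hypothetical first-visit process map: (resource, minutes) for
# bedside care, travel, and documentation respectively.
first_visit = [("nurse", 60), ("vehicle", 20), ("nurse", 15)]

print(round(visit_tariff(first_visit, rates), 2))  # prints 52.0
```

Sensitivity analysis of the kind the study describes would amount to re-running `visit_tariff` while varying the cost-driver inputs (staff time, cost rates, travel minutes).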
Self-injurious behaviour (SIB) consists of persistent, repetitive movements that can result in serious injury without suicidal intent. These behaviours are prevalent among children with neurodevelopmental disorders, including profound autism. Although many individuals benefit from currently available therapies, some exhibit treatment-refractory SIB that necessitates ongoing use of personal protective equipment and restraint, presumably due to stronger neurobiological drivers. We recently completed a phase I, open-label clinical trial demonstrating the safety, feasibility and preliminary efficacy of bilateral deep brain stimulation targeting the nucleus accumbens (NAc-DBS) in children with profound autism and severe, refractory SIB. The objective of the proposed study is to characterise the effectiveness of NAc-DBS in treating severe, refractory SIB in this unique and vulnerable population.
A single-centre, randomised, double-blind, crossover trial is proposed. Informed by the results of our pilot study, 25 subjects with autism spectrum disorder and severe, refractory SIB will undergo bilateral NAc-DBS. Following a 4-week recovery period, participants will be randomised to either group A (stimulation ON then OFF) or group B (stimulation OFF then ON). Each block will last 12 weeks, separated by a 2-week washout period. Following completion of the second block, all participants will enter a 6-month open-label phase with stimulation ON. The primary outcome is the difference in the Repetitive Behaviour Scale–Revised total score between DBS-ON and DBS-OFF conditions. Secondary outcomes include measures of quality of life, caregiver burden, daily logs of SIB events and direct observation of SIB under structured analogues.
The proposed trial has been approved by the institutional Research Ethics Board (1000081171). Trial results will be disseminated through peer-reviewed publications and conference presentations.
Guided by Straussian Grounded Theory, this study aimed to explore patients’ dynamic trade-off processes in evaluating bariatric surgery outcomes and to construct a patient-centred theoretical framework to inform clinical assessment and intervention.
A qualitative study using Straussian Grounded Theory. Semi-structured, in-depth interviews were conducted, and data were analysed using open, axial and selective coding. Reporting followed the Standards for Reporting Qualitative Research guidelines.
This study was conducted at a tertiary hospital in China between June 2023 and August 2023.
A total of 11 patients who had undergone bariatric surgery were enrolled, aged 21–54 years, with postoperative follow-up durations ranging from 1 to 10 years.
A core category—Dynamic Trade-off Evaluation of Bariatric Surgery Outcomes—was identified, characterised by dynamism, trade-off and subjectivity. The framework comprises four inter-related components: trade-off basis, trade-off moderation, trade-off process and comprehensive evaluation. Outcome evaluation emerged as a non-linear process progressing through four stages: burden-dominant, contradiction-coexistence, contradiction-persistence and meaning-reconstruction stages. Individual goal orientation and psychological resilience served as key moderating factors shaping evaluative trajectories.
This study proposes a novel theoretical framework elucidating how patients dynamically evaluate bariatric surgery outcomes. By revealing stage-specific mechanisms and moderating factors, the framework provides a theoretical basis for improving preoperative expectation management and postoperative support.
Chronic wounds represent a major global health and economic burden. Smart wound dressings integrate biosensing and stimuli-responsive materials to monitor and modulate biological parameters within the wound microenvironment. This scoping review maps the biological parameters monitored by smart wound dressings, an area not previously synthesized across preclinical and clinical contexts. Following Joanna Briggs Institute (JBI) and PRISMA-ScR frameworks, five databases were searched in March 2025. Studies published between 2008 and 2025 reporting biosensing or responding technologies in wound dressings were included. A total of 179 studies met the inclusion criteria, most being preclinical (in vitro or in vivo rodent models), with few human investigations. The most frequently monitored parameters were pH, temperature, oxygenation, moisture, bacterial burden, and protease activity (particularly MMP-9). Preclinical data showed enhanced collagen deposition, angiogenesis, and infection control compared with conventional dressings, whereas human studies mainly assessed feasibility and biocompatibility. Smart dressings demonstrate strong technical and biological performance, but clinical validation and standardized outcome reporting remain limited. Future interdisciplinary research should prioritize well-designed clinical trials to confirm therapeutic and economic benefits and enable translation into personalized wound care.
Calcium sulphate (CS) is a fully synthetic, sterile, bioabsorbable biomaterial extensively applied for the management of infected tissues and postoperative dead spaces resulting from surgical interventions. Residual dead space may facilitate haematoma accumulation and bacterial colonisation, thereby heightening the risk of surgical-site infections. Within orthopaedic surgery, CS has been predominantly evaluated as a bone-void filler and an off-label antibiotic delivery vehicle—particularly in arthroplasty revisions, chronic osteomyelitis, and open fractures—yielding high rates of infection prophylaxis and bone regeneration with low complication profiles. Commercially available as injectable ‘pearls’ or beads, CS permits local, sustained antibiotic elution while undergoing gradual biodegradation, thus obviating the need for secondary removal procedures. Over the last decade, calcium sulphate beads (CSBs) have transcended orthopaedics, gaining traction across general, vascular, and endocrine surgery disciplines for the prevention and treatment of complex wound infections. However, their application in plastic and reconstructive surgery remains underreported, despite the specialty's frequent engagement with complex soft-tissue defects, bone exposure, suture dehiscence, and trauma-related wounds vulnerable to infection. To our knowledge, this represents the first scoping review synthesising current evidence, clinical indications, and emerging roles of CSBs within plastic and reconstructive surgery.