Pilonidal sinus disease (PNS) in children and adolescents lacks standardised management pathways. Minimally invasive and outpatient-based strategies are increasingly adopted, but paediatric-specific data remain limited. This study evaluated outcomes following implementation of a structured, tiered outpatient pathway. A retrospective single-centre cohort study was conducted including patients aged ≤18 years treated for PNS between February 2023 and August 2024. Management followed a stepwise protocol: structured conservative care, in-clinic debridement and operative intervention (trephination or limited excision) for refractory or severe disease. The primary outcome was recurrence after documented healing. Secondary outcomes included time to healing, clinic utilisation and associations with clinical variables. Sixty-nine patients were included (median age 15 years [IQR 14–16]; 64% male). Twenty-three patients (33.3%) required operative management. Recurrence occurred in 6/23 (26.1%) in the operative group and 1/46 (2.2%) in the non-operative group (Fisher's exact p = 0.0045). Median follow-up duration did not differ significantly between groups. Prior infection at presentation showed a numerical but not statistically significant association with recurrence. Time to healing was prolonged in both groups and did not differ significantly. Within a structured outpatient pathway, paediatric patients demonstrated low overall recurrence rates. Conservative management was associated with lower recurrence; however, patients undergoing operative intervention likely represented a more severe subgroup. Prospective severity-adjusted studies are required to define optimal paediatric wound management strategies for PNS.
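The operative versus non-operative recurrence comparison can be reproduced from the counts reported above; a minimal sketch with SciPy (a check of the reported statistic, not part of the original analysis):

```python
from scipy.stats import fisher_exact

# 2x2 table built from the abstract's counts:
# rows = operative / non-operative, columns = recurrence / no recurrence
table = [[6, 23 - 6],    # operative group: 6 recurrences out of 23
         [1, 46 - 1]]    # non-operative group: 1 recurrence out of 46

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"p = {p_value:.4f}")  # matches the reported p = 0.0045
```

With these counts the two-sided exact p-value rounds to 0.0045, consistent with the value quoted in the abstract.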
This real-world study investigated changes in lipid-lowering therapy (LLT) use over the first year from baseline among patients with high or very high cardiovascular (CV) risk in the UK and in the group of all other European countries in the SANTORINI study. It also assessed the impact of this treatment on the attainment of risk-adjusted low-density lipoprotein cholesterol (LDL-C) goals set by the National Institute for Health and Care Excellence (NICE) and by the 2019 European Society of Cardiology (ESC)/European Atherosclerosis Society (EAS) dyslipidaemia guidelines.
Secondary analysis of the SANTORINI dataset (an international, prospective, observational, non-interventional study (NCT04271280)).
Primary and secondary care centres in the UK and the group of other European countries (Austria, Belgium, Denmark, Finland, France, Germany, Ireland, Italy, the Netherlands, Portugal, Spain, Sweden and Switzerland).
663 UK patients with high and very high CV risk were included in this analysis and 8502 from the group of other European countries. Of these, 380 UK patients and 6830 from the group of other European countries had LDL-C information available at baseline and 1-year follow-up.
The primary objectives were to describe patients’ lipid management and LDL-C levels at 1-year follow-up, and their attainment of the 2023 NICE LDL-C goal (≤2.0 mmol/L) and the 2019 ESC/EAS guideline-recommended LDL-C goals (<1.8 mmol/L for high risk and <1.4 mmol/L for very high risk).
Over the course of 1-year follow-up, the overall proportion of UK patients on no LLT decreased from 20.4% at baseline to 7.1%, similar to that observed in the group of other European countries (baseline: 20.9%, 1 year: 3.0%). The proportion of UK patients receiving LLT monotherapy increased from 74.8% at baseline to 84.9%, higher at both time points than that observed for the group of other European countries (baseline: 52.0%, 1 year: 55.0%). The use of any combination therapy increased slightly from baseline to 1 year in the overall UK cohort (4.9% vs 7.1%) and more markedly in the group of all other European countries (27.1% vs 40.2%). Overall, mean (SD) LDL-C levels in the UK were 2.5 (1.2) mmol/L at baseline and 2.1 (1.0) mmol/L at 1 year, and for the group of other European countries were 2.4 (1.2) mmol/L at baseline and 2.0 (0.9) mmol/L at 1 year. The overall proportions of UK patients achieving the NICE treatment goal and the ESC/EAS 2019 guideline goals at baseline versus 1-year follow-up were 40.3% vs 52.6% and 22.9% vs 32.9%, respectively; 21.1% and 30.9% of patients in the group of other European countries achieved the ESC/EAS 2019 guideline goals at baseline and 1-year follow-up, respectively.
In this UK-focused analysis of the SANTORINI study, use of LLT increased modestly over 1 year, accompanied by a reduction in average LDL-C levels. However, mean LDL-C remained above the NICE goal, and attainment of both NICE and ESC/EAS LDL-C thresholds remained suboptimal. The findings highlight continued opportunities to optimise lipid management in UK clinical practice, including the potential for broader use of combination therapies.
Inadequate emergence is a common postoperative complication in elderly patients following major abdominal surgery. This study was designed to determine its incidence, identify associated risk factors and characterise its clinical subtypes within this high-risk cohort.
This prospective single-centre cohort study was conducted at a comprehensive specialised tertiary care hospital in Northwest Ethiopia. Consecutive patients aged 65 years and older scheduled for elective major abdominal surgery under general anaesthesia were enrolled.
The primary outcome was the proportion of patients experiencing inadequate emergence.
A total of 388 patients were analysed. Inadequate emergence occurred in 21.9% of participants (95% CI 14.3% to 31.6%), with hypoactive emergence observed in 10.7% and emergence delirium in 11.2%. Multivariable logistic regression identified several independent predictors, including advanced age (adjusted OR (AOR)=1.9; 95% CI 1.5 to 8.2), preoperative anxiety (AOR=2.7; 95% CI 1.2 to 7.2), prolonged preoperative fasting (AOR=2.1; 95% CI 1.8 to 9.1), non-ketofol-based induction (AOR=3.4; 95% CI 1.6 to 6.3), absence of abdominal field block (AOR=4.2; 95% CI 4.0 to 9.6), substantial intraoperative blood loss (>1000 mL; AOR=1.9; 95% CI 1.2 to 7.6), postoperative nausea and vomiting requiring antiemetics (AOR=2.2; 95% CI 2.1 to 7.1) and presence of an indwelling urinary catheter (AOR=2.4; 95% CI 1.8 to 7.9).
Inadequate emergence occurred in approximately one in five elderly patients undergoing elective major abdominal surgery. Independent predictors included advanced age, major intraoperative blood loss, postoperative nausea/vomiting requiring antiemetics, non-ketofol-based induction, preoperative anxiety, absence of abdominal field block, presence of an indwelling urinary catheter and prolonged preoperative fasting.
Maternal human milk feedings continue an offspring’s exposure to the programming stimuli of maternal metabolism during the postnatal period. While considerable research focuses on associations between in utero environments and offspring metabolic disease, few studies have been able to specifically measure how human milk composition modifies programming of children’s growth in conjunction with comprehensive measures of maternal glycaemia during pregnancy.
The Glycemia Range and Offspring Weight and adiposity in response To Human milk (GROWTH) Study is a longitudinal cohort enrolling women with a singleton pregnancy who (1) undergo serial testing of glycaemia during pregnancy and (2) are intending to provide their breast milk through direct breastfeeding or pumped milk as the primary nutrition for their infant. Enrolment started in October 2023 and is expected to be completed in December 2026. Key procedures include virtual lactation support visits, serial human milk sampling at three time points, maternal and infant blood sampling, serial maternal and child anthropometric measurements and diet assessment. After delivery, mother–child dyads are followed until children turn 2 years of age. The primary exposure variable is maternal glycaemia obtained from a fasting, 3 hour 100 g oral glucose tolerance test performed at 24–28 weeks of gestation, and the primary outcome measure is the composite of human milk linoleic and docosahexaenoic acid concentrations in milk samples collected at 1 month postpartum.
Lurie Children’s Hospital Institutional Review Board (IRB) provides central oversight of the GROWTH Study in conjunction with each participating centre’s IRB. The GROWTH Study data has the potential to inform perinatal health and future research in lactation and human milk science by providing comprehensive measures of human milk composition and early childhood growth and body composition parameters impacted by maternal metabolism in pregnancy.
by Shuhong Zheng, Renxiu Bian, Haixin Song, Zhiping Liao, Ting Gao, Min Yan, Heqing Huang, Zuodong Lou, Fangchao Wu, Jianhua Li
BackgroundLow-intensity focused ultrasound (LIFU) is a non-invasive neuromodulation technique with high spatial precision and the ability to reach deeper brain regions, offering potential advantages for post-stroke rehabilitation. Repetitive transcranial magnetic stimulation (rTMS) is a widely adopted non-invasive brain stimulation technique that modulates cortical excitability to promote neuroplasticity. However, direct head-to-head comparisons between these two modalities for post-stroke motor recovery remain limited.
ObjectiveTo perform a secondary head-to-head comparison of LIFU and rTMS for motor recovery after stroke, based on a prospectively registered randomized controlled trial.
MethodsThis secondary analysis included patients with subacute stroke who received two weeks of standard rehabilitation combined with either LIFU (n = 25) or rTMS (n = 25) targeting the ipsilesional primary motor cortex. LIFU parameters: 0.5 MHz, spatial-peak pulse-average intensity (ISPPA) 10.2 W/cm² (free-field), pulse duration 0.2 ms, duty cycle 20%, 20 minutes per session, five days per week for two weeks (10 sessions total). rTMS parameters: 10 Hz, 80% resting motor threshold, 1,000 pulses per session (20 trains of 5 seconds), 20 minutes per session, five days per week for two weeks (10 sessions total). Motor outcomes were assessed using the Fugl–Meyer Assessment (FMA; upper and lower extremities), Modified Barthel Index (MBI), and Brunnstrom stages. Resting-state functional near-infrared spectroscopy (fNIRS) was used to evaluate cortical activity and functional connectivity before and after the intervention. Primary analyses were conducted in the intention-to-treat (ITT) population (n = 50), with completer analyses (n = 43) performed as sensitivity analyses.
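The sonication timing above implies a pulse repetition frequency and a time-average intensity that the abstract does not state explicitly; a small sketch deriving them from the listed parameters (PRF and ISPTA here are inferred quantities, assuming a regular pulse train):

```python
# Derived LIFU quantities (inputs taken from the parameters listed above;
# PRF and ISPTA are inferred, not reported in the abstract).
pulse_duration_s = 0.2e-3   # pulse duration: 0.2 ms
duty_cycle = 0.20           # duty cycle: 20%
isppa_w_cm2 = 10.2          # spatial-peak pulse-average intensity (free-field)

# Duty cycle = pulse duration x PRF, so PRF = duty cycle / pulse duration.
prf_hz = duty_cycle / pulse_duration_s
# Spatial-peak time-average intensity = ISPPA x duty cycle.
ispta_w_cm2 = isppa_w_cm2 * duty_cycle

print(f"PRF ≈ {prf_hz:.0f} Hz, ISPTA ≈ {ispta_w_cm2:.2f} W/cm²")
```

Under these assumptions the protocol corresponds to roughly 1 kHz pulse repetition and a time-average intensity of about 2.04 W/cm².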
ResultsBoth groups showed significant within-group improvements in FMA and MBI after the intervention, with no significant between-group differences in post-intervention outcomes (p > 0.05), and completer analyses yielded consistent between-group conclusions. In contrast, change-from-baseline analyses demonstrated greater improvements in FMA scores in the LIFU group compared with the rTMS group (ΔFMA upper limb: median 7 [IQR 3–10.5] vs. 2 [1–3], p = 0.001; lower limb: 3 [1–4.5] vs. 1 [0–1.5], p < 0.001).
Conclusion
LIFU and rTMS were associated with comparable short-term motor outcomes in subacute stroke. Differences observed in change-from-baseline motor improvements and exploratory neuroimaging measures suggest potential divergence in recovery dynamics and cortical modulation, warranting further investigation in larger, longitudinal studies.
Trial registrationThis study was derived from a prospectively registered, three-arm randomized controlled trial in the Chinese Clinical Trial Registry (ChiCTR2500114687). The present manuscript reports a secondary head-to-head comparison between the two neuromodulation intervention arms.
by Amma Aboagyewa Larbi, Moses Etsey, Obed Brew, Bismark Koduah, Rosemond Enam Mawuenyega, Emmanuel Kobla Atsu Amewu, Nehemiah Kweku Essilfie, Solomon Wireko, Alexander Kwarteng, Ben Adu Gyan
The human gut microbiome, consisting of bacteria, archaea, fungi, and viruses, influences various physiological processes of the body. The gut microbiome composition is shaped by factors such as diet, geography, and antibiotic use. Malaria has been a global health challenge over the years, especially in low- and middle-income countries. This study investigated how asymptomatic malaria infection altered gut microbial communities in Ghanaian children, offering insights for novel malaria control strategies. Standard aseptic phlebotomy procedures were employed to collect venous blood samples for Plasmodium species detection. The gut microbial community was profiled by sequencing the 16S rRNA V4 region, and sequence data were processed using the DADA2 pipeline in R. Asymptomatic malaria infections were predominantly mixed with P. falciparum and P. malariae. Microbiome analysis revealed that Firmicutes and Bacteroidetes comprised nearly 70% of the total microbial population. Asymptomatic individuals showed a decrease in Firmicutes abundance from 52.5% to 44.0% and an increase in Bacteroidetes from 34.7% to 45.6%. There was also a slight increase in the abundance of Proteobacteria from 3.0% to 4.8%. At the genus level, Prevotella_9 was the most abundant and exhibited the highest variability in the infected groups. The Alloprevotella and Streptococcus genera increased in both infected groups, but Escherichia-Shigella was significantly elevated in only those with mixed infections. Faecalibacterium significantly declined in asymptomatic malaria-infected individuals compared to healthy controls, with variability further reduced in mixed infections. 
Beta-diversity analysis indicated a significant effect of malaria status on microbial composition (PERMANOVA, p < 0.05).
by Mequanent Dessie Bitewa, Thomas Kidanemariam Yewodiaw, Aysheshim Asnake Abneh, Mikias Getahun Molla, Mulat Belay Simegn, Tadele Sinishaw Jemere, Mequannt Alemu Endayehu, Aysheshim Belaineh Haimanot, Werkneh Melkie Tilahun, Atirsaw Assefa Melikamu, Tadele Derbew Kassie
BackgroundCervical cancer is preventable, yet it remains a leading cause of cancer death in women. About 90% of cases and 94% of deaths occur in low- and middle-income countries (LMICs). Limited access to screening drives high incidence and mortality. Screening is central to secondary prevention and global elimination efforts.
ObjectiveThis study aimed to assess the determinants of cervical cancer screening among women aged 30–49 years in low- and middle-income countries using a multilevel analysis.
MethodsA cross-sectional study used nationally representative data from 148,605 weighted women aged 30–49 years in 20 LMICs (2019–2024). Multilevel logistic regression identified factors associated with cervical cancer screening while accounting for cluster-level variation. Statistical significance was set at p < 0.05.
Result
Overall cervical cancer screening uptake was 14.03% (95% CI: 13.63–14.45%), ranging from 0.92% in Mauritania to 42.98% in Zambia. Higher screening was associated with older age (40–49 years) (AOR = 1.48; 95% CI: 1.41–1.54), occupation (AOR = 1.15; 95% CI: 1.10–1.21), contraceptive use (AOR = 1.38; 95% CI: 1.31–1.44), recent health-facility visit (AOR = 1.93; 95% CI: 1.84–2.02), prior abortion (AOR = 1.28; 95% CI: 1.22–1.34), female-headed households (AOR = 1.11; 95% CI: 1.05–1.18), high community education (AOR = 1.63; 95% CI: 1.49–1.79), and high media exposure (AOR = 2.54; 95% CI: 2.30–2.80). Lower uptake was observed among individuals in high-poverty communities (AOR = 0.63; 95% CI: 0.57–0.68), those with higher parity (1–4 births: AOR = 0.86; 95% CI: 0.78–0.94; five or more births: AOR = 0.66; 95% CI: 0.59–0.73), and those residing in rural areas (AOR = 0.89; 95% CI: 0.82–0.97).
ConclusionCervical cancer screening uptake in LMICs is far below the WHO 2030 target, with wide country disparities. Socio-demographic factors, health-facility contact, and community education increase uptake, while poverty and geographic barriers reduce it. Integrating screening into routine reproductive and maternal care, strengthening community and media education, and addressing structural barriers to access are essential to improving coverage.
by Jakob Brandstetter, Lea Goldstein, Tim Schreiber, Rupert Palme, Tobias Lindner, Markus Joksch, Bernd Krause, Brigitte Vollmar, Simone Kumstel
Pancreatic cancer is the third leading cause of cancer-related death, with a 5-year survival rate of only 10%. Preclinical studies remain essential for identifying novel therapeutic strategies, discovering biomarkers, and deepening the understanding of disease biology. The most frequent driver mutation in pancreatic cancer is the G12D mutation in the KRAS gene, present in approximately 90% of tumors. A recent study demonstrated complete regression of KRAS-driven pancreatic cancer upon systemic ablation of the upstream and downstream signaling proteins EGFR and C-RAF. Building on these findings, we investigated the therapeutic benefit of combining the EGFR inhibitor erlotinib with the novel pan-RAF inhibitor LXH-254. The anticancer effects of this combination were assessed in vitro in murine and human pancreatic cancer cell lines by evaluating cell proliferation, cell death and phosphorylation of key signaling proteins. Subsequent in vivo studies were performed in an orthotopic murine pancreatic cancer model and in genetically engineered KPC mice, using daily oral administration of LXH-254 (35 mg/kg) and erlotinib (75 mg/kg). While the treatment robustly inhibited MAPK signaling and caused significant anti-proliferative effects in vitro, it did not improve survival or reduce tumor burden in either in vivo model. These results contrast with previous reports of efficacy from monotherapies in xenograft models, highlighting the limitations of current preclinical approaches. Our findings underscore the need to develop more effective pathway-targeted inhibitors, and preclinical models that predict clinical outcomes more accurately.
by Jabir Aman, Bikila Balis, Naol Oda, Dawit Tamiru, Tadesse Gure Eticha, Dawit Firdisa, Aboma Motuma
BackgroundMeconium aspiration syndrome is a life-threatening respiratory disease affecting around 5% of neonates worldwide. Although several studies have been conducted in developed countries, data on meconium aspiration syndrome and its associated factors remain limited in low-resource settings, including Ethiopia. Therefore, this study aimed to determine the prevalence of meconium aspiration syndrome and its associated factors among neonates admitted to the neonatal intensive care unit at public hospitals in the Harari region, Eastern Ethiopia.
MethodA retrospective hospital-based cross-sectional study was conducted among all neonates admitted from January 1 to December 30, 2023, and data were extracted from patient charts during April 1–30, 2025. A simple random sampling technique was employed to select 417 charts of neonates admitted to the neonatal intensive care unit. Data were collected using a data extraction checklist via Kobo Toolbox. Descriptive statistics and binary logistic regression were performed in SPSS version 25 (IBM Corp., Armonk, NY, USA). Adjusted odds ratios with 95% confidence intervals were used to declare statistical significance at a p-value ≤ 0.05.
ResultsThe prevalence of meconium aspiration syndrome among neonates admitted to the neonatal intensive care unit was 24.2% [95% CI, 20.2–28.6]. Factors significantly associated with meconium aspiration syndrome were post-term gestation [AOR = 9.05, 95% CI 2.38–34.41], antepartum hemorrhage [AOR = 3.34, 95% CI 1.31–8.60], prolonged labor [AOR = 3.06, 95% CI 1.27–7.36], premature rupture of membranes [AOR = 3.65, 95% CI 1.28–10.45], a low Apgar score at the 5th minute [AOR = 11.27, 95% CI 3.44–36.92] and intrapartum thick meconium passage [AOR = 5.98, 95% CI 2.6–13.6].
Conclusions and recommendationsThese findings indicate a high prevalence of meconium aspiration syndrome, and to reduce its impact, targeted clinical interventions should be implemented. Pregnancies reaching 42 weeks of gestation, prolonged labor, and high-risk conditions such as antepartum hemorrhage, premature rupture of membranes, or the presence of thick meconium are important factors to consider. Careful monitoring and appropriate management may be warranted in these cases.
To test a theory-informed, person-centred rehabilitation intervention for older adults following a hospital admission complicated by delirium, developed in line with the Medical Research Council framework for complex interventions, to determine whether: (a) the intervention is acceptable to individuals with delirium and (b) a definitive trial and parallel economic evaluation of the intervention are feasible.
Multicentre, single-arm feasibility study.
19 pairs of patients (aged >65 years) and carers were recruited from six National Health Service acute hospitals across the UK.
Home-based rehabilitation programme designed to support recovery after hospital discharge, addressing cognitive, physical, physiological and psychosocial needs. Delivered by a trained team of occupational therapists, physiotherapists and rehabilitation support workers, the intervention included a comprehensive home assessment, collaborative goal setting, up to 10 personalised sessions over 12 weeks and the use of a recovery record to guide progress, education and psychosocial support.
Aspects of feasibility examined included eligibility, recruitment, data collection, attrition, acceptability of the rehabilitation intervention and the potential to calculate cost-effectiveness.
In total, 419 patients were identified as having delirium and 36 met the full eligibility criteria. 19 patient and carer pairs agreed to participate in the study (consent rate 53%; 95% CI 35% to 70%) with 13 participants going on to start the intervention (68%; 95% CI 43% to 87%) and 10 participants completing final follow-up (53%; 95% CI 29% to 76%). Baseline assessments were conducted either during hospitalisation or postdischarge, with initial assessments occurring a mean of 18 days (SD=13.0) postdischarge, and 77% completed within 14 days. Participants completed a mean of eight sessions (SD=2.9). 19 participants completed the primary outcome at baseline, while 10 participants completed it at 6-month follow-up. The economic evaluation indicated a total cost of £1249.29 per participant, covering assessments, intervention sessions and training costs.
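The reported consent rate and interval are consistent with an exact (Clopper-Pearson) confidence interval on 19 of 36 eligible pairs; a quick check with SciPy (the exact method is an assumption about how the authors computed it):

```python
from scipy.stats import binomtest

# Consent: 19 of 36 eligible patient-carer pairs agreed to participate.
result = binomtest(k=19, n=36)
ci = result.proportion_ci(confidence_level=0.95, method="exact")  # Clopper-Pearson

print(f"{19/36:.0%} ({ci.low:.0%} to {ci.high:.0%})")  # 53% (35% to 70%)
```

The exact interval reproduces the quoted 53% (35% to 70%) after rounding.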
The intervention showed feasibility among older adults recovering from delirium, as evidenced by the trial processes for participants who entered the study. However, recruitment challenges indicate a need for better strategies and further research through a definitive randomised controlled trial to demonstrate the effectiveness and cost-effectiveness of the intervention.
People living with HIV (PLHIV) frequently face psychological challenges, including stigma, stress and social isolation, which can negatively affect adherence to antiretroviral therapy (ART). Even in high-income countries where treatment is accessible, poor adherence can lead to drug resistance, reduced immune function and early morbidity. This systematic review aims to synthesise evidence on the relationship between psychological and mental health factors and ART adherence among PLHIV in high-income settings.
We will include studies published in any language between January 2015 and the date of the last searches. Reports of studies published in languages other than English that appear eligible for inclusion after the first level of screening will be translated using Google Translate.
Studies will be included if they continue to meet the inclusion criteria and the quality of the translation is sufficient to extract the relevant data. The population of interest is PLHIV aged ≥15 years receiving ART in high-income countries. Included studies must assess psychological or mental health variables and ART adherence. Peer-reviewed journal articles will be the primary source of evidence. Grey literature identified from reference lists of key articles or using Google Advanced search techniques will also be included. Searches for published studies will be conducted in OVID Medline, PsycINFO and Embase. Cochrane CENTRAL will be used to identify clinical trials in ClinicalTrials.gov and the International Clinical Trials Registry Platform.
Two independent reviewers will assess study quality and risk of bias using the Newcastle-Ottawa Scale, National Institutes of Health (NIH) Quality Assessment Tool and Jadad Scale. Discrepancies will be resolved by a third reviewer. Synthesis of quantitative data will be primarily descriptive. Predictors that have been examined in three or more studies will be reported in detail while those assessed in fewer studies will be presented concisely.
This study will be a review of the literature and will not involve primary collection of patients’ data. We will include amendments to the protocol in the final review. The final study will be published in a peer-reviewed journal and presented at conferences. The results of this systematic review will inform clinical practice, guide future research and support policy development that minimises mental health barriers to ART adherence.
CRD420251102248.
Data on long-term outcomes after surgical repair of pulmonary valve stenosis are limited. This study evaluated survival, clinical outcomes and quality of life (QoL) after surgery during childhood.
Single-centre, longitudinal cohort study evaluating consecutive patients with pulmonary valve stenosis who underwent surgical repair between 1968 and 1980 and were evaluated every decade since 1990.
Of the original cohort of 89 operated patients, 11 died (12%), including 2 who died within 30 days postsurgery (2%), and 7 (8%) were lost to follow-up. Survival at 50 years of follow-up was 87%, which was not significantly different from the general Dutch population (GDP). Of the remaining 71 survivors, 32 had previously withdrawn from this cohort study, leaving 39 eligible, of whom 34 (87%) participated again (50% male, median age 48 years) with a median follow-up of 45 (range 40–52) years. Event-free survival was 50%, with supraventricular tachycardia (14%) and reintervention (13%) being the most frequent events, although these occurred less frequently in the last 10 years. At last follow-up, biventricular function was preserved in most patients. Reduced right and left ventricular ejection fraction (EF) was found in 33% and 13% of patients, respectively. Exercise capacity and maximum rate of oxygen consumption were mildly impaired in 14% and 32% of patients, respectively. Patients who underwent an infundibulectomy during initial surgery were significantly more likely to undergo reintervention (HR=8.32, p=0.003). Patient-reported QoL scores remained stable over time and consistently exceeded those of the age-matched GDP.
Fifty-year survival after surgery for pulmonary valve stenosis was excellent and comparable to the GDP. Most patients maintained preserved ventricular function, functional capacity and excellent QoL. Routine lifelong follow-up may not be necessary for all patients, but should be considered for those who underwent an infundibulectomy or have residual lesions.
Obesity is a global public health issue, and its effects are particularly pronounced in Kuwait. Advances in pharmaceutical treatment (eg, glucagon-like peptide-1 receptor agonists) offer an effective solution, producing substantial weight loss. However, this level of weight loss also results in dramatic reductions in lean mass, reflecting losses of muscle mass and muscle strength that can predispose people to sarcopenia. This is a particular issue in people with type 2 diabetes in Kuwait, where the prevalence of muscle weakness is extremely high. Solutions to mitigate this loss of muscle mass and strength are needed; a pragmatic resistance exercise intervention and increased dietary protein intake both have potential. This trial aims to determine whether resistance exercise and/or increased protein intake can preserve muscle mass and improve physical function in people with obesity initiating semaglutide/tirzepatide therapy.
This single-centre, 6-month, randomised controlled trial at Dasman Diabetes Institute will enrol 232 adults with obesity, randomised (1:1:1:1) to control, resistance exercise, protein supplementation or combined resistance exercise and protein in conjunction with semaglutide or tirzepatide therapy. Resistance exercise will be home-based and involve three sessions per week, progressing from one to three sets targeting major muscle groups. Protein supplementation will target 1.6 g/kg/day via dietary adjustment and protein products. Assessments at baseline and 6 months will include MRI measured quadriceps cross-sectional area (primary outcome), plus measures of secondary outcomes of MRI measured liver fat content and stiffness and intramuscular fat, body composition (dual energy X-ray absorptiometry), strength, physical function, dietary assessment, physical activity levels, sleep patterns, quality of life, glycaemic control and metabolic biomarkers.
The study has received ethical approval from the Dasman Diabetes Institute Ethical Review Committee (HR-RA-2025-01, 19 February 2025) and is registered at ClinicalTrials.gov (NCT06885736, 26 June 2025). Written informed consent will be obtained from all participants, with no financial compensation provided. Data will be reported in accordance with Consolidated Standards of Reporting Trials (CONSORT) guidelines, ensuring participant anonymity. Findings will be disseminated through peer-reviewed publications and presentations at national and international conferences.
Irritability represents one of the most common causes of referral to child and adolescent mental health services. Conceptually, tonic irritability (i.e., persistent grumpy mood) can be distinguished from phasic irritability (i.e., temper outbursts). The objective of this research project is to develop a fine-grained, ecologically valid and multimodal characterisation of tonic and phasic irritability to better understand the differential role of the two components in developmental psychopathology.
The study has a longitudinal observational and experimental design and involves two sites: (a) the Division of Child and Adolescent Psychiatry at the University Hospital of Lausanne and (b) the Division of Youth Mental Health at the Faculty of Psychology at the University of Basel. 220 help-seeking and healthy youths aged 8–14 years and their families will participate in the study consisting of a baseline assessment (i.e., self-report, interviews, cognitive assessments, autonomic measures, as well as in-situ experiments), an ecological momentary assessment (EMA) phase (over 2 weeks, including experience sampling method, cognitive assessment and passive monitoring) and a 1-year follow-up. Statistical analyses will include multilevel regression (e.g., linear mixed modelling).
We obtained ethical approval from the local ethics committees (Cantonal Research Ethics Commission on Human Beings, CER-VD, #2023-01846) and data collection began in January 2025. The results of the present study will be published in peer-reviewed scientific journals and will be presented at key conferences in the field of child and adolescent mental health, as well as at conferences focusing on EMA. Additionally, findings will be disseminated to practitioners, the educational sector and associations working with youths. We further intend to make the findings accessible to the general public through social media, for instance.
Tuberculosis (TB) remains a major global health challenge, with an estimated 10.8 million new cases and 1.25 million deaths in 2023. Despite advances in molecular detection of Mycobacterium tuberculosis (MTB), significant diagnostic gaps remain: in 2023, only 48% of newly diagnosed TB cases received rapid diagnostic testing, far below the 100% target. These challenges are intensified in high-burden settings, where sputum collection and distinguishing TB from other illnesses are difficult. The Xpert MTB Host Response (Xpert-HR) assay, which measures host immune gene expression from blood, shows promise but variable accuracy across studies. Hence, this study will perform an Individual Patient Data Meta-Analysis (IPDMA) to evaluate the diagnostic accuracy, subgroup performance, predictive values and clinical benefit of Xpert-HR compared with conventional sputum-based testing.
This systematic review and IPDMA will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Diagnostic Test Accuracy guidelines. Prospective studies including adolescents (>12 years) or adults with presumed TB tested using the Xpert MTB Host Response assay will be identified through PubMed, Embase and Web of Science. Study quality will be assessed using an adapted diagnostic accuracy tool. Diagnostic accuracy will be pooled using random-effects models, with subgroup analyses where applicable. Decision curve analysis will evaluate clinical utility. Predictive values will be estimated across TB prevalences of 1–10%. Both one-stage and two-stage IPDMA approaches will be explored, and the proportion of unevaluable samples will be reported.
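Estimating predictive values across a range of prevalences, as planned above, follows directly from Bayes' rule given a test's sensitivity and specificity. A minimal sketch (the sensitivity and specificity figures below are illustrative placeholders, not pooled estimates from this review):

```python
def predictive_values(sens, spec, prev):
    """Positive and negative predictive value via Bayes' rule."""
    ppv = (sens * prev) / (sens * prev + (1 - spec) * (1 - prev))
    npv = (spec * (1 - prev)) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Illustrative accuracy values only (not results of the meta-analysis)
sens, spec = 0.85, 0.80
for prev in (0.01, 0.05, 0.10):
    ppv, npv = predictive_values(sens, spec, prev)
    print(f"prevalence {prev:.0%}: PPV={ppv:.3f}, NPV={npv:.3f}")
```

The loop makes the clinically relevant point concrete: at low prevalence the PPV of any blood-based triage test drops sharply even when sensitivity and specificity are held fixed.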
The review will be based on deidentified individual patient data obtained on request from the corresponding authors of studies fulfilling the data sharing agreement. Ethical approval has been obtained from the Ethical Committee of the Medical Faculty of Heidelberg University (Approval No. S-043/2026). The results will be disseminated through publication in a peer-reviewed journal and through presentations at academic conferences.
CRD420251071857.
Chronic central serous chorioretinopathy (CSC) can cause progressive and permanent vision loss. Although photodynamic therapy (PDT) is a primary treatment option globally, it is not approved for CSC worldwide, limiting therapeutic access. The REPLAY trial is a phase III, investigator-initiated trial to evaluate the efficacy and safety of reduced-fluence PDT (rf-PDT) for chronic CSC to seek the first regulatory approval globally.
This study comprises two cohorts. The ‘untreated cohort’ is a multicentre, randomised, placebo-controlled, double-masked trial involving 60 patients with untreated, fovea-involving chronic CSC, randomised 2:1 to receive a single rf-PDT or placebo treatment. The ‘previously treated cohort’ is a single-arm, open-label trial for up to 10 patients with recurrent CSC after PDT. The primary endpoint for both cohorts is the proportion of eyes with complete resolution of subfoveal fluid at 12 weeks post-treatment, assessed by optical coherence tomography. Secondary endpoints include changes in best-corrected visual acuity, central choroidal thickness, recurrence rates and incidence of adverse events over a 48-week follow-up.
The study protocol was approved by the Kyoto University Hospital Institutional Review Board, IRB of Chiba University Hospital, Tokyo Women’s Medical University Institutional Review Board and Institutional Review Board of Kansai Medical University Hospital. Written informed consent is obtained from all participants. The results will be disseminated through publication in a peer-reviewed journal and presentations at scientific conferences.
jRCT2051230156 (URL: https://jrct.mhlw.go.jp/latest-detail/jRCT2051230156).
To examine HIV care attrition patterns and risk factors among adolescent girls and young women (AGYW) enrolled in prevention of mother-to-child transmission of HIV (PMTCT) services in Tanzania.
Prospective cohort study.
The study was conducted in three regions of Tanzania (Kagera, Tabora and Dar es Salaam) across 543 public and private health facilities.
A total of 10 147 pregnant and postpartum AGYW living with HIV attending PMTCT services between 1 January 2018 and 31 December 2020 were included in this study and followed prospectively until they were censored at the last appointment date or 31 December 2023, whichever was earlier.
The primary outcome was time to HIV care attrition, defined as death, discontinuation of antiretroviral treatment (ART) or loss to follow-up (LTFU). LTFU was defined as failure to attend a scheduled clinic appointment and being absent from care for ≥90 consecutive days following a missed appointment among non-transfers. Kaplan-Meier analyses were used to estimate time to first attrition. An Andersen-Gill proportional hazards model estimated risk factors for repeated care interruptions, adjusted for baseline characteristics and stratified by ART status at PMTCT enrolment.
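The time-to-first-attrition analysis above uses the Kaplan-Meier product-limit estimator, which can be sketched in a few lines of pure Python. This is a minimal illustration with hypothetical follow-up data; a real analysis would use a survival package, and the Andersen-Gill recurrent-event model additionally requires partial-likelihood fitting not shown here.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates.

    times:  follow-up time per subject (e.g., months)
    events: 1 = attrition observed, 0 = censored at that time
    Returns a list of (event_time, survival probability) pairs.
    """
    surv, curve = 1.0, []
    # Step down the curve only at times where an event occurred
    for t in sorted({t for t, e in zip(times, events) if e}):
        at_risk = sum(1 for tt in times if tt >= t)                 # still in care at t
        d = sum(1 for tt, e in zip(times, events) if tt == t and e)  # events at t
        surv *= 1 - d / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical months to first attrition; 1 = attrition, 0 = censored
print(kaplan_meier([1, 2, 3, 4, 6], [1, 1, 0, 1, 0]))
```

Censored subjects contribute to the risk set up to their censoring time but never trigger a step, which is what distinguishes this estimator from a naive event proportion.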
A total of 3259 attrition events were observed, of which 79% occurred within the first year, with a median time to first attrition of 4 months (IQR 1–8); 96.3% of events were due to LTFU. Over two-thirds of first-year attrition occurred among AGYW newly started on ART at PMTCT enrolment, who had more than twice the attrition rate of those already on ART (28.6 vs 11.2 per 100 person-years). Of AGYW lost to follow-up, 44.8% returned to care and 20.9% experienced subsequent attrition. Among AGYW newly started on ART, attrition was higher in those enrolled late in their third trimester (adjusted HR (aHR) 1.20; 95% CI 1.01 to 1.42) versus those enrolled in the first trimester, and lower during the postpartum period (aHR 0.58; 95% CI 0.43 to 0.79). In AGYW already on ART, the attrition rate was higher among adolescents aged 18–19 years (aHR 1.37; 95% CI 1.13 to 1.66) and those enrolled late: during the second trimester (aHR 1.41; 95% CI 1.16 to 1.72), third trimester (aHR 1.57; 95% CI 1.23 to 2.00) or post partum (aHR 1.36; 95% CI 1.09 to 1.70) compared with the first trimester. AGYW with early-stage HIV, those on second-line regimens and those attending facilities with fewer AGYW had lower attrition rates than their comparison groups.
AGYW newly started on ART at PMTCT enrolment are more likely to have early and recurring dropout. Given the cyclical nature of HIV care engagement, tailored and repeated interventions are needed to support continuous retention and re-engagement for pregnant and postpartum AGYW with HIV.
by Ana Caroline Bini de Lima, Vanessa Cristini Sebastião da Fé, Maria Simara Palermo Hernandes, Emily Caroline Pfeifer de Cristo, Ana Gabrieli dos Santos Fagundes Euzébio, Maria Vitória e Silva Sousa, Fabiana Ribeiro Caldara, Viviane Maria Oliveira dos Santos
This study aimed to evaluate the ability of social noncontact environmental enrichment to facilitate social buffering and to characterize the emotional experience of horses subjected to restraint in stocks by assessing physiological parameters and facial expressions. Pantaneiro horses (n = 11) were evaluated in a crossover design with two treatments: social noncontact enrichment during stock restraint and social isolation during stock restraint. Physiological parameters (heart rate, heart rate variability, respiratory rate, ocular temperature by infrared thermography, and auricular temperature by infrared thermometer) and facial expressions (EquiFACS) were assessed throughout the 24-minute restraint period. When horses were accompanied by a conspecific, heart rate, respiratory rate, and eye temperature were significantly lower, as was the frequency of facial expressions including nostril dilator (AD38), inner brow raiser (AU101), upper eyelid raiser (AU5), eye white increase (AD1), ears forward (EAD101), and ears back (EAD104).
by Yang Tong, Huang Qianzhen, Tan Bo, Hu Bin, Zhang Min
Background: Advancing the development of centers for disease control and prevention (CDCs) has become a priority within global public health governance. However, public health governance capacity varies significantly among CDCs across countries and regions, and grassroots CDCs face particular disadvantages. Establishing stable, efficient collaborative development mechanisms among CDCs across diverse regions, so as to maximize overall effectiveness and ensure sustainable development, represents a critical public health science issue.
Objective: This study aims to provide scientific references and a theoretical foundation for the coordinated development of grassroots CDCs within the Chengdu–Chongqing Economic Circle (CCEC) and the construction of public health systems.
Methods: A questionnaire of collaborative development needs indicators for grassroots CDCs, comprising 4 primary needs and 13 secondary needs, was developed through literature review, the Delphi expert consultation method, and the Kano model. Analysis focused on questionnaires collected from eight grassroots CDCs within the CCEC. The importance of needs was ranked using the better–worse coefficient and satisfaction sensitivity analysis.
Results: Analysis of the 110 valid questionnaires showed that, for the must-be attribute, satisfaction sensitivity ranked as follows: performance compensation (0.883) > talent exchange and scientific research and innovation cooperation (0.824) > public health emergency rescue mechanism (emergency material reserve and cross-regional material mobilization; 0.817) > cross-regional case monitoring, investigation, and tracking (0.775). For the one-dimensional attribute, the ranking was joint risk assessment and emergency command (0.937) > business archive co-construction and sharing mechanism (emergency response plans and technical schemes) (0.909) > regional co-construction and sharing between universities and localities (0.832). For the attractive attribute, the ranking was regional monitoring and early-warning information management system (0.922) > community chronic disease prevention and service (0.804) > coordinated transfer and diversion of diagnosis and treatment of patients with infectious diseases within the region (0.734). However, the collaborative release and interaction mechanism for social integrated media information, public health collaborative governance entities, and the construction of a cross-regional expert database were classified as indifferent attributes.
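The better–worse coefficients used to rank needs above are conventionally computed from Kano category counts (Berger et al.). A minimal sketch follows; the response counts are hypothetical, and the study's composite "satisfaction sensitivity" score may combine these coefficients in a way not detailed in the abstract.

```python
def better_worse(counts):
    """Kano better-worse coefficients from category counts.

    counts: dict with keys A (attractive), O (one-dimensional),
            M (must-be), I (indifferent); reverse/questionable
            responses are assumed already excluded.
    """
    total = counts["A"] + counts["O"] + counts["M"] + counts["I"]
    better = (counts["A"] + counts["O"]) / total    # satisfaction gain if the need is met
    worse = -(counts["O"] + counts["M"]) / total    # dissatisfaction if the need is unmet
    return better, worse

# Hypothetical response counts for one need indicator (not study data)
b, w = better_worse({"A": 20, "O": 45, "M": 30, "I": 15})
print(f"better = {b:.3f}, worse = {w:.3f}")
```

A need with a strongly negative worse coefficient behaves as must-be (its absence drives dissatisfaction), while a high better coefficient with a near-zero worse coefficient marks an attractive need.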
Conclusions: This study provides preliminary scientific evidence for the precise allocation of public health resources and the establishment of localized collaborative development mechanisms. The research methodology and analytical framework also offer new theoretical references for similar studies in other regions globally.
by Changze Ou, Binbin Chen, Jun Deng, Huajun Long
Background: Histone deacetylases (HDACs) regulate neuroprotection; however, the molecular mechanisms and core targets of the HDAC inhibitor Trichostatin A (TSA) in Alzheimer’s disease (AD) remain unclear, limiting clinical translation. This study aimed to decipher TSA’s AD-regulating network, screen core genes, and support early AD diagnosis and multi-target therapies.
Methods: TSA targets were computationally predicted. Five GEO AD datasets were analyzed for differential genes and core modules, and 130 machine learning algorithms were employed to identify core genes. Functional annotation, immune cell analysis, and single-cell expression profiling were conducted. Molecular docking and 100 ns molecular dynamics simulations evaluated TSA-protein interactions.
Results: A total of 949 potential TSA targets were identified, overlapping with AD differential genes and enriched in key pathways such as GABAergic synapse and tau phosphorylation. Eight machine learning-identified core genes (EFNA1, GABRB2, GABARAPL1, EGR1, CDK5, KCNC2, MET, GRIA2) exhibited a distinct AD expression pattern: synergistic downregulation of protective genes and unique upregulation of pathological EFNA1. These genes are implicated in neurotransmission, synaptic plasticity, tau clearance, and immune-neural crosstalk. Molecular dynamics simulations suggested TSA may not stably bind these candidates, implying its regulation relies on epigenetic mechanisms via HDAC1–3/6 inhibition, potentially restoring gene network balance and disrupting neuroinflammation-neurodegeneration cycles. Complex regulatory modes and cell type-specific expression were also observed.
Conclusion: This study provides preliminary insights into TSA’s putative mechanisms in AD intervention, highlighting the eight candidate core genes’ potential diagnostic and therapeutic value as AD biomarkers and supporting TSA’s multi-target therapy. All findings are computationally derived and require experimental verification.