To explore the concurrent trajectories of depressive symptoms and insomnia among adolescents and to analyse the individual, familial and social predictors of the concurrent trajectories.
This study tracked depressive symptoms and insomnia in eight secondary schools annually from 2021 to 2023. We also collected data on individual, familial and social factors that may influence these conditions. Group-based multi-trajectory (GBMT) modelling was used to categorise adolescents into depressive–insomnia severity subgroups.
This study included 2822 adolescents, who were categorised into four groups: a no-symptom group, a mild-symptom group, a symptom-relief group and a symptom-increase group. Compared with the no-symptom group, predictors of the mild-symptom group were gender (OR = 1.30), academic performance (OR = 1.57), subjective well-being (OR = 0.78), anxiety (OR = 1.14), economic status (OR = 1.23) and relationship with teachers (OR = 1.46). Predictors of the symptom-relief group were personality (OR = 1.75), academic performance (OR = 2.28), subjective well-being (OR = 0.69) and anxiety (OR = 1.25). Predictors of the symptom-increase group were personality (OR = 2.45), academic performance (OR = 1.96), subjective well-being (OR = 0.69), anxiety (OR = 1.20), maternal education level (OR = 1.58), family function (OR = 0.93), parental relationship (OR = 2.07) and relationship with teachers (OR = 1.54).
This study provided a comprehensive understanding of the concurrent trajectories of depressive symptoms and insomnia among adolescents, revealing distinct subgroups and identifying predictors across individual, familial and social levels.
This study emphasises the importance of a multi-faceted approach involving family, school and society to promote adolescent mental health and also highlights the need for conducting precise interventions according to adolescents' features.
The identification of four distinct symptom trajectories and their predictors advances the understanding of adolescent mental health development, informing precision prevention strategies.
Reporting method: STROBE checklist.
Patient or public contribution: none.
Acute kidney injury (AKI) is a significant challenge in hospital settings, and accurately differentiating between intrinsic and prerenal AKI is crucial for effective management. The fractional excretion of urea (FEUN) has been proposed as a potential biomarker for this purpose, offering an alternative to traditional markers such as fractional excretion of sodium. This study aimed to assess the diagnostic accuracy of FEUN for differentiating intrinsic from prerenal AKI in hospitalised patients.
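The review does not restate the formula, but the fractional excretion of urea is conventionally computed from paired urine and plasma measurements of urea and creatinine. A minimal sketch under that standard definition follows; the input values and the ~35% prerenal cut-off are illustrative conventions, not figures from this study:

```python
def fractional_excretion_of_urea(urine_urea, plasma_urea, urine_cr, plasma_cr):
    """Fractional excretion of urea (%), standard clinical formula:

        FEUN = (U_urea * P_Cr) / (P_urea * U_Cr) * 100

    Urea values must share one unit and creatinine values another.
    """
    return (urine_urea * plasma_cr) / (plasma_urea * urine_cr) * 100.0

# Hypothetical values for illustration only
feun = fractional_excretion_of_urea(urine_urea=300, plasma_urea=40,
                                    urine_cr=80, plasma_cr=2.0)
# A FEUN below roughly 35% is commonly read as suggesting prerenal AKI
print(f"FEUN = {feun:.1f}%")
```

Because urea handling is less affected by diuretics than sodium handling, FEUN is often preferred over fractional excretion of sodium in diuretic-treated patients, which is why the review prespecified a diuretic-only subgroup.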
We conducted a systematic review and bivariate random effects meta-analysis of diagnostic accuracy studies. The study followed the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach.
PubMed, Embase and Cochrane databases were searched from inception to 1 November 2023.
We included observational studies that focused on patients with AKI and reported FEUN data sufficient to reconstruct a complete 2×2 contingency table (true positives, true negatives, false positives and false negatives) for evaluating its diagnostic accuracy.
Two reviewers extracted data, assessed risk of bias with Quality Assessment of Diagnostic Accuracy Studies-2 and graded certainty of evidence using the GRADE approach. Pooled sensitivity, specificity, positive and negative likelihood ratios, and the area under the summary receiver operating characteristic curve (SROC) were calculated; heterogeneity was measured with I². A prespecified subgroup restricted to patients receiving diuretics served as a sensitivity analysis.
Twelve studies involving 1240 patients were included, with an overall occurrence rate of intrinsic AKI of 38.8%. FEUN had a pooled sensitivity of 0.74 (95% CI 0.60 to 0.84) and specificity of 0.78 (95% CI 0.66 to 0.87), with a positive predictive value of 0.76 (95% CI 0.68 to 0.83) and a negative predictive value of 0.74 (95% CI 0.66 to 0.81). The SROC curve showed a pooled diagnostic accuracy of 0.83. Heterogeneity was substantial (I² > 90%) for both sensitivity and specificity. In a diuretic-only subgroup (six studies), specificity rose to 0.87 and heterogeneity declined (I² = 56%). Overall certainty of evidence was low owing to inconsistency.
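At the single-study level, the quantities pooled above follow directly from each reconstructed 2×2 table. A short sketch of the standard definitions, using a made-up table (not data from any included study) whose margins happen to reproduce sensitivity 0.74 and specificity 0.78:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy metrics from a 2x2 contingency table."""
    sens = tp / (tp + fn)            # sensitivity: detected intrinsic AKI
    spec = tn / (tn + fp)            # specificity: correctly ruled out
    ppv = tp / (tp + fp)             # positive predictive value
    npv = tn / (tn + fn)             # negative predictive value
    lr_pos = sens / (1 - spec)       # positive likelihood ratio
    lr_neg = (1 - sens) / spec       # negative likelihood ratio
    return {"sens": sens, "spec": spec, "ppv": ppv, "npv": npv,
            "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical single-study table: 50 intrinsic, 50 prerenal AKI patients
m = diagnostic_metrics(tp=37, fp=11, fn=13, tn=39)
print(m)
```

The bivariate random-effects model then pools the study-level sensitivity/specificity pairs jointly, which is why the review reports a summary ROC curve rather than a single threshold-specific estimate.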
FEUN is a biomarker with moderate diagnostic accuracy for differentiating between intrinsic and prerenal AKI in hospitalised patients. Its application could enhance AKI management; however, the high heterogeneity observed in our study highlights the need for further research to evaluate its utility across diverse patient populations and clinical settings.
CRD42024496083.
Transcranial magnetic stimulation (TMS) and upper extremity manipulation training have demonstrated clinical effectiveness in stroke rehabilitation. Post-stroke, the affected cerebral cortex often shows reduced excitability, which can limit the outcomes of conventional manual training. To address this, we developed a new upper limb training method integrating TMS with active sensory training (AST) to enhance fine motor ability in the upper limbs following stroke, potentially improving overall rehabilitation efficacy. However, the clinical effectiveness of this approach remains unclear. This trial is therefore designed to evaluate the efficacy of the new rehabilitation strategy, combining TMS with AST, in patients with upper limb motor dysfunction after stroke.
This single-centre, single-blind, sham-controlled randomised clinical trial investigates the efficacy of AST combined with TMS in patients with upper limb motor dysfunction post-stroke (1–24 months post-onset) at Brunnstrom stages III–V. Upper limb motor function will be evaluated before and 2 weeks after the intervention. The primary outcome is the Action Research Arm Test result, and secondary indicators include the Fugl–Meyer Assessment Upper Extremity Scale, Modified Barthel Index, Semmes–Weinstein Monofilament test, Erasmus MC revised Nottingham Sensory Assessment Scale, Embodied Sense of Self Scale (stroke version), functional near-infrared spectroscopy and neuroelectrophysiology. Between-group differences will be analysed using independent t-tests, and within-group differences will be examined with paired t-tests.
This study was approved by the Ethics Committee of the Second Rehabilitation Hospital of Shanghai (approval number: 2024-34-01). Written informed consent will be obtained from all participants. Study results will be disseminated through peer-reviewed journals and presentations at local and international conferences.
ChiCTR2500097067.
This study assessed the global burden of glaucoma using data from the Global Burden of Disease (GBD) 2021 study. The analysis of epidemiological trends aimed to inform future public health prevention strategies.
Retrospective cross-sectional study.
None.
Analysis of 1990–2021 GBD data on glaucoma prevalence, disability-adjusted life years (DALYs), age-standardised prevalence rates (ASPR), and age-standardised DALY rates (ASDR). Estimated annual percentage changes (EAPC) were calculated, Joinpoint regression identified trend changes, and Autoregressive Integrated Moving Average (ARIMA) modelling projected the burden for the year 2050.
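EAPC values of the kind used in this analysis are conventionally obtained by regressing the natural log of the age-standardised rate on calendar year and transforming the slope. A minimal sketch under that standard approach, using a fabricated rate series (not GBD data):

```python
import math

def eapc(years, rates):
    """Estimated annual percentage change via OLS of ln(rate) on year.

    Fits ln(rate) = alpha + beta * year and returns
    EAPC = 100 * (exp(beta) - 1).
    """
    n = len(years)
    ln_r = [math.log(r) for r in rates]
    my, mr = sum(years) / n, sum(ln_r) / n
    beta = (sum((y - my) * (r - mr) for y, r in zip(years, ln_r))
            / sum((y - my) ** 2 for y in years))
    return 100.0 * (math.exp(beta) - 1.0)

# Hypothetical ASPR series declining by exactly 1% per year
years = list(range(1990, 1995))
rates = [100.0 * 0.99 ** (y - 1990) for y in years]
print(round(eapc(years, rates), 2))  # recovers -1.0
```

Joinpoint regression extends this idea by allowing the log-linear slope to change at estimated breakpoints, which is how the analysis identifies trend changes within 1990–2021.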
Globally, the number of prevalent glaucoma cases increased from 4 072 106.59 (95% uncertainty interval (UI) 3 489 888.7 to 4 752 867.3) in 1990 to 7 587 672.9 (95% UI 6 522 906 to 8 917 725.4) in 2021. Concurrently, DALYs increased from 467 600.4 (95% UI 323 490.5 to 648 641.6) in 1990 to 759 900.2 (95% UI 530 942.9 to 1 049 127.2) in 2021. In contrast, the ASPR and ASDR declined to 90.1 per 100 000 population (95% UI 77.8 to 105.5) and 9.1 per 100 000 population (95% UI 6.3 to 12.5) in 2021, respectively. During the COVID-19 pandemic period (2019–2021), the slowest growth rates in crude case numbers and overall disease burden were observed, accompanied by the most pronounced decline in annual percentage change of ASPR. The highest estimates for both case counts and DALYs were identified in the 70–74 age group, with males demonstrating higher prevalence rates than females. Furthermore, regions with lower Sociodemographic Index (SDI) values bore a disproportionately higher burden of glaucoma.
These findings underscore the need to strengthen early screening and treatment of glaucoma, particularly in ageing populations, male groups and low SDI regions. We urge cautious interpretation of COVID-19 related data and vigilance against potential post-pandemic surges in burden. Critical strategies include enhanced screening and intervention for high-risk groups, targeted prevention measures and integration of ophthalmic care into public health emergency frameworks to alleviate the disease burden.
by Yuxuan Gao, Shiyao Jiang, Yu Cui, Yumeng Wang, Lili Yu
With the extensive clinical application of immune checkpoint inhibitors (ICIs), immune-related adverse events (irAEs) associated with these agents have increasingly garnered significant attention. Unlike other irAEs, endocrine irAEs are mostly irreversible, with variable and nonspecific symptoms, which poses challenges for clinicians in diagnosis. As a result, this study leveraged the U.S. Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report (JADER) pharmacovigilance databases to conduct an in-depth investigation into adverse events induced by PD-1/PD-L1 inhibitors, with a focus on irAEs induced by PD-1/PD-L1 inhibitors. This study pioneers the systematic cross-database validation of endocrine irAEs induced by PD-1/PD-L1 inhibitors. The integration of data from the JADER offers unique safety insights for Asian populations, bolsters global pharmacovigilance efforts, and uncovers regional variations in irAEs reporting. Notably, this study revealed a higher prevalence of endocrine irAEs among men aged over 50 years receiving PD-1/PD-L1 inhibitors. Both PD-1 and PD-L1 inhibitors are strongly associated with thyroid dysfunction, adrenal insufficiency, and pituitary inflammation. Additionally, it identifies several previously undocumented endocrine irAEs. This result unearthed safety signals hitherto unreported in drug inserts, underscoring the imperative for updating the safety labeling of PD-1/PD-L1 inhibitors with respect to endocrine irAEs. The emergence of off-label uses further underscores the need for additional clinical trials to assess their efficacy and safety.
by Yonggang Chen, Jintai Luo, Yingying Zheng, Xiaomei Jiang, Zixiang Yang, Xiaobing Liu
Background: Diabetic kidney disease (DKD) poses a significant health burden with inadequate diagnostic sensitivity. This study develops non-invasive biomarkers by integrating urinary and renal single-cell sequencing with machine learning.
Methods: This study analyzed DKD single-cell and bulk transcriptomic data from public repositories. We established a computational pipeline to distinguish kidney-originating cells in urinary sediments, enabling the identification of injury-associated gene signatures. These signatures were refined using machine learning to develop a diagnostic model, which was validated in independent cohorts. The biomarkers were further verified in DKD renal tissues at single-cell resolution and across multiple nephropathies. Functional and spatial analyses confirmed biological relevance using transcriptomic and histological validation.
Results: Single-cell analysis of 2,089 urine-derived cells identified eight renal cell types, including injured proximal tubule cells (Inj-PTC) showing upregulated injury markers (HAVCR1, VCAM1) and enriched apoptotic/TGF-β pathways. A machine learning-selected biomarker panel (PDK4, RHCG, FBP1) demonstrated strong diagnostic value (area under the curve, AUC > 0.9), with consistent downregulation across multiple chronic kidney diseases. PDK4 and FBP1 were significantly suppressed in DKD renal Inj-PTC.
Conclusions:
This study identifies a three-gene panel (PDK4, RHCG, FBP1) as a promising non-invasive diagnostic tool for DKD. Beyond its strong diagnostic performance, the panel represents a tubular injury-associated gene signature that is detectable in urinary cells and strongly associated with DKD in transcriptomic datasets, presenting a viable candidate for a non-invasive diagnostic assay.
Technology-assisted interventions offer a promising alternative to conventional cardiac rehabilitation. However, there is limited evidence on their effectiveness, particularly in non-Western settings with emphasis on exercise self-efficacy.
To evaluate the effects of a 12-week, technology-assisted hybrid cardiac rehabilitation (TecHCR) program on physical, physiological, and psychological outcomes of patients with coronary heart disease.
In a two-arm parallel randomized controlled trial, 160 participants were randomly assigned to either TecHCR or usual care. TecHCR was underpinned by the Health Belief Model and consisted of three supervised exercise training and occupational therapy sessions, a fitness watch for exercise self-monitoring, six audio-visual educational videos, and a weekly video call follow-up. Data were collected at baseline, immediately post-intervention, and at 24 weeks post-intervention.
Participants in TecHCR demonstrated significantly greater improvements than the control group in exercise self-efficacy (β = 5.909, 95% CI [3.146, 8.672]; p < 0.001), health-promoting behaviors (β = 9.058, 95% CI [5.524, 12.591]; p < 0.001), and perceived anxiety (β = −1.255, 95% CI [−1.893, −0.616]; p < 0.001) immediately post-intervention, with effects sustained at 24 weeks post-intervention (β = 8.506, 95% CI [4.951, 12.061], p < 0.001; β = 14.563, 95% CI [8.809, 20.317], p < 0.001; and β = −1.145, 95% CI [−1.975, −0.315], p = 0.007, respectively). No statistically significant improvements were observed in perceived depression or cardiovascular risk factors.
The TecHCR program, combining supervised sessions with technology-assisted components, is an effective approach for significantly improving exercise self-efficacy, health-promoting behaviors, and anxiety in patients with coronary heart disease. Healthcare institutions should consider implementing hybrid programs to overcome barriers to traditional cardiac rehabilitation, leveraging technology to extend support and maintain patient engagement beyond supervised sessions.
clinicaltrials.gov identifier: NCT04862351
by Ian C. Murphy, Kelly Bryan, Muriel Burk, Rong Jiang, Francesca Cunningham, Sarah Providence, Elizabeth Rightnour, Sarah Zavala, Kathleen Morneau, Trisha Exline, Stacey Rice, Travis Schmitt, Kelly Drumright, Jennifer Lee, BreAnna Davids, Tram Guilbeault, Brooke Klenosky, Ann-Marie Sutherland, Abbie Rosen, Lauren Ratliff, Kenneth Bukowski, Margaret A. Pisani, Andrew Franck, Mark Wong, Preston Witcher, Kathleen M. Akgün
Objectives: Early data suggested higher sedative requirements for ventilated COVID+ patients, deviating from established guidelines. We assessed the relationship between sedative use and outcomes in mechanically ventilated Veterans during the COVID-19 pandemic.
Design: Retrospective medication use evaluation.
Setting: National sample of 13 distinct VA medical center intensive care units.
Patients: Critically ill Veterans requiring mechanical ventilation for ≥2 days.
Interventions: None.
Measurements and main results: The proportion of patients receiving fentanyl, midazolam and propofol was higher during COVID years. Compared with pre-COVID, median fentanyl dose was higher during Years 1 and 2 (1575 mcg [IQR 1000–1650] vs. 1900 [1250–3000] vs. 1910 [1150–3500]). Adjuvant antipsychotic use was relatively low but tended to increase over time (pre = 10.5% vs. Year 1 = 12.3% vs. Year 2 = 14.1%). Most patients started on antipsychotics in the ICU were continued on the drug after extubation. Mortality was higher during COVID years (pre = 26.9% vs. Year 1 = 36.8% vs. Year 2 = 35.9%). In stratified analyses by COVID status for Years 1–2 (n = 79, 27%), a higher proportion of COVID+ patients received fentanyl (96% vs. 84%) and propofol (90% vs. 77%), and at higher doses (fentanyl median cumulative dose = 1650 mcg vs. 2688 mcg; propofol maximum infusion rate = 30 mcg/kg/min [20–50] vs. 40 [25–50]). Sedative doses were similar to pre-COVID among non-COVID patients. Antipsychotics were more frequently continued post extubation among COVID+ patients (34.6% vs. 14.9% among non-COVID patients). COVID+ patients were also less likely to have awakening and breathing trials at 48 hours after intubation (18% vs. 46%).
Conclusions: Sedative use and dosing increased during the first two years of COVID compared with pre-COVID, especially for COVID+ patients. The sustained elevation of fentanyl use in Year 2 suggests possible ‘therapeutic creep’ away from guideline-concordant practices for COVID+ patients. Antipsychotic prescription during intubation and following extubation was also more common among COVID+ patients. These findings could inform development and implementation of safer sedation practices across VA ICUs during respiratory pandemics.
Urinary incontinence, often perceived as embarrassing, perpetuates the stigma that delays treatment and encourages concealment. This stigma significantly diminishes quality of life and imposes both financial and medical burdens. Although prior research has examined stigma reduction in urinary incontinence, it persists as a widespread issue. Most studies have focused on interviews, primarily addressing urine leakage, with a limited understanding of the factors influencing urinary incontinence stigma and their interrelations. More in-depth quantitative studies are crucial to inform targeted interventions.
(1) To develop targeted interventions aimed at alleviating urinary incontinence-related stigma in older adults. (2) To identify factors that mitigate stigma in older adults with urinary incontinence. (3) To examine the associations between these factors and stigma.
Cross-sectional survey.
A cross-sectional survey was conducted with 510 older adults across three hospitals in Guangdong from July 2022 to January 2024, utilising the SSCI-24 and Incontinence Severity Index. Three multivariate linear regression models, adjusted for covariates based on directed acyclic graphs, were employed to explore the relationships between variables and stigma. Additionally, subgroup analyses were performed.
Participants reported higher levels of self-stigma compared to perceived stigma. Multivariate analysis revealed significant associations between urinary incontinence type, severity, frequency of micturitions and stigma. Key factors contributing to stigma reduction include managing incontinence severity, reducing frequency of micturitions and preventing the progression to mixed incontinence.
The study identified associations between urinary incontinence characteristics—type, severity and frequency of micturitions—and stigma. Strategies for stigma reduction are proposed, underscoring the vital role of nurses in this process.
The findings of this study contribute to a deeper understanding of stigma surrounding urinary incontinence in older adults and provide insights for developing more effective interventions by healthcare professionals and community caregivers.
This study adhered to the STROBE checklist for observational studies.
No patient or public contribution.
This study aims to determine the mediating effect of emotional exhaustion on the relationship between lateral violence in nursing and turnover intentions.
A cross-sectional survey.
This study enrolled 314 nurses from two university-affiliated tertiary hospitals in Changsha, Hunan Province, China. Variables were measured with a series of self-administered questionnaires, and the data were analysed using SPSS 25.
Lateral violence and emotional exhaustion were positively correlated with turnover intention. Emotional exhaustion partially mediated the relationship between lateral violence and turnover intention.
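The partial-mediation finding corresponds to the standard product-of-coefficients decomposition: the effect of lateral violence on turnover intention splits into a direct path and an indirect path through emotional exhaustion. A minimal sketch on simulated data, assuming simple OLS-based mediation (the variable names and coefficients are illustrative, not the authors' data or exact analysis):

```python
import random
random.seed(42)

# Simulated data: lateral violence (x) -> emotional exhaustion (m) -> turnover (y)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]
y = [0.3 * xi + 0.4 * mi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def slope(u, v):
    """OLS slope of v regressed on u (with intercept)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return (sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
            / sum((ui - mu) ** 2 for ui in u))

def resid(u, v):
    """Residuals of v after regressing v on u (with intercept)."""
    s = slope(u, v)
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return [vi - (mv + s * (ui - mu)) for ui, vi in zip(u, v)]

a = slope(x, m)                            # path a: violence -> exhaustion
b = slope(resid(x, m), resid(x, y))        # path b: exhaustion -> turnover, controlling x
c_prime = slope(resid(m, x), resid(m, y))  # direct effect of violence, controlling m
indirect = a * b                           # mediated (indirect) effect
total = slope(x, y)                        # total effect = c_prime + a * b (OLS identity)

print(f"indirect = {indirect:.3f}, direct = {c_prime:.3f}, total = {total:.3f}")
```

The `resid`-based coefficients use the Frisch–Waugh partialling-out identity to recover the multiple-regression paths without a matrix solver; in practice inference on the indirect effect would add bootstrap confidence intervals.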
Emotional exhaustion serves as a partial mediator in the relationship between lateral violence and turnover intention. Reducing lateral violence and avoiding emotional exhaustion can help to reduce the turnover intention of nurses.
When developing targeted programs or policies aimed at decreasing nurses' turnover intention, it is important to consider the issue of lateral violence among nurses, as well as their negative emotions.
The study provides us with a more fine-grained understanding of the relationship between lateral violence among nurses and turnover intention. Insights to enhance nurse retention are also provided, which can support the development of future relevant policies and guidelines.
The study adhered to the STROBE guidelines.
No patient or public contribution.
Smoking cessation is a pressing public health concern. Behavioral therapy has been widely promoted as a means to aid smoking cessation. Acceptance and commitment therapy (ACT), based on the principles of cognitive behavioral therapy, can help participants accept, rather than suppress, the physical and emotional experiences and thoughts associated with not smoking, identify experiential avoidance behaviors, strengthen the determination to quit, and ultimately commit to adaptive behavioral changes guided by smoking-cessation-related values, thereby achieving the goal of quitting smoking.
To assess the effects of ACT compared with other smoking cessation interventions by examining three key outcomes: cessation rates, smoking behaviors, and psychological outcomes.
We searched 8 databases and 2 registration platforms, covering the period from inception to March 26, 2025. We included only randomized controlled trials that recruited adult smokers and implemented ACT for smoking cessation, with the comparison group receiving either active treatment, no treatment, or any other intervention.
A total of 23 studies involving 8951 participants were included. Compared with all types of control interventions, ACT significantly increased smoking cessation rates both immediately postintervention (RR = 1.48, 95% CI [1.03, 2.14], p = 0.04, I² = 81%) and at short-term follow-up (RR = 1.63, 95% CI [1.31, 2.01], p < 0.01, I² = 0%). Subgroup analyses showed that ACT significantly improved short-term cessation rates compared with behavioral support (RR = 1.60, 95% CI [1.27, 2.02], p < 0.01, I² = 0%), while, compared with no-treatment controls, ACT significantly increased cessation rates at three time points (postintervention: RR = 3.11, 95% CI [2.13, 4.54], p < 0.01, I² = 0%; medium-term follow-up: RR = 2.55, 95% CI [1.32, 4.93], p < 0.01; long-term follow-up: RR = 3.33, 95% CI [1.66, 6.68], p < 0.01). Narrative synthesis suggested that, compared with behavioral therapy, ACT may confer benefits for psychological outcomes and that, compared with no-treatment controls, it may also reduce daily cigarette consumption and nicotine dependence and enhance psychological outcomes.
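The I² values quoted above are conventionally derived from Cochran's Q under an inverse-variance pooling of log risk ratios. A minimal sketch of that computation on fabricated study effects (not the review's data):

```python
import math

def pool_and_i2(effects, variances):
    """Fixed-effect inverse-variance pooled estimate, Cochran's Q and I2.

    I2 = max(0, (Q - df) / Q) * 100 estimates the share of variability
    due to between-study heterogeneity rather than chance.
    """
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, q, i2

# Hypothetical risk ratios and variances of their logs, four studies
log_rr = [math.log(r) for r in (0.8, 1.3, 1.9, 2.6)]
var = [0.04, 0.09, 0.05, 0.12]
pooled, q, i2 = pool_and_i2(log_rr, var)
print(f"pooled RR = {math.exp(pooled):.2f}, Q = {q:.2f}, I2 = {i2:.1f}%")
```

High I² (such as the 81% reported postintervention) signals that the pooled RR summarises genuinely divergent study effects, which motivates the subgroup analyses by comparator type.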
Acceptance and commitment therapy may be a beneficial approach for improving cessation rates, enhancing smoking cessation behaviors, and promoting psychological well-being among adult smokers. However, the quality of the included evidence was limited, thereby weakening the strength of these findings. Future rigorously designed trials with larger sample sizes, particularly those comparing ACT against other smoking cessation interventions, are warranted to further confirm its effects.
by Lei Guo, Jun Ge, Li Cheng, Xinyi Zhang, Zhengzheng Wu, Meili Liu, Hanmei Jiang, Wei Gong, Yi Liu
Background: The incidence of ulcerative colitis (UC) remains high, with an increasing prevalence among elderly patients. Cellular senescence has been widely recognized as a contributor to UC susceptibility; however, the underlying molecular mechanisms remain incompletely understood. This study aimed to identify senescence-associated biomarkers in UC to provide new insight for diagnosis and treatment.
Methods: By integrating transcriptomic data from UC patients with established aging-related databases, we identified aging-associated differentially expressed genes (DEGs). Using weighted gene co-expression network analysis (WGCNA) and Cytoscape, we pinpointed the core genes involved. A diagnostic model for UC was then developed based on these core genes, and their expression patterns were characterized at single-cell resolution. The roles of these genes were ultimately validated through in vitro and animal experiments.
Results: We identified 24 aging-related DEGs in UC, which were primarily implicated in inflammatory responses and cytokine-receptor interactions. Further analyses pinpointed three core genes (CXCL1, MMP9, and STAT1) that were predominantly expressed in macrophages. A diagnostic model constructed using these genes exhibited robust predictive performance. Experimental validation confirmed that the expression levels of all three core genes were significantly upregulated in both a UC mouse model and in macrophages compared to controls. Additionally, pathway analyses revealed elevated levels of CXCL12 and VEGFA in the enriched pathways.
Discussion: Our findings underscore the pivotal roles of CXCL1, MMP9, and STAT1 in UC-associated cellular senescence. The analysis positions these molecules as promising macrophage-mediated diagnostic biomarkers and therapeutic targets. Collectively, this work provides novel insights into UC pathogenesis and lays a foundation for developing precision medicine strategies that target senescence pathways.
This study aimed to identify potential categories of rotation stress among nurses undergoing standardised training and to explore the relevant factors associated with each profile.
Cross-sectional study.
Data were collected in November 2024 from three hospitals in Zunyi City, Guizhou Province, China.
Nurses undergoing standardised training were recruited for this study.
Convenience sampling method was used to recruit standardised training nurses in November 2024 from three hospitals in Zunyi City, Guizhou Province. The survey instruments used included demographic characteristics questionnaire, the Nursing Job Rotation Stress Scale and the Maslach Burnout Inventory. Latent profile analysis method was used to analyse rotation stress characteristics of nurses during standardised training. Additionally, logistic regression was performed to identify the factors influencing different characteristics.
A total of 493 nurses completed the questionnaires, of which 453 were valid (effective response rate 91.88%). Rotation stress was classified into two profiles: a ‘Low Emotional Response–Stress Adaptation’ group (21.5%) and a ‘High Emotional Response–Stress Distress’ group (78.5%). Univariate analysis showed that highest degree (χ² = 11.389, p = 0.001), monthly night shifts (χ² = 33.913) and several further characteristics (χ² = 20.858, 12.319, 35.754 and 15.357; p ≤ 0.002) differed significantly between the two subgroups. Multivariable regression analysis revealed significant associations of monthly night shifts, pretraining work experience, training duration and burnout level with profile membership.
Nurses undergoing standardised training exhibit two distinct rotation stress profiles. Monthly night shifts, pretraining work experience, training duration and burnout are significant factors. Nursing managers should implement targeted interventions such as mindfulness, laughter therapy and emotional freedom techniques to mitigate stress and thereby enhance the quality of standardised training.
Flexible ureteroscopy has advanced modern stone management; however, lower pole renal stones remain a challenge due to suboptimal ureteroscope deflection and navigation using conventional flexible and navigable suction ureteral access sheaths (FANS). The SCULPT trial is designed to assess whether the novel steerable FANS—which enables active controlled deflection—can improve the success rate of lower pole access during flexible ureteroscopy.
This multicentre, prospective, single-blinded, randomised controlled superiority trial will recruit 400 adult patients (aged 18–75 years) with solitary lower pole renal stones ≤2 cm diagnosed by CT from 20 high-volume urological centres in China. Participants will be randomised 1:1 to undergo flexible ureteroscopy with either steerable or conventional FANS. The primary outcome is the success rate of navigating into the lower pole calyx (defined as successful direct stone visualisation, laser lithotripsy and aspiration without adjunct use). Secondary outcomes include immediate and 1 month stone-free rates, operative time, complication profiles (graded by Clavien–Dindo), instrument damage rates, quality-of-life assessments and cost analysis. Statistical analysis will be performed using appropriate tests for continuous and categorical data, with significance assessed against prespecified superiority margins.
The study protocol has been designed in accordance with the Declaration of Helsinki and ICH-GCP guidelines. Ethical approval was centrally granted by the Institutional Review Board of The First Affiliated Hospital of Guangzhou Medical University and adopted by all participating centres following local feasibility review. The trial results will be disseminated via peer-reviewed publication and presentation at international conferences.
Microvascular obstruction (MVO) is a common complication following primary percutaneous coronary intervention (PPCI) for ST-segment elevation myocardial infarction (STEMI) and is strongly associated with adverse clinical outcomes. MVO is a dynamic, multifactorial process shaped by factors spanning the myocardial infarction–reperfusion continuum and by PPCI-related microcirculatory injury, which leaves current early risk stratification—often a static snapshot—with limited power to anticipate its evolution. Renalase, a cardioprotective enzyme, exhibits a post-reperfusion surge that parallels MVO development; periprocedural renalase release may likewise be driven by overlapping mechanisms along the ischaemia–reperfusion pathway. This hypothesis-generating observation supports evaluating the delta-Renalase (periprocedural change in serum renalase) as a candidate association-based biomarker. Accordingly, this study aims to assess whether delta-Renalase is independently associated with MVO in patients with STEMI after PPCI and to evaluate its incremental predictive value, without causal inference.
The Renalase and MicroVascular Obstruction Study (ReMVOS) is a prospective, single-centre, observational cohort study conducted at a nationally accredited chest pain centre in China. We will enrol 266 patients with consecutive STEMI with symptom onset within 12 hours who undergo PPCI. The exposure variable is delta-Renalase, calculated as the increase in serum renalase levels at 24 hours post-PPCI relative to the preprocedural baseline. The primary outcome is the presence of MVO, assessed by cardiovascular magnetic resonance (CMR) performed 2–5 days post-PPCI. Secondary outcomes include infarct size and peak global longitudinal strain quantified by CMR, major adverse cardiovascular events within 90 days and peak oxygen pulse from cardiopulmonary exercise testing (CPET) at the 90-day visit. The independent association and predictive value of delta-Renalase will be evaluated using a prespecified multivariable logistic regression model.
This protocol has been approved by the Ethics Committee of the Third Xiangya Hospital of Central South University (approval No. K24655). All patients will provide written informed consent prior to enrolment. The findings of this study will be disseminated through publications in peer-reviewed international medical journals and presentations at relevant academic conferences.
This study aimed to analyse the burden of myocarditis in the Western Pacific Region (WPR).
Data from the Global Burden of Disease (GBD) Study 2021, covering 31 countries in the WPR, were analysed.
Patients diagnosed with myocarditis.
Numbers and age-standardised rates (ASRs) of incidence, prevalence, mortality and disability-adjusted life years (DALYs), along with their average annual percentage changes (AAPCs), were included. The contributions of population growth, ageing and epidemiological changes to ASR changes were assessed. Additionally, the ASRs of the four indicators were projected to 2035.
In 2021, GBD estimates for myocarditis were 375 241.19 incident cases, 15 307.52 deaths and 379 674.28 DALYs in the WPR. From 1990 to 2021, the numbers of incident cases, prevalent cases and deaths increased by 53.58%, 67.88% and 67.16%, respectively, whereas the number of DALYs decreased by 24.77%. ASRs nonetheless declined: incidence (17.68 to 16.70 per 100 000; AAPC = –0.18, 95% CI –0.19 to –0.18), mortality (0.82 to 0.64 per 100 000; AAPC = –0.78, 95% CI –0.83 to –0.72) and DALYs (35.69 to 19.36 per 100 000; AAPC = –1.97, 95% CI –2.02 to –1.89). Papua New Guinea exhibited the highest increases in incidence, prevalence, deaths and DALYs. Japan, Singapore, China and Kiribati had the highest age-standardised incidence rate (ASIR), prevalence rate (ASPR), mortality rate (ASMR) and DALY rate (ASDR), respectively. Individuals aged ≥65 years and infants had significantly higher ASIR, ASMR and ASDR. Males consistently demonstrated higher myocarditis ASRs than females in the WPR from 1990 to 2021. Ageing was identified as the primary driver of increased incidence and mortality. Projections indicate that the ASIR of myocarditis will remain stable through 2035.
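As background for the trend metric reported above, the AAPC is conventionally derived from joinpoint regression of log-transformed age-standardised rates, as a weighted average of the segment-specific slopes (this explanatory note gives the standard joinpoint formulation, not a GBD-specific variant):

$$\mathrm{AAPC} = \left[\exp\!\left(\frac{\sum_i w_i b_i}{\sum_i w_i}\right) - 1\right] \times 100,$$

where $b_i$ is the regression slope of segment $i$ on the log scale and $w_i$ is the number of years covered by that segment.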
The burden of myocarditis in the WPR exhibits substantial cross-country variation, with males, infants and the elderly disproportionately affected, underscoring the urgent need for context-specific management strategies tailored to high-risk populations and regional epidemiological profiles.
To assess the comparative effectiveness of educational interventions in neurological disease for healthcare workers and students.
Systematic review.
Medline, Embase and Cochrane through to 1 June 2025.
Studies evaluating neurological disease educational interventions with a comparator group (observational cohort/randomised controlled trial (RCT)) were included.
A Preferred Reporting Items for Systematic Reviews and Meta-Analyses-compliant systematic review was conducted (PROSPERO: CRD42023461838). Knowledge acquisition and educational methodologies were collected from each study. Study outcomes were classified using the Kirkpatrick and Kirkpatrick (K&K) four-level model (learner reaction, knowledge acquisition, behavioural change, clinical outcome).1 Risk of bias was assessed using the Newcastle-Ottawa scale for non-randomised studies and the Cochrane Risk of Bias tool for RCTs.2 3
A total of 67 studies involving 4728 participants were included. Of these, 36 were RCTs, and 31 were observational studies. Virtual interventions were the most common (67.2%, n=45 studies), primarily targeting either medical students (46.3%, n=31 studies) or specialists (40.3%, n=27 studies). Overall, 70.1% (n=47) of studies demonstrated outcomes in favour of the intervention. However, few studies used K&K level 3/4 outcomes, with two studies evaluating behaviour change (level 3) and three assessing clinical outcomes (level 4 combined with other levels). No study exclusively assessed level 4 outcomes. Meta-analysis of 22 RCTs with calculable standardised mean differences (SMDs) (n=1748) showed a significant benefit of interventions (SMD 0.75, 95% CI 0.22 to 1.27, p=0.0056).
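For orientation, the standardised mean difference used in the meta-analysis expresses each trial's between-group difference in units of the pooled standard deviation (shown here in its Cohen's d form; whether the review applied a small-sample correction such as Hedges' g is not stated):

$$\mathrm{SMD} = \frac{\bar{x}_{\mathrm{int}} - \bar{x}_{\mathrm{ctrl}}}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\frac{(n_{\mathrm{int}}-1)\,s_{\mathrm{int}}^2 + (n_{\mathrm{ctrl}}-1)\,s_{\mathrm{ctrl}}^2}{n_{\mathrm{int}} + n_{\mathrm{ctrl}} - 2}}.$$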
This review highlights a growing body of research particularly focusing on virtual techniques, specialist audiences and treatment-oriented content. Few studies assessed changes in practice or patient care. Non-specialists remain underrepresented. Future studies should prioritise assessing the clinical impact of educational interventions within non-specialist audiences.
Sense organ diseases (SODs) are among the leading causes of disability worldwide. They severely impact communication, mobility and quality of life, with rising prevalence and widening inequalities across populations. This study aims to provide an updated, comprehensive assessment of the global, regional and national burden and trends of SODs, and to inform strategies for prevention, treatment and health policy development.
This is a population-based observational study using secondary data from the Global Burden of Disease (GBD) 2021 study. SODs, defined in the GBD framework as age-related and other hearing loss (AHL), blindness and vision loss (BVL), and other sensory impairments, were analysed in terms of prevalence and disability-adjusted life years (DALYs). We focused on SODs overall and conducted specific analyses for AHL and BVL, stratified by age, sex and sociodemographic index (SDI).
Global dataset covering 204 countries and territories across all regions and sociodemographic strata from 1990 to 2021.
This study covered the global population represented in the GBD 2021 dataset, using aggregated population-level estimates with no direct individual recruitment.
Not applicable.
Primary outcomes were prevalence (cases and age-standardised prevalence rates) and DALYs (number and age-standardised DALY rates). Secondary outcomes included age–period–cohort effects, decomposition of contributors (population growth, ageing and epidemiological change), inequality metrics and burden projections to 2030.
Between 1990 and 2021, the global age-standardised rate (ASR) of DALYs for SODs increased from 884.07 to 912.8 per 100 000 population. The ASR of prevalence rose from 25 297.36 to 28 050.29 per 100 000. The disease burden increased across all age groups, with females experiencing a higher prevalence of SODs, and population growth and ageing as the leading contributors. AHL emerged as the predominant category of SODs. Socioeconomic disparities widened, with the slope index of inequality for DALYs rising from 128.82 in 1990 to 418.62 in 2021. In 2021, China reported the highest DALYs and case numbers. Predictive analysis showed a stable ASR of DALYs and prevalence, but a continued rise in cases through 2030, with COVID-19 further exacerbating the burden.
The global burden of SODs continues to rise, driven primarily by population ageing and growth, with widening disparities across sociodemographic levels. These findings emphasise the need for targeted prevention strategies, improved early detection and equitable access to sensory healthcare services. Monitoring the long-term impact of COVID-19 and demographic shifts remains a priority.
Not applicable. This study is a secondary analysis of GBD data and is not linked to a clinical trial.
by Wenya Bai, Shixuan Liu, Guilin Zhou, Xuelian Li, Huan Jiang, Jianlin Shao, Junchao Zhu
Background: Microglia polarization plays a crucial role in the progression of cerebral ischemia-reperfusion injury (CIRI), but the underlying mechanisms remain largely undefined. The present study aimed to investigate the mechanism of microglia polarization following CIRI.
Methods: CIRI was modeled in C57BL/6J mice through middle cerebral artery occlusion-reperfusion and in BV2 cells via oxygen and glucose deprivation/reoxygenation. Reverse transcription-quantitative PCR, western blotting, flow cytometry and fluorescence staining were used to detect the expression levels of key proteins associated with microglia polarization, as well as the expression of TNFAIP3 and RACK1. The interaction between TNFAIP3 and RACK1 was verified by co-immunoprecipitation. TNFAIP3 or RACK1 gene interference (overexpression and/or silencing) was employed to examine the role of the TNFAIP3/RACK1 axis in microglia polarization following CIRI.
Results: Arg-1 expression decreased, inducible nitric oxide synthase expression increased and TNFAIP3 was upregulated 24 h after CIRI. TNFAIP3 interacted with RACK1 to deubiquitinate RACK1 and increase its expression. Knockdown of either TNFAIP3 or RACK1 promoted microglial M1 polarization, whereas overexpression of RACK1 promoted M2 polarization. RACK1 exerted its neuroprotective effects through NF-κB, as demonstrated by the use of NF-κB inhibitors.
Conclusion: The present findings indicate that TNFAIP3 inhibits M1 microglial polarization via deubiquitination of RACK1 after CIRI and that RACK1 exerts its effects through NF-κB.