by Omar Muhumed Maidhane, Omran Salih, Abdisalam Hassan Muse, Abdirahman Omer Osman, Muse H. Abdi, Mahdi Hashi Hassan, Nur Mohamud Ali, Shacban Abdilahi Elmi
BackgroundAccess to adequate sanitation remains a critical public health challenge in Somalia, where a large portion of the population relies on unimproved facilities due to persistent conflict, climate shocks, and political instability. This reliance contributes to a high burden of waterborne diseases. This study aimed to assess the spatial distribution of unimproved sanitation and identify its individual and community-level determinants using recent national data to inform targeted interventions.
MethodsThis study is a secondary analysis of the 2022 Somalia Integrated Household Budget Survey (SIHBS), which included 7,212 households. The primary outcome was the use of unimproved sanitation facilities, categorized according to the WHO/UNICEF Joint Monitoring Programme (JMP) definitions. We employed a multilevel logistic regression model to identify individual and community-level determinants associated with unimproved sanitation. To analyze the spatial patterns of unimproved sanitation, we used Global Moran’s I for spatial autocorrelation and the Getis-Ord Gi* statistic for hotspot analysis.
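The Global Moran's I statistic named in the Methods summarizes whether similar values cluster spatially. A minimal pure-Python sketch of the statistic is below; the values and adjacency structure are invented for illustration and are not the study's GIS implementation or data.

```python
# Sketch of Global Moran's I for spatial autocorrelation. Illustrative only:
# the region values and neighbour weights below are made up, not survey data.

def morans_i(values, weights):
    """Global Moran's I.

    values  : list of region-level rates (e.g., % unimproved sanitation)
    weights : dict mapping (i, j) -> spatial weight (1 for neighbours here)
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(weights.values())
    cross = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    ss = sum(d * d for d in dev)
    return (n / w_sum) * (cross / ss)

# Four regions on a line; adjacent regions are neighbours (symmetric weights).
vals = [0.0, 0.0, 1.0, 1.0]
w = {(0, 1): 1, (1, 0): 1, (1, 2): 1, (2, 1): 1, (2, 3): 1, (3, 2): 1}
i_stat = morans_i(vals, w)  # positive: similar values cluster together
```

Positive I indicates clustering (as in the hotspot findings), values near zero indicate spatial randomness, and negative values indicate dispersion.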
ResultsOverall, 36.87% of Somali households use unimproved sanitation facilities. There are significant disparities across residence types, with the highest prevalence among nomadic populations (83.28%), followed by rural (51.10%) and urban (23.88%) residents. The multilevel analysis revealed that households in permanent/formal housing (AOR: 3.42) and those with IDP status (AOR: 3.18) had significantly higher odds of using unimproved sanitation. At the community level, urban residence was paradoxically associated with higher odds of unimproved sanitation (AOR: 7.99) compared to rural areas, while nomadic populations had significantly lower odds (AOR: 0.04), likely reflecting a high prevalence of open defecation not captured as a “facility.” Spatial analysis identified significant hotspots of unimproved sanitation in the Hiraan (90.65%) and Bay (80.39%) regions, and cold spots in Banadir (5.37%) and Lower Shabelle (3.70%).
ConclusionThe findings highlight deep inequalities in sanitation access across Somalia, driven by geographic location, socioeconomic status, and population group. The high prevalence of unimproved sanitation, especially among nomadic, rural, and displaced populations, calls for urgent, geographically targeted interventions. A multi-pronged approach is necessary, focusing on the specific needs of different communities and addressing the underlying structural and individual-level drivers of poor sanitation to advance public health and sustainable development goals in the region.
by Shuhong Zheng, Renxiu Bian, Haixin Song, Zhiping Liao, Ting Gao, Min Yan, Heqing Huang, Zuodong Lou, Fangchao Wu, Jianhua Li
BackgroundLow-intensity focused ultrasound (LIFU) is a non-invasive neuromodulation technique with high spatial precision and the ability to reach deeper brain regions, offering potential advantages for post-stroke rehabilitation. Repetitive transcranial magnetic stimulation (rTMS) is a widely adopted non-invasive brain stimulation technique that modulates cortical excitability to promote neuroplasticity. However, direct head-to-head comparisons between these two modalities for post-stroke motor recovery remain limited.
ObjectiveTo perform a secondary head-to-head comparison of LIFU and repetitive transcranial magnetic stimulation (rTMS) for motor recovery after stroke, based on a prospectively registered randomized controlled trial.
MethodsThis secondary analysis included patients with subacute stroke who received two weeks of standard rehabilitation combined with either LIFU (n = 25) or rTMS (n = 25) targeting the ipsilesional primary motor cortex. LIFU parameters: 0.5 MHz, spatial-peak pulse-average intensity (ISPPA) 10.2 W/cm² (free-field), pulse duration 0.2 ms, duty cycle 20%, 20 minutes per session, five days per week for two weeks (10 sessions total). rTMS parameters: 10 Hz, 80% resting motor threshold, 1,000 pulses per session (20 trains of 5 seconds), 20 minutes per session, five days per week for two weeks (10 sessions total). Motor outcomes were assessed using the Fugl–Meyer Assessment (FMA; upper and lower extremities), Modified Barthel Index (MBI), and Brunnstrom stages. Resting-state functional near-infrared spectroscopy (fNIRS) was used to evaluate cortical activity and functional connectivity before and after the intervention. Primary analyses were conducted in the intention-to-treat (ITT) population (n = 50), with completer analyses (n = 43) performed as sensitivity analyses.
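The stated stimulation parameters are internally consistent, which can be checked with simple arithmetic. The LIFU pulse repetition frequency below is our derived quantity (duty cycle = pulse duration × PRF), not a value stated in the abstract.

```python
# Consistency check of the stimulation parameters reported in Methods.
# The LIFU pulse repetition frequency (PRF) is derived, not stated:
# duty cycle = pulse duration x PRF.

rtms_freq_hz = 10   # stated rTMS frequency
train_s = 5         # stated train length (seconds)
n_trains = 20       # stated number of trains per session
pulses_per_session = rtms_freq_hz * train_s * n_trains  # matches the stated 1,000

lifu_pulse_s = 0.2e-3   # stated pulse duration (0.2 ms)
lifu_duty = 0.20        # stated duty cycle (20%)
lifu_prf_hz = lifu_duty / lifu_pulse_s  # implied PRF of 1,000 Hz
```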
ResultsBoth groups showed significant within-group improvements in FMA and MBI after the intervention, with no significant between-group differences in post-intervention scores (all p > 0.05), and completer analyses yielded consistent between-group conclusions. In contrast, change-from-baseline analyses demonstrated greater improvements in FMA scores in the LIFU group compared with the rTMS group (ΔFMA upper limb: median 7 [IQR 3–10.5] vs. 2 [1–3], p = 0.001; lower limb: 3 [1–4.5] vs. 1 [0–1.5], p < 0.05).
ConclusionLIFU and rTMS were associated with comparable short-term motor outcomes in subacute stroke. Differences observed in change-from-baseline motor improvements and exploratory neuroimaging measures suggest potential divergence in recovery dynamics and cortical modulation, warranting further investigation in larger, longitudinal studies.
Trial registrationThis study was derived from a prospectively registered, three-arm randomized controlled trial in the Chinese Clinical Trial Registry (ChiCTR2500114687). The present manuscript reports a secondary head-to-head comparison between the two neuromodulation intervention arms.
by Woong Sik Jang, Young Lan Choe, Soo Young Yoon, Chae Seung Lim, Min-Chul Cho
BackgroundCandida auris is an emerging multidrug-resistant yeast associated with invasive infections, healthcare-associated outbreaks, and high mortality, and is often misidentified by conventional diagnostic methods. Rapid, accurate, and scalable screening tools are essential for effective infection control, particularly in high-risk settings.
Materials and methodsWe developed a multiplex loop-mediated isothermal amplification (LAMP) assay that combines a broad-range Candida Pan target with a C. auris–specific target in a single isothermal reaction. Assay conditions were optimized for primer ratio and temperature, and analytical sensitivity was evaluated using serial dilutions of culture-derived C. albicans and C. auris DNA, as well as contrived specimens consisting of urine, swab, and whole-blood matrices. Clinical performance was assessed using 35 Candida-positive clinical specimens (blood, urine, ear swabs) and 94 non-infectious controls. Results were compared with Candida Pan qPCR and C. auris qPCR. Cross-reactivity was tested against common bacterial isolates.
ResultsUnder optimized conditions (1:1 primer ratio, 64 °C), the assay allowed species-level discrimination, with C. auris positive for both Pan and auris channels and C. albicans positive only for the Pan channel. The C. auris-specific LAMP probe detected approximately 10²–10³ cells/mL in culture-derived and contrived specimens, showing a 1–2 log improvement over C. auris qPCR (10⁴–10⁵ cells/mL), while the Pan LAMP channel detected C. auris at around 10⁵ cells/mL. In clinical specimens, Pan LAMP detected Candida spp. in 34/35 cases (97.14%) versus 32/35 (91.43%) for Pan qPCR. All C. auris–positive specimens (9/9) were detected by the multiplex LAMP assay, compared with 6/9 (66.7%) by Pan qPCR. All 94 non-infectious controls and all bacterial isolates tested negative, indicating 100% clinical specificity and absence of cross-reactivity.
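The detection rates and specificity in the Results are simple proportions over the stated counts. A sketch, using only the counts given in the abstract:

```python
# Detection rates and specificity as proportions over the stated counts
# (no data beyond the abstract are used).

def pct(hits, total):
    return round(100.0 * hits / total, 2)

pan_lamp = pct(34, 35)     # Pan LAMP on Candida-positive specimens
pan_qpcr = pct(32, 35)     # Pan qPCR on the same specimens
auris_lamp = pct(9, 9)     # multiplex LAMP on C. auris-positive specimens
specificity = pct(94, 94)  # non-infectious controls testing negative
```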
ConclusionThe multiplex Candida Pan/auris LAMP assay provides a rapid, highly sensitive, and specific alternative to qPCR for C. auris screening, while preserving broad Candida detection in a single isothermal reaction. Its improved analytical and clinical sensitivity suggests strong potential for use in active surveillance and infection-control programs, particularly in settings where timely identification and containment of C. auris are critical.
by Linda Abou-Abbas, Rima Kashash, Mustapha Khalife, Mohamad Shafic Ramadan
BackgroundEffective preparedness and response to mass casualty incidents (MCI) are essential for hospital safety, operational efficiency, and the delivery of timely, high-quality patient care during emergencies. This study assessed the Code Orange plan of a tertiary government hospital in Lebanon by reviewing documentation for alignment with international guidelines and evaluating staff knowledge, attitudes, and practices (KAP) regarding MCI preparedness.
MethodsDocuments reviewed at Rafik Hariri University Hospital (RHUH) included the current Code Orange plan, relevant policies, and international guidelines. A comprehensive evaluation framework was used, focusing on preparedness, incident command systems, communication, and management. A comparison with established standards was conducted to identify gaps. Complementing this, a cross-sectional study was conducted using a convenience sample of medical and non-medical healthcare workers to evaluate their KAP regarding MCI preparedness.
ResultsThe desk review of the RHUH Code Orange plan identified both strengths and significant gaps in MCI preparedness. While the plan defines staff roles and resources for emergency response, it lacks detailed procedures for activation strategies, surge capacity, continuity of essential services, and triage processes. Additionally, post-event recovery protocols are insufficient or absent, and the importance of regular drills is not adequately emphasized. The KAP study revealed significant differences between medical and non-medical staff in terms of MCI knowledge, involvement, and training engagement, with medical staff reporting higher levels of familiarity and desire for participation.
ConclusionThe findings underscore the need to bridge knowledge and engagement gaps between medical and non-medical staff to enhance MCI response. Key actions include interdisciplinary training to build coordination, clear communication protocols to streamline information flow, and routine drills with defined roles to strengthen preparedness. Additionally, implementing performance monitoring during drills and real MCIs, along with conducting regular evaluations, will allow for continuous refinement of response strategies.
by Ching-Chung Ko, Jheng-Yan Wu, Kuo-Chuan Hung, Shu-Wei Liao, Ya-Wen Tsai, Tsung Yu, Chien-Ming Lin, I-Wen Chen
PurposeCOVID-19 infection has been associated with cardiovascular complications, including new-onset atrial fibrillation/flutter (NOAF). However, the potential protective effect of COVID-19 vaccination against long-term NOAF risk following COVID-19 infection remains unclear.
MethodsThis retrospective cohort study used the TriNetX Research Network to identify adults diagnosed with COVID-19. Patients were divided into a vaccine group and control group (unvaccinated). After propensity score matching (238,750 patients per group), we assessed the primary outcome of 24-month NOAF incidence, with secondary outcomes at 1, 6 and 12 months. Subgroup analyses examined effects across patient characteristics and comorbidities. Sensitivity analysis was performed by excluding patients with severe COVID-19 illness.
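Propensity score matching, as named in the Methods, pairs each treated patient with a control of similar estimated treatment probability. The sketch below shows one common variant (greedy 1:1 nearest-neighbour matching with a caliper) on precomputed scores; TriNetX's actual matching algorithm and the scores themselves are not given in the abstract, so everything here is illustrative.

```python
# Sketch of greedy 1:1 nearest-neighbour propensity score matching with a
# caliper, assuming propensity scores were already estimated (e.g. by
# logistic regression on baseline covariates). Illustrative only.

def greedy_match(treated, controls, caliper=0.1):
    """Pair each treated unit with the closest unused control score.

    treated, controls : lists of (id, propensity_score) tuples
    Returns a list of (treated_id, control_id) pairs.
    """
    pairs = []
    used = set()
    for tid, ts in sorted(treated, key=lambda x: x[1]):
        best, best_d = None, caliper
        for cid, cs in controls:
            d = abs(ts - cs)
            if cid not in used and d <= best_d:
                best, best_d = cid, d
        if best is not None:
            used.add(best)
            pairs.append((tid, best))
    return pairs

# Hypothetical scores: two vaccinated patients, three unvaccinated candidates.
vaccinated = [("v1", 0.31), ("v2", 0.62)]
unvaccinated = [("c1", 0.30), ("c2", 0.60), ("c3", 0.95)]
matches = greedy_match(vaccinated, unvaccinated)
```

The caliper discards pairs whose scores differ too much (here, by more than 0.1), trading sample size for better covariate balance.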
ResultsThe 24-month NOAF incidence was significantly lower in the vaccine group compared to the control group (1.91% vs 2.18%; HR: 0.82, 95% CI: 0.78–0.85). This protective effect was also observed at 1 month (HR: 0.73, p < 0.05).
ConclusionCOVID-19 vaccination was associated with a significantly reduced 24-month risk of NOAF after COVID-19 infection. These findings suggest vaccination may mitigate long-term cardiovascular sequelae of COVID-19. Future research should elucidate underlying protective mechanisms and optimize vaccination strategies for cardiovascular protection, particularly in high-risk populations.
by Amma Aboagyewa Larbi, Moses Etsey, Obed Brew, Bismark Koduah, Rosemond Enam Mawuenyega, Emmanuel Kobla Atsu Amewu, Nehemiah Kweku Essilfie, Solomon Wireko, Alexander Kwarteng, Ben Adu Gyan
The human gut microbiome, consisting of bacteria, archaea, fungi, and viruses, influences various physiological processes of the body. The gut microbiome composition is shaped by factors such as diet, geography, and antibiotic use. Malaria has been a global health challenge over the years, especially in low- and middle-income countries. This study investigated how asymptomatic malaria infection altered gut microbial communities in Ghanaian children, offering insights for novel malaria control strategies. Standard aseptic phlebotomy procedures were employed to collect venous blood samples for Plasmodium species detection. The gut microbial community was profiled by sequencing the 16S rRNA V4 region, and sequence data were processed using the DADA2 pipeline in R. Asymptomatic malaria infections were predominantly mixed infections with P. falciparum and P. malariae. Microbiome analysis revealed that Firmicutes and Bacteroidetes comprised nearly 70% of the total microbial population. Asymptomatic individuals showed a decrease in Firmicutes abundance from 52.5% to 44.0% and an increase in Bacteroidetes from 34.7% to 45.6%. There was also a slight increase in the abundance of Proteobacteria from 3.0% to 4.8%. At the genus level, Prevotella_9 was the most abundant and exhibited the highest variability in the infected groups. The Alloprevotella and Streptococcus genera increased in both infected groups, but Escherichia-Shigella was significantly elevated in only those with mixed infections. Faecalibacterium significantly declined in asymptomatic malaria-infected individuals compared to healthy controls, with variability further reduced in mixed infections. Beta-diversity analysis indicated a significant effect of malaria status on microbial composition (PERMANOVA, p < 0.05).
by Peng Zhou, Sitong Chen, Yingli Li, Yan Li
PurposeThis study aimed to develop a machine learning-based prediction model for myopia progression using ocular biometric parameters to provide an objective assessment tool for clinical practice.
MethodsA retrospective analysis was conducted on patients treated at Shanghai Parkway Health Ophthalmology Department as the training set, and myopic individuals from the Optometry Center of Peking University People’s Hospital as the validation set. Demographic and biometric data were collected, including central corneal thickness (CCT), axial length (AL), corneal curvature (K-value), anterior chamber depth (ACD), corneal diameter (WTW), and pupil size (PS). Seven machine learning models (e.g., XGBoost, random forest, support vector machine) were employed for modeling, with performance optimized via 5-fold cross-validation. Model accuracy was evaluated using mean squared error (MSE) and the coefficient of determination (R²), and variable importance was analyzed.
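The evaluation metrics named in the Methods (k-fold cross-validation, MSE, R²) can be sketched with a minimal single-feature least-squares model. This is only an illustration of the metrics, not a reproduction of the study's XGBoost pipeline; the data below are synthetic.

```python
# Minimal sketch of k-fold cross-validated MSE and R-squared using a
# single-feature least-squares model on synthetic data. Illustrative only.

def fit_ols(xs, ys):
    """Closed-form univariate least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def kfold_scores(xs, ys, k=5):
    """Deterministic interleaved k-fold split; returns (MSE, R^2) per fold."""
    n = len(xs)
    scores = []
    for f in range(k):
        test_idx = set(range(f, n, k))
        tr = [(x, y) for i, (x, y) in enumerate(zip(xs, ys)) if i not in test_idx]
        te = [(x, y) for i, (x, y) in enumerate(zip(xs, ys)) if i in test_idx]
        b, a = fit_ols([x for x, _ in tr], [y for _, y in tr])
        truth = [y for _, y in te]
        preds = [b * x + a for x, _ in te]
        ss_res = sum((p - t) ** 2 for p, t in zip(preds, truth))
        my = sum(truth) / len(truth)
        ss_tot = sum((t - my) ** 2 for t in truth)
        scores.append((ss_res / len(te), 1 - ss_res / ss_tot))
    return scores

# Noise-free linear data: each fold should recover MSE ~ 0 and R^2 ~ 1.
xs = list(range(10))
ys = [2 * x + 1 for x in xs]
scores = kfold_scores(xs, ys)
```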
ResultsNo statistically significant differences were observed in baseline characteristics between the training and validation sets (all P > 0.05). The XGBoost model demonstrated the best performance, achieving R² = 0.913 (MSE = 0.005) on the training set and R² = 0.766 (MSE = 0.016) on the test set. Variable importance analysis revealed pupil size (score 100) and corneal thickness (40.88) as the key predictors of axial elongation rate, followed by age of onset (17.96).
ConclusionThe machine learning-based prediction model effectively utilizes ocular biometric data to assess myopia progression risk, with pupil size and corneal thickness identified as core predictive factors. This model provides a quantitative tool for early clinical intervention. Future studies should expand the sample size and incorporate additional biomarkers to optimize performance.
by Mequanent Dessie Bitewa, Thomas Kidanemariam Yewodiaw, Aysheshim Asnake Abneh, Mikias Getahun Molla, Mulat Belay Simegn, Tadele Sinishaw Jemere, Mequannt Alemu Endayehu, Aysheshim Belaineh Haimanot, Werkneh Melkie Tilahun, Atirsaw Assefa Melikamu, Tadele Derbew Kassie
BackgroundCervical cancer is preventable, yet it remains a leading cause of cancer death in women. About 90% of cases and 94% of deaths occur in low- and middle-income countries (LMICs). Limited access to screening drives high incidence and mortality. Screening is central to secondary prevention and global elimination efforts.
ObjectiveThis study aimed to assess the determinants of cervical cancer screening among women aged 30–49 years in low- and middle-income countries using a multilevel analysis.
MethodsA cross-sectional study used nationally representative data from 148,605 weighted women aged 30–49 years in 20 LMICs (2019–2024). Multilevel logistic regression identified factors associated with cervical cancer screening while accounting for cluster-level variation. Statistical significance was set at p < 0.05.
ResultsOverall cervical cancer screening uptake was 14.03% (95% CI: 13.63–14.45%), ranging from 0.92% in Mauritania to 42.98% in Zambia. Higher screening was associated with older age (40–49 years: AOR = 1.48; 95% CI: 1.41–1.54), occupation (AOR = 1.15; 95% CI: 1.10–1.21), contraceptive use (AOR = 1.38; 95% CI: 1.31–1.44), recent health-facility visit (AOR = 1.93; 95% CI: 1.84–2.02), prior abortion (AOR = 1.28; 95% CI: 1.22–1.34), female-headed households (AOR = 1.11; 95% CI: 1.05–1.18), high community education (AOR = 1.63; 95% CI: 1.49–1.79), and high media exposure (AOR = 2.54; 95% CI: 2.30–2.80). Lower uptake was observed among individuals in high-poverty communities (AOR = 0.63; 95% CI: 0.57–0.68), those with higher parity (1–4 births: AOR = 0.86; 95% CI: 0.78–0.94; five or more births: AOR = 0.66; 95% CI: 0.59–0.73), and those residing in rural areas (AOR = 0.89; 95% CI: 0.82–0.97).
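Adjusted odds ratios and 95% CIs like those above come from exponentiating a fitted logistic-regression coefficient and its Wald interval. The coefficient and standard error below are made-up illustrations, not estimates from the survey data.

```python
# How an AOR and its 95% CI are derived from a logistic model coefficient:
# AOR = exp(beta), CI = exp(beta +/- 1.96 * SE). Values below are invented.
import math

def aor_ci(beta, se, z=1.96):
    """Return (AOR, lower 95% bound, upper 95% bound) for a log-odds beta."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

aor, lo, hi = aor_ci(beta=0.4, se=0.1)  # hypothetical coefficient and SE
```

A CI that excludes 1 (as for every predictor reported above) corresponds to a coefficient significantly different from zero at the 5% level.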
ConclusionCervical cancer screening uptake in LMICs is far below the WHO 2030 target, with wide country disparities. Socio-demographic factors, health-facility contact, and community education increase uptake, while poverty and geographic barriers reduce it. Integrating screening into routine reproductive and maternal care, strengthening community and media education, and addressing structural barriers to access are essential to improving coverage.
by Sosina Workineh Tilahun, Adiam Nega, Lealem Wagaw, Adamu Addissie
BackgroundShared decision-making is crucial for aligning treatment options with patient values and preferences. However, shared decision-making in clinical cancer care in Ethiopia, cervical cancer care being no exception, is currently not well understood.
AimThis study aimed to assess the perceived level of shared decision-making and its predictors in cervical cancer care at Tikur Anbessa Specialized Hospital in Addis Ababa, Ethiopia.
MethodsWe employed a convergent parallel mixed-methods study design from February 18 to May 23, 2025, at Tikur Anbessa Specialized Hospital. The study used interviewer-administered questionnaires for 203 cervical cancer patients and in-depth interviews with 15 cervical cancer patients and 10 clinical oncologists. Using SPSS v26, multiple linear regression analysis was used to determine significant predictors of the perceived level of shared decision-making, with statistical significance set at P < 0.05.
ResultsThe overall mean score for the perceived level of shared decision-making was 24.94 (± 9.12), with a range of 7–44, and the standardized mean score was 2.77 (± 1.01). The perceived level of shared decision-making had positive linear associations with increased trust in oncologists (0.32, 95% CI (0.21, 0.44); P < 0.05).
ConclusionsThe study emphasized the complex interplay of factors influencing the practice of shared decision-making in clinical care of cervical cancer. Therefore, understanding these dynamics may help to enhance the practice of shared decision-making in clinical cervical cancer care.
by Marius König, Jonas P. Wallraff, Florian Glenewinkel, Ursula Wild, Thomas C. Erren, Philip Lewis
BackgroundTeachers play a key role in society and make up ~1.5–2.5% of the working population. Yet, there is a teacher shortage in many countries and preventive occupational medicine strategies are called for. The primary objective of this project is to explore single and joint associations of the diurnal distributions of light, activity, meal, and sleep timing and work-related exposures with severity scores of burnout, anxiety, and depression in a cross-sectional study of secondary school teachers in Germany.
Methods and analysisThe study will involve a one-time collection of questionnaire-based data on sleep, burnout, anxiety, and depression, sensor-based data on light and activity over one week, and diary-based data on work, sleep, and meals over one week. The protocol has been registered on the Open Science Framework (https://doi.org/10.17605/OSF.IO/U4R5M).
DiscussionFrom a preventive occupational medicine perspective, identifying where and how light, activity, meal, and sleep timing may be targeted to mitigate burnout, anxiety, and depression could inform measures to be tested not only at the individual (micro) level, but also at systems (meso-institutions; macro-policy and society) levels.
by Wajdi Amayreh, Mohammad Al-Magableh, Jomana Alsulaiman, Mahdi Alshboul, Maan Amayreh, Ahmad Al-Maqableh, Razan Qasem, Tamara Al-Nemrat
BackgroundBreastfeeding is a key determinant of infant health and survival; however, exclusive breastfeeding (EBF) rates remain low worldwide. Various maternal, infant, and socioeconomic factors influence feeding practices.
ObjectiveThe main objective of this study was to identify maternal, infant, and socioeconomic determinants of infant feeding practices during the first six months of life among mothers in northern Jordan.
MethodsA prospective cross-sectional study was conducted at Princess Rahma and Prince Rashid Hospitals in Irbid City, northern Jordan, from December 2023 to February 2024. Mothers of healthy infants aged 6–24 months participated in a survey that gathered information on their demographics, feeding practices, and other infant-related details. Statistical analyses were performed to identify the associations and key predictors of feeding type.
ResultsAmong the 508 mothers who participated in this study, 29.9% were exclusively breastfeeding, 46.5% used mixed feeding, and 23.6% opted for formula feeding. The key factors influencing these choices included maternal health issues, work hours, and infant birth weight. Maternal illness was identified as the strongest predictor of exclusive artificial feeding (AOR = 12.72; 95% CI: 4.10–39.45; P < 0.05).
ConclusionThis study highlighted the low exclusive breastfeeding rate, emphasizing the need for improved support systems to encourage breastfeeding in the form of workplace accommodations and healthcare counseling to address barriers to its practice.
by Chong Gang, Hao Wang, Yujue Wang, Yongsheng Lan
The purpose of this study was to examine whether short-duration phase-change material (PCM) cooling applied at different temperatures influences acute recovery following fatigue induced by stretch–shortening cycle (SSC) exercise. Sixty-four physically active participants were randomly assigned to a 5°C, 10°C, or 15°C PCM cryotherapy group or a passive recovery control group. After completing an SSC fatigue protocol, participants underwent a 15-minute PCM intervention, and peak torque (PT), mean power, rate of force development (RFD), countermovement jump (CMJ) performance, rating of perceived exertion (RPE), modified endurance ratio (MER), and vastus lateralis (VL) and rectus femoris (RF) stiffness were assessed immediately after fatigue (Imm-fatigue), immediately after PCM cryotherapy (Imm-PCM), and 60 minutes post PCM cryotherapy (Post60-PCM). Mean power and RFD were significantly greater in the PCM groups compared with the control group at Imm-PCM (P ≤ 0.01), with mean power remaining elevated in the 15°C PCM group at Post60-PCM (P ≤ 0.05). RPE was significantly lower in all PCM groups at Imm-PCM and Post60-PCM compared with control (P ≤ 0.01). No between-group differences were observed for PT, CMJ, MER, or muscle stiffness, and no temperature-dependent effects were detected within the 5–15°C range. These findings indicate that 15-minute PCM cryotherapy selectively accelerates early-phase neuromuscular and perceptual recovery without affecting maximal strength, endurance capacity, or passive muscle mechanical properties. From an applied perspective, PCM cryotherapy may be an effective strategy to enhance explosive performance and perceived readiness during short recovery intervals in training or competition settings.
by Jakob Brandstetter, Lea Goldstein, Tim Schreiber, Rupert Palme, Tobias Lindner, Markus Joksch, Bernd Krause, Brigitte Vollmar, Simone Kumstel
Pancreatic cancer is the third leading cause of cancer-related death, with a 5-year survival rate of only 10%. Preclinical studies remain essential for identifying novel therapeutic strategies, discovering biomarkers, and deepening the understanding of disease biology. The most frequent driver mutation in pancreatic cancer is the G12D mutation in the KRAS gene, present in approximately 90% of tumors. A recent study demonstrated complete regression of KRAS-driven pancreatic cancer upon systemic ablation of the up- and downstream signaling proteins EGFR and C-RAF. Building on these findings, we investigated the therapeutic benefit of combining the EGFR inhibitor erlotinib with the novel pan-RAF inhibitor LXH-254. The anticancer effects of this combination were assessed in vitro in murine and human pancreatic cancer cell lines by evaluating cell proliferation, cell death, and phosphorylation of key signaling proteins. Subsequent in vivo studies were performed in an orthotopic murine pancreatic cancer model and in genetically engineered KPC mice, using daily oral administration of LXH-254 (35 mg/kg) and erlotinib (75 mg/kg). While the treatment robustly inhibited MAPK signaling and caused significant anti-proliferative effects in vitro, it did not improve survival or reduce tumor burden in either in vivo model. These results contrast with previous reports of efficacy from monotherapies in xenograft models, highlighting the limitations of current preclinical approaches. Our findings underscore the need to develop more effective pathway-targeted inhibitors and preclinical models that predict clinical outcomes more accurately.
by Jabir Aman, Bikila Balis, Naol Oda, Dawit Tamiru, Tadesse Gure Eticha, Dawit Firdisa, Aboma Motuma
BackgroundMeconium aspiration syndrome is a life-threatening respiratory disease affecting around 5% of neonates worldwide. Although several studies have been conducted in developed countries, data on meconium aspiration syndrome and its associated factors remain limited in low-resource settings, including Ethiopia. Therefore, this study aimed to determine the prevalence of meconium aspiration syndrome and its associated factors among neonates admitted to the neonatal intensive care unit at public hospitals in the Harari region, Eastern Ethiopia.
MethodA retrospective hospital-based cross-sectional study was conducted among all neonates admitted from January 1 to December 30, 2023, and data were extracted from patient charts during April 1–30, 2025. A simple random sampling technique was employed to select 417 charts of neonates admitted to the neonatal intensive care unit. The data were collected using a data extraction checklist via Kobo Toolbox. Descriptive statistics and binary logistic regression were used in SPSS version 25 (IBM Corp., Armonk, NY, USA) for the analysis. Adjusted odds ratios with 95% confidence intervals were used to declare statistical significance at a p-value ≤ 0.05.
ResultsThe prevalence of meconium aspiration syndrome among neonates admitted to the neonatal intensive care unit was 24.2% [95% CI, 20.2–28.6]. Factors significantly associated with meconium aspiration syndrome were post-term gestation [AOR = 9.05, 95% CI 2.38–34.41], antepartum hemorrhage [AOR = 3.34, 95% CI 1.31–8.60], prolonged labor [AOR = 3.06, 95% CI 1.27–7.36], premature rupture of membranes [AOR = 3.65, 95% CI 1.28–10.45], low Apgar scores at 5th minute [AOR = 11.27, 95% CI 3.44–36.92] and intrapartum thick meconium passage [AOR = 5.98, 95% CI 2.6–13.6].
Conclusions and recommendationsThese findings indicate a high prevalence of meconium aspiration syndrome, and to reduce its impact, targeted clinical interventions should be implemented. Pregnancies reaching 42 weeks of gestation, prolonged labor, and high-risk conditions such as antepartum hemorrhage, premature rupture of membranes, or the presence of thick meconium are important factors to consider. Careful monitoring and appropriate management may be warranted in these cases.
by Ana Caroline Bini de Lima, Vanessa Cristini Sebastião da Fé, Maria Simara Palermo Hernandes, Emily Caroline Pfeifer de Cristo, Ana Gabrieli dos Santos Fagundes Euzébio, Maria Vitória e Silva Sousa, Fabiana Ribeiro Caldara, Viviane Maria Oliveira dos Santos
This study aimed to evaluate the ability of social noncontact environmental enrichment to facilitate social buffering and to characterize the emotional experience of horses subjected to restraint in stocks by assessing physiological parameters and facial expressions. Pantaneiro horses (n = 11) were evaluated in a crossover design with two treatments: social noncontact enrichment during stock restraint and social isolation during stock restraint. Physiological parameters (heart rate, heart rate variability, respiratory rate, ocular temperature by infrared thermography, and auricular temperature by infrared thermometer) and facial expressions (EquiFACS) were assessed throughout the 24-minute restraint period. When horses were accompanied by a conspecific, heart rate, respiratory rate, and eye temperature were lower (p < 0.05). The occurrence of facial actions including nostril dilator (AD38), inner brow raiser (AU101), upper eyelid raiser (AU5), eye white increase (AD1), ears forward (EAD101), and ears back (EAD104) was also lower (p < 0.05).
by Yang Tong, Huang Qianzhen, Tan Bo, Hu Bin, Zhang Min
BackgroundAdvancing the development of centers for disease control and prevention (CDCs) has become a priority within global public health governance. However, public health governance capacity varies significantly among CDCs across different countries and regions, and grassroots CDCs face particular disadvantages. Establishing stable, efficient collaborative development mechanisms among CDCs across diverse regions to maximize overall effectiveness and ensure sustainable development represents a critical public health science issue.
ObjectiveThis study aims to provide scientific references and a theoretical foundation for the coordinated development of grassroots CDCs within the Chengdu–Chongqing Economic Circle (CCEC) and the construction of public health systems.
MethodsA questionnaire for collaborative development needs indicators in grassroots CDCs, comprising 4 primary needs and 13 secondary needs, was developed through literature review, the Delphi expert consultation method, and the Kano model. Analysis focused on questionnaires collected from eight grassroots CDCs within the CCEC. The importance of needs was ranked using the better–worse coefficient and satisfaction sensitivity analysis.
ResultsAnalysis of the 110 valid questionnaires showed that for the must-be attribute, satisfaction sensitivity ranked as follows: performance compensation (0.883) > talent exchange and scientific research and innovation cooperation (0.824) > public health emergency rescue mechanism (emergency material reserve and cross-regional material mobilization; 0.817) > cross-regional case monitoring, investigation, and tracking (0.775). Regarding the one-dimensional attribute, the satisfaction sensitivity ranking was joint risk assessment and emergency command (0.937) > business archive co-construction and sharing mechanism (emergency response plans and technical schemes) (0.909) > regional co-construction and sharing between universities and the local area (0.832). For the attractive attribute, the satisfaction sensitivity ranking was regional monitoring and early-warning information management system (0.922) > community chronic disease prevention and service (0.804) > coordinated transfer and diversion diagnosis and treatment of patients with infectious diseases within the region (0.734). However, the collaborative release and interaction mechanism of social integrated media information, public health collaborative governance entities, and the construction of a cross-regional expert database constitute indifferent attributes.
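Kano-model analyses like this one commonly summarize each need with better-worse (satisfaction) coefficients. The sketch below uses the standard Berger et al. formulation; the abstract does not specify the authors' exact "satisfaction sensitivity" computation, and the response counts here are invented for illustration.

```python
# Sketch of Kano better-worse coefficients (Berger et al. formulation).
# A, O, M, I are counts of respondents classifying a need as attractive,
# one-dimensional, must-be, or indifferent. Counts below are invented.

def better_worse(attractive, one_dimensional, must_be, indifferent):
    total = attractive + one_dimensional + must_be + indifferent
    better = (attractive + one_dimensional) / total   # satisfaction if fulfilled
    worse = -(one_dimensional + must_be) / total      # dissatisfaction if absent
    return round(better, 3), round(worse, 3)

b, w = better_worse(attractive=20, one_dimensional=30, must_be=40, indifferent=10)
```

Needs with both coefficients near zero behave as indifferent attributes, matching the abstract's final grouping.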
ConclusionsThis study provides preliminary scientific evidence for the precise allocation of public health resources and the establishment of localized collaborative development mechanisms. Simultaneously, the research methodology and analytical framework offer new theoretical references for similar studies in other regions globally.
by Munawar Farooq, Uffaira Hafeez, Amir Ahmad, Susan Waller, Gabriel Andrade, Arif Alper Cevik, Syed Fahad Javaid
BackgroundStress is a prevalent issue among university students and is linked to adverse academic and emotional outcomes. While research emphasizes the roles of resilience, personality traits, and psychosocial factors, most studies are drawn from North American and European contexts.
ObjectivesThis is the first study of its kind in the United Arab Emirates (UAE) exploring the relationship between perceived stress, resilience, and personality traits among university students, offering insights into region-specific influences on emotional well-being.
MethodsAn online cross-sectional survey was conducted among 168 students from two colleges at the United Arab Emirates University (79% College of Medicine and Health Sciences, 21% College of Information Technology; 72% female). Data were analyzed using descriptive statistics and regression models in R version 4.2.0. Personality traits were assessed using the Ten-Item Personality Inventory, perceived stress was measured with the Perceived Stress Scale, and resilience was evaluated with the Brief Resilience Scale.
ResultsThe median perceived stress score was 22 (IQR: 17–28), and 30% reported high stress. Multivariable analysis showed that heavier academic workload, financial difficulties, lack of social support, lower physical activity, and poorer academic performance significantly predicted higher perceived stress, whereas resilience and emotional stability were protective.
ConclusionUniversity students’ perceived stress is closely associated with modifiable factors, including academic workload, social support, resilience, and physical activity. Targeted interventions, such as resilience training, promoting physical activity, optimizing academic schedules, and strengthening support services, are vital to reducing perceived stress and enhancing student well-being.
by José Manuel García-Moreno, Tyler Adams, Amber Beynon, Janine Vlaar Olthuis, Stephan U. Dombrowski, Richelle Witherspoon, Niels Wedderkopp, Jeffrey J. Hébert
BackgroundRehabilitation and behavior change interventions are commonly used after lumbar surgery to improve recovery, but their effects on physical capacity and physical activity remain unclear. This study aimed to investigate the effectiveness of rehabilitation and behavior change interventions on physical capacity and physical activity behavior in patients following lumbar surgery for degenerative disease.
MethodsEMBASE, MEDLINE, PsycINFO, and CENTRAL were searched from inception to September 2025 and reference lists were hand-searched. Randomized controlled trials assessing rehabilitation or behavior change interventions on physical capacity or physical activity behavior in adults with lumbar degenerative disc disease who underwent lumbar surgery were included. Review author pairs independently extracted data and assessed included studies. Risk of bias was assessed with the Cochrane tool, and study quality with the Grading of Recommendations Assessment, Development and Evaluation classification. Results were pooled using random-effects models and reported as standardized mean differences (SMD) with 95% confidence intervals (CI).
ResultsExercise was more effective than minimal or usual care in improving trunk extension endurance in the immediate term (SMD, 1.54; 95% CI, 0.93–2.16). Supervised exercise outperformed self-directed exercise in improving trunk extension endurance in the immediate term (SMD, 1.28; 95% CI, 0.75–1.81). Psychologically informed rehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.26; 95% CI, 0.02–0.49), but not in the immediate term (SMD, 0.17; 95% CI, −0.14 to 0.49). Physical activity advice did not increase physical activity levels compared to minimal or usual care in the immediate term (SMD, 0.21; 95% CI, −0.13 to 0.55). Prehabilitation was more effective than minimal or usual care in increasing physical activity levels in the intermediate term (SMD, 0.28; 95% CI, 0.03–0.53). Certainty of evidence ranged from low to moderate.
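The pooled SMDs above come from random-effects meta-analysis. As a minimal sketch only, and assuming the common DerSimonian–Laird estimator (the review does not specify which random-effects estimator was used), the per-study effects and standard errors below are illustrative, not the review's data:

```python
import math

def dl_pool(effects, ses):
    """DerSimonian-Laird random-effects pooling of per-study SMDs."""
    w = [1 / se**2 for se in ses]                              # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)              # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]                  # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical SMDs and standard errors from three trials:
smd, ci = dl_pool([1.2, 1.6, 1.4], [0.30, 0.25, 0.35])
```

The pooled SMD is a precision-weighted average that inflates each study's variance by the estimated between-study heterogeneity, which is why random-effects intervals are wider than fixed-effect ones when studies disagree.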
ConclusionsFor adults with lumbar degenerative disease who underwent lumbar surgery, exercise, especially supervised programs, improved trunk extension endurance in the immediate term. Psychologically informed rehabilitation and prehabilitation increased physical activity levels in the intermediate term, while physical activity advice showed no benefit. Findings are limited by low certainty of evidence and high risk of bias.
by Emily Tufano, Kondaiah Palsa, Rebecka O. Serpa, Timothy B. Helmuth, Gabriela Remit-Berthet, Sara Mills-Huffnagle, Mathias Kant, Aurosman Sahu, James R. Connor
Iron is essential for normal physiological function, yet dysregulation of iron metabolism is increasingly recognized as a hallmark of cancers such as glioblastoma (GBM). Recent clinical evidence suggests that systemic iron deficiency anemia (IDA) negatively impacts GBM outcomes in a sex-dependent manner, but the mechanisms linking systemic iron availability to tumor iron metabolism remain poorly understood. Here, we interrogate the impact of systemic iron through dietary modulation (control, iron deficiency (ID), and high iron diets), stratified by sex, on tumor iron handling and GBM outcomes utilizing an immunocompetent (C57BL/6) GBM (GL261) mouse model. Subsequently, we analyzed clinical samples to evaluate translational value. In the preclinical study, we show that iron deficiency decreased survival in males but conferred a slight survival advantage in females, consistent with prior clinical trends. Among circulating iron markers, only ferritin light chain (FTL), but not ferritin heavy chain (FTH) or serum iron, positively correlated with survival in males but not females. In the brain, contralateral iron levels reflected dietary iron status in males but not females, further supporting sex-dependent regulation of local and circulating iron. Notably, tumor iron content remained unchanged in males but was significantly elevated in ID female tumors, complemented by increased transferrin receptor (TfR1) and FTH expression. In clinical GBM samples, we observed non-statistically significant but similar survival trends across varying iron and ferritin levels, suggesting potential translational relevance of our exploratory model. These findings demonstrate that systemic iron availability exerts a sex-specific effect on tumor iron handling, highlighting a critical relationship between systemic and tumor iron regulation in GBM.
by Edidiong Orok, Oluwaseun Olumoko, Inimuvie Ekada, Amos Oladunni
Inappropriate use of antimalarial medications can accelerate the development of antimicrobial resistance (AMR), undermining treatment efficacy and public health goals. Artemether-lumefantrine (A/L) is the first-line treatment for uncomplicated malaria in Nigeria, yet its misuse persists, particularly among young adults. This study assessed knowledge gaps in A/L use among university students in Southwestern Nigeria to identify opportunities for targeted intervention. A cross-sectional online survey was conducted among undergraduate students from three universities in Southwestern Nigeria. Respondents’ knowledge of A/L was categorized as good (≥70%), fair (50–69%), or poor (