FreshRSS

Yesterday — March 4, 2026 — PLOS ONE Medicine&Health

COVID-19 knowledge, attitudes, and practices among people vulnerable to HIV in Uganda: A cross-sectional cohort analysis

by Job Kasule, Julius L. Tonzel, Natalie Burns, Tyler Hamby, Roger Ying, Grace Mirembe, Immaculate Nakabuye, Hannah Kibuuka, Margaret Yacovone, Betty Mwesigwa, Trevor A. Crowell, for the Multinational Observational Cohort of HIV and other Infections (MOCHI) Study Group

Background

People with behavioral vulnerability to HIV face barriers to healthcare engagement that may impede uptake of non-pharmaceutical and other interventions to prevent COVID-19. Understanding COVID-19 knowledge, attitudes, and practices in this population can inform disease prevention efforts during future pandemics.

Materials and methods

From October 2022 to September 2024, we enrolled participants aged 14–55 years without HIV who endorsed recent sexually transmitted infection, injection drug use, transactional sex, condomless sex, and/or anal sex with male partners. At enrollment, we collected socio-behavioral data, including assessments of COVID-19 knowledge, attitudes, and practices. Robust Poisson regression with purposeful variable selection was used to estimate prevalence ratios with 95% confidence intervals for factors associated with COVID-19 preventive practices.
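As a sketch of the prevalence-ratio arithmetic underlying this approach (the study fits adjusted robust Poisson models; the function below is the simple unadjusted two-group analogue, and the counts in the usage example are hypothetical):

```python
import math

def prevalence_ratio(a, n1, c, n0):
    """Unadjusted prevalence ratio with a 95% confidence interval.
    a of n1 exposed participants and c of n0 unexposed participants
    report the outcome. The log-PR standard error uses the usual
    delta-method formula for binomial proportions."""
    pr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts: 50/100 exposed vs. 25/100 unexposed report the practice.
pr, lo, hi = prevalence_ratio(50, 100, 25, 100)
print(round(pr, 2), round(lo, 2), round(hi, 2))  # PR = 2.0
```

Robust (sandwich-variance) Poisson regression generalizes this to covariate-adjusted prevalence ratios while avoiding the convergence problems that log-binomial models often have.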

Results

Among 418 participants, 228 (56.9%) were female, the median age was 21 years (interquartile range 19−24), and 362 (84.9%) reported sex work. Knowledge about SARS-CoV-2 transmission routes was high (95.4%) but lower for the consequences of genetic variants (48.5%−69.7%) and the possibility of asymptomatic infection or transmission (66.7%−80.8%). Handwashing was practiced by 90.8% of participants in the preceding month, whereas mask-wearing (76.5%), avoiding symptomatic people (73.7%), and any history of COVID-19 vaccination (46.9%) were less prevalent. Males were more likely to report avoiding symptomatic people (adjusted prevalence ratio 1.16 [95% confidence interval 1.03–1.31]) and COVID-19 vaccination (1.30 [1.05–1.60]). Enrollment during the BQ.1/BQ.1.1 Omicron wave was associated with less mask-wearing (0.81 [0.67–0.99]) but more vaccination (1.59 [1.29–1.95]).

Discussion

We observed variable COVID-19 knowledge and attitudes among Ugandan adolescents and adults with little impact on COVID-19 preventive practices. Efforts to address suboptimal uptake of disease preventive practices during this and future disease outbreaks will require more than just improving knowledge.

Simulation of phased alerting of community first responders for cardiac arrest

by Pieter L. van den Berg, Shane G. Henderson, Hemeng Li, Bridget Dicker, Caroline J. Jagtenberg

Background

Community First Responders (CFRs) are commonly used for out-of-hospital cardiac arrests, and advanced systems send so-called phased alerts: notifications with built-in time delays. The policy that defines these delays affects both response times and volunteer fatigue.

Methods

We compare alert policies using Monte Carlo simulation, estimating patient survival, coverage, number of alerts, and redundant CFR arrivals. In the simulation, acceptance probabilities and response delays are bootstrapped from 29,307 rows of historical data covering all GoodSAM alerts in New Zealand between 1 December 2017 and 30 November 2020. We simulate distances between the patient and CFRs by assuming that CFRs are located uniformly at random in a 1-km circle around the patient, for a range of CFR densities. Our simulated CFRs travel at a distance-dependent speed estimated by linear regression on observed speeds among the responders in the above-mentioned data set who eventually reached the patient.
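A minimal Monte Carlo sketch of one such policy evaluation; the acceptance probability and travel speed are hypothetical constants here, whereas the study bootstraps acceptance and delays from GoodSAM data and uses a distance-dependent speed:

```python
import numpy as np

def mean_first_arrival(n_volunteers, n_alerted, p_accept=0.4,
                       speed_kmh=12.0, n_sims=10_000, seed=0):
    """Estimate the mean first-CFR-arrival time (minutes) when alerting
    n_alerted of n_volunteers placed uniformly at random in a 1-km
    circle around the patient."""
    rng = np.random.default_rng(seed)
    times = []
    for _ in range(n_sims):
        # Uniform points in a disc: distance r has density 2r, so r = sqrt(U).
        r_km = np.sqrt(rng.random(n_volunteers))
        alerted = r_km[:n_alerted]
        accepted = alerted[rng.random(n_alerted) < p_accept]
        if accepted.size == 0:
            continue  # no CFR responds; ambulance-only case
        times.append(accepted.min() / speed_kmh * 60.0)
    return float(np.mean(times))

# Alerting more volunteers shortens the expected first arrival,
# at the cost of more alerts and redundant arrivals.
print(mean_first_arrival(30, 3), mean_first_arrival(30, 7))
```

Phased alerting adds time offsets before each volunteer's notification; the trade-off the paper quantifies is between this first-arrival time and the alert/redundancy burden on volunteers.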

Results

The alerting policy has a large impact on the four metrics above, and the best choice depends on volunteer density. For each volunteer density, we identify a policy that improves on GoodSAM New Zealand’s current policy on all four metrics. For example, when there are 30 volunteers within 1 km of the patient, sending alerts to 7 volunteers and replacing each volunteer who rejects with a new one is expected to save 10 additional lives per year compared with the current policy, without increasing volunteer fatigue. Our results also shed light on policies that would improve one metric while worsening another: for example, when there are 10 volunteers within 1 km of the patient, dispatching them all immediately increases our survival estimate by 11% compared with the current policy, with the downside of also increasing redundant arrivals by 137%.

Conclusions

Monte Carlo simulation can help CFR system managers identify a good policy before implementing it in practice. We recommend balancing survival and volunteer fatigue, aiming to ultimately further improve a CFR system’s effectiveness.

Protocol for a systematic review and meta-analysis of pharmacological and non-pharmacological interventions for chronic pain management in chronic kidney disease

by Chi Peng Chan, Babaniji Omosule, Courtney Lightfoot, Ellesha A. Smith, Ffion Curtis, James O. Burton, Paul Gardner, Sarah Jasat, Sherna F. Adenwalla, Jyoti Baharani, Daniel S. March

Background

Chronic pain affects up to 60% of people with chronic kidney disease (CKD), yet remains under-recognised and under-treated. Pain management in this population is complicated by altered drug pharmacokinetics, polypharmacy, and the potential nephrotoxicity of conventional analgesics. Despite the high prevalence and significant impact on quality of life, evidence-based guidance specific to pain management in CKD remains limited.

Objectives

This systematic review aims to evaluate the effectiveness and safety of both pharmacological and non-pharmacological interventions in reducing chronic pain intensity among people with CKD on dialysis, not on dialysis, and kidney transplant recipients, across all stages of CKD.

Methods

The primary outcome is the effectiveness of interventions in reducing chronic pain intensity as assessed by pain assessment tools. We will conduct a comprehensive search of MEDLINE, Embase, CINAHL, Web of Science, and ClinicalTrials.gov from their inception to the present to identify studies of chronic pain management interventions in people living with CKD. Study screening will be conducted independently by two reviewers. One reviewer will extract data from each study, with a second reviewer cross-checking for accuracy and completeness. Data will be extracted on study characteristics, participant demographics, intervention components, pain outcomes, and adverse events. The certainty of evidence will be evaluated independently by two reviewers using the GRADE approach. Where applicable, data will be combined in meta-analyses using random-effects models. Additionally, a network meta-analysis will be performed if enough studies are available.
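The random-effects pooling step can be sketched with the DerSimonian-Laird estimator, a common choice for such models (illustrative only; the protocol does not specify which between-study variance estimator will be used):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes under a DerSimonian-Laird
    random-effects model; returns (pooled effect, SE, tau^2)."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                          # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)   # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    w_re = 1.0 / (v + tau2)              # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    return pooled, np.sqrt(1.0 / np.sum(w_re)), tau2

# Three hypothetical studies reporting standardized mean pain reductions.
pooled, se, tau2 = dersimonian_laird([-0.4, -0.6, -0.3], [0.04, 0.09, 0.05])
print(round(pooled, 3), round(tau2, 3))
```

When heterogeneity is absent (Q below its degrees of freedom), tau² is truncated to zero and the estimate collapses to the fixed-effect result.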

Expected results

This review will synthesise the current evidence for pain management strategies in CKD by evaluating the effectiveness of interventions among people receiving different renal replacement therapy modalities with varying pain and disease phenotypes. Findings will highlight the comparative effectiveness of various interventions while considering their safety profiles specific to the CKD context. The review will identify gaps in the literature and provide recommendations for clinical practice and future research.

Significance

This review seeks to deliver a thorough evaluation of pain management strategies for people living with CKD. The systematic review is supported by the UK Kidney Association (UKKA), and its findings will inform the upcoming UKKA guideline on symptom management in people with CKD, which also covers other symptoms including itch, fatigue, and gastrointestinal symptoms. This review will aid clinicians in making well-informed decisions regarding pain management strategies, ensuring a balance between effectiveness and the specific risks associated with CKD.

Computational frameworks for automated detection and quantification of paroxysmal sympathetic hyperactivity among traumatic brain injury patients

by Xiangxiang Kong, Lujie Karen Chen, Sancharee Hom Chowdhurry, Ryan B. Felix, Shiming Yang, Peter Hu, Neeraj Badjatia, Jamie Erin Podell

Paroxysmal sympathetic hyperactivity (PSH) is a syndrome that occurs in a large subset of critically ill traumatic brain injury (TBI) patients and is associated with complications and poor recovery. PSH is defined by recurrent episodic vital sign elevations in the appropriate clinical context. However, standard diagnostic criteria rely heavily on subjective judgment, leading to challenges and delays in recognition, monitoring, and management. The objective of this study was to develop automated PSH detection and quantification tools that exclusively utilize objective bedside continuous vital sign data. Using a cohort of 221 critically ill acute TBI patients with at least 14 days of continuous physiologic data (of which 107 were clinically diagnosed with PSH), we developed a high-resolution clinical feature scale based on established PSH-Assessment Measure criteria and two artificial intelligence-based episode detection models, an expert system approach and a machine learning approach, using a clinician-annotated case example as ground truth. For the episode detection methods, PSH was quantified as the number, duration, and overall temporal burden of detected episodes. To evaluate performance, we compared quantifications across PSH cases and controls and explored precision and recall. All three methods demonstrated initial face validity in delineating PSH cases from non-PSH TBI controls. Future optimization and implementation of the described computational frameworks with real-time patient data could improve the standard monitoring and management of this challenging clinical syndrome.

“The system is a bit broken…” a qualitative exploration of barriers in the pathway for diagnosing Developmental Coordination Disorder

by Lucy H. Eddy, Nat K. Merrick, Cara E. Staniforth, Jade L. Jukes, Liam J. B. Hill, Mark Mon-Williams, Farid Bardid, Rebecca Murray

Background

Approximately 5% of children are affected by a neurodevelopmental disorder of their sensorimotor skills. DSM-5 and ICD-10, the two most widely used diagnostic systems, define this diagnostically as ‘Developmental Coordination Disorder’ (DCD) or ‘Specific Developmental Disorder of Motor Function’ (SDDMF), respectively. A diagnosis of DCD has been found to have a detrimental impact on a range of outcomes (e.g., health and education). It is therefore crucial that these children receive timely intervention. This, however, relies on effective assessment and support pathways. Research has shown considerable parental dissatisfaction with these pathways, but limited research has explored clinical and educational perspectives. This study therefore aimed to understand barriers and facilitators for clinical and education practitioners in the pathway in a diverse UK district (Bradford).

Methods

Semi-structured interviews were completed with stakeholders across the pathway to identify barriers and facilitators to assessing, diagnosing, and supporting children with sensorimotor skill difficulties. Theoretical thematic analysis aligned to the Capability, Opportunity, Motivation model of Behaviour change (COM-B) was used to analyse the qualitative data.

Results

Interviews revealed many barriers in the DCD pathway related to capability (confusing terminology, inconsistent knowledge, inappropriate referrals), opportunity (resource constraints, DCD being considered low priority, and disconnected services), and motivation (overlapping job roles, a desire to consider those with difficulties not eligible for a diagnosis). No facilitators were consistently identified across interviews.

Conclusion

Families face multiple barriers to obtaining a diagnosis for their child through existing clinical pathways for assessment and support. These findings are unlikely to be unique to Bradford, due to international research highlighting these issues via parental interviews. These findings therefore may reflect challenges both nationally and internationally within DCD pathways. There is an urgent need for: (i) clear communication across different services (with consistency in terminology), and (ii) a more collaborative and integrated approach to assessment, diagnosis, and support in order to help these children thrive.

Mental health help-seeking intentions among health workers in the east coast of peninsular Malaysia: Perceived barriers and predictive factors

by Muhammad Syafiq Kunyahamu, Aziah Daud, Ijlal Syamim Mohd Basri, Tengku Alina Tengku Ismail, Mohd Faiz Md Tahir

Introduction

Mental health problems among health workers are a growing concern globally, including in Malaysia. Despite the availability of mental health services, some health workers do not seek professional help. This study aims to determine the level of health workers’ intention to seek professional help, examine the barriers they perceive, and identify predictors of mental health help-seeking intention.

Methods

This cross-sectional study involved 470 health workers in the East Coast region of Peninsular Malaysia. Data were collected using a self-administered questionnaire. Linear regression analysis was employed to identify the predictors of professional help-seeking intention.

Results

The mean score for mental health help-seeking intention was 4.90 (SD = 1.03). Perceived need for help positively predicted help-seeking intention (B = 0.532, p < …).

Conclusions

This study highlights the roles of the perceived need for help and perceived stigma barriers in predicting health workers’ help-seeking intentions, offering a basis for targeted interventions and policies to enhance mental health support within Malaysian healthcare settings.

Methodological review of the design, objectives and sample size of Research for Patient Benefit (RfPB) applications that use an external randomised controlled pilot trial design: A protocol

by Claire L. Chan, Saskia Eddy, Jennie Hejdenberg, Ben Morgan, Heather M. Morgan, Gillian Lancaster, Clare Robinson, Sandra M. Eldridge

Background

The National Institute for Health and Care Research accepts applications for pilot and feasibility studies to its Research for Patient Benefit (RfPB) programme. There has been limited work describing the design characteristics and funding status of these applications. Knowing which qualities may contribute to a pilot or feasibility study application successfully gaining funding could help researchers improve the quality of their applications. Therefore, this study describes the protocol for a review of the characteristics of funded and non-funded external pilot trial applications. In particular, the primary objective is to describe the planned sample sizes and sample size justifications.

Methods

The study will be conducted on 100 applications with a randomised feasibility design from Competitions 31–37, identified by RfPB and made accessible to us where the lead applicant has consented. We will screen these applications to identify the external pilot trials, first by title and then by full text. Following this, we will extract data on medical area, study design, objective(s), sample size, sample size justification, and funding outcome at stages one and two. Validation will be performed on 20% of the extracted data; discrepancies will be resolved by discussion, with a third reviewer deciding if there is no consensus. We will use descriptive statistics to summarise quantitative data, and will analyse qualitative data using thematic analysis. Findings will be summarised through discussion with the project contributors to produce a reader-friendly guidance document.

Discussion

This work will provide a more complete picture of RfPB external randomised pilot and feasibility trials. The findings will assist researchers when planning their pilot trials, and could help improve the quality of submitted applications.

Protocol Registration

Open Science Framework protocol registration DOI: https://doi.org/10.17605/OSF.IO/PYKVG.

Comprehensive post-marketing safety evaluation of atezolizumab: A disproportionality analysis based on individual case safety reports in the FAERS

by Yu Cui, Yuxuan Gao, Na Meng, Xiaojuan Li, Na Zhao, Lili Yu

Atezolizumab is a widely used immune checkpoint inhibitor (ICI) for cancer treatment, and post-marketing safety surveillance is important. This study aims to provide a reference for the safe and rational use of the drug in clinical practice by mining and analyzing adverse event (AE) signals for atezolizumab on the basis of the FDA Adverse Event Reporting System (FAERS). This research extracted AE reports from the second quarter (Q2) of 2016 to Q2 of 2024 from the FAERS. AEs were standardized and classified on the basis of the System Organ Class (SOC) and Preferred Term (PT) from the Medical Dictionary for Regulatory Activities (MedDRA) version 23.0. This study utilized disproportionality analysis (DPA) for signal mining and analysis, including the reporting odds ratio (ROR) method, the Medicines and Healthcare products Regulatory Agency (MHRA) method, and the Bayesian confidence propagation neural network (BCPNN) method. We obtained a total of 3,124 AE signals and identified 640 PTs and 21 SOCs for atezolizumab. The highest signal intensity was systemic immune activation (n = 15, ROR = 449.20, PRR = 449.07, IC = 8.06), and the most frequently reported AEs were death, pyrexia, infectious pneumonia, anaemia, and febrile neutropenia. The top 100 PTs in terms of signal intensity involved a total of 16 SOCs, including those associated with endocrine disorders; respiratory, thoracic and mediastinal disorders; and renal and urinary disorders. This study revealed that AEs in the endocrine, respiratory and urinary systems need to be monitored in clinical practice.
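The ROR component of the disproportionality analysis reduces to a 2×2 contingency table per drug-event pair; a sketch with illustrative counts (not FAERS data):

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR with a 95% CI from a 2x2 table:
       a: reports of the target AE for the target drug
       b: reports of other AEs for the target drug
       c: reports of the target AE for all other drugs
       d: reports of other AEs for all other drugs"""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Illustrative counts: 20 target-AE reports among 100 for the drug,
# 10 among 900 for the comparator background.
ror, lo, hi = reporting_odds_ratio(20, 80, 10, 890)
print(round(ror, 2))  # 22.25
```

A signal is typically flagged when the lower CI bound exceeds 1 and a minimum report count is met; the MHRA (PRR-based) and BCPNN (IC-based) methods apply analogous thresholds to the same table.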
The day before yesterday — PLOS ONE Medicine&Health

Response of mid-lactation primiparous Holstein cows to the supplementation of rumen-protected methionine during the summer

by Caio R. Monteiro, Victor Augusto de Oliveira, Rabeche Schmith, João Pedro A. Rezende, Tales L. Resende, João A. Negrão, Marina A. C. Danés

This study aimed to evaluate the effects of rumen-protected methionine (RPM) supplementation on productive and physiological responses of primiparous Holstein cows during summer. We hypothesized that RPM supplementation would maintain or improve milk yield and composition due to beneficial physiological, redox, and inflammatory responses in cows exposed to summer heat. The trial was conducted in a randomized block design over nine weeks in Brazil using 80 primiparous cows (182 ± 64 DIM; 42.9 ± 4.7 kg/d milk). Cows were blocked by milk yield and DIM and assigned to a control diet (CON; no added RPM) or the same diet supplemented with RPM (Mepron®, Evonik) at 0.75 g/kg diet dry matter, targeting 20 g/cow/day for the average cow (the product contains 62% metabolizable methionine). Milk yield and composition, vaginal temperature, and respiratory rate were measured, and plasma samples were collected, in weeks 3, 6, and 9. Data were analyzed using mixed models including treatment, week, and their interaction as fixed effects, and block and cow as random effects. Cows were maintained under naturally occurring summer conditions. Environmental monitoring during weeks 3, 6, and 9 indicated elevated temperature–humidity index (THI) values, with values remaining above the heat-stress threshold (THI > 68) for 68.3% of the monitored hours (mean THI = 70.6; range 61.0–84.4). Overall (least squares mean across weeks 3, 6, and 9), RPM increased milk yield by 2.0 kg/d (44.9 vs. 42.9 kg/d), protein yield by 50 g/d (1,464 vs. 1,414 g/d), lactose yield by 108 g/d (2,109 vs. 2,001 g/d), and total solids yield by 176 g/d (5,331 vs. 5,155 g/d). Lactose concentration was lower in RPM (4.71 vs. 4.76%). Fat yield was unaffected, but a treatment × week interaction was observed for fat content. Milk fatty acid (FA) profile was unchanged, although treatment × week interactions were observed for individual fatty acids (C16:0, C18:0, C18:1, and preformed FA).
Plasma glucose was lower, and insulin was higher in RPM than in CON cows (39.3 vs. 43.2 mg/dL and 0.52 vs. 0.35 ng/mL, respectively). Antioxidant capacity improved, with RPM cows having greater ferric reducing antioxidant power (32.9 vs. 28.5 µM) and lower malondialdehyde (2.48 vs. 2.78 nmol/mL). Other biochemical, inflammatory, and immune markers were unaffected. Respiratory rate was slightly higher in RPM than in CON cows (55 vs. 50 breaths/min). Mean vaginal temperature did not differ between treatments; however, a treatment × time × hour interaction was observed. Supplementation with RPM improved milk and solids yield, and enhanced antioxidant capacity and insulin levels, supporting its use to improve metabolic resilience under warm conditions.
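For reference, the heat-stress threshold cited (THI > 68) is computed from ambient temperature and relative humidity; one commonly used formulation (the abstract does not state which THI equation the authors applied) is:

```python
def thi(temp_c, rh_pct):
    """Temperature-humidity index from air temperature (deg C) and
    relative humidity (%), in a widely used NRC-style formulation."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rh_pct) * (1.8 * temp_c - 26)

# At 25 deg C and 50% RH the index already exceeds the heat-stress
# threshold of 68 used in the study.
print(round(thi(25, 50), 2))
```

Intuitively, the first term is the temperature in Fahrenheit and the second term discounts it less as humidity rises, since humid air impairs evaporative cooling.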

Identification and validation of genes encoding humoral specificity for the chemical allergen toluene diisocyanate

by Adam V. Wisnewski, Jian Liu

A panel of hybridomas specific for different isomers of toluene diisocyanate (TDI), a cross‑linking chemical used in polyurethane production, has been previously described. These hybridomas were originally developed by researchers at the US National Institute for Occupational Safety and Health (NIOSH). We sought to determine the DNA sequences encoding these TDI-specific monoclonal antibodies, enabling identification of the germline gene rearrangements resulting in chemical specificity as well as recombinant production of the mAbs. B cell receptor sequencing (BCR-seq) of hybridoma RNA readily identified productive heavy and light chain antibody sequences. The productive light chains of all 7 hybridomas showed strong identity with different genomic variable (V) and joining (J) region sequences, with few changes from germline configuration. However, the productive heavy chains contained more substantial changes in their genomic V and J-region sequences, consistent with antigen-driven affinity maturation, as well as N- and P-nucleotide additions comprising their complementarity-determining region 3 (CDR3). The hybridoma-defined TDI-specific mAbs were subsequently produced recombinantly in a human embryonic kidney cell line expression system, purified, and tested for their binding capacity against albumin derivatized with TDI, related diisocyanates, and a control antigen. The recombinant versions of the TDI-specific mAbs demonstrated binding capacity for different isomers (2,4 and 2,6) of TDI consistent with that previously reported for the hybridoma-secreted clones: one specific for 2,4-TDI, one specific for 2,6-TDI, three that bind both 2,4- and 2,6-TDI, and two that show cross-reactivity with 4,4′‑methylene diphenyl diisocyanate (MDI). None of the recombinant mAbs bound to aliphatic hexamethylene diisocyanate (HDI), its oligomer, or the control antigen.
Additional recombinant versions of the TDI mAbs, with identical V-regions, but different C-regions, demonstrated the dependence of antigen specificity on the V-region, but also highlighted the potential for C-region sequence to affect their detection in ELISA assays. The DNA sequences defined herein may be useful to other investigators wishing to generate recombinant TDI-specific mAbs as detection reagents for research or as standards for clinical serology tests.

Cultural and linguistic responsiveness in long-term care: A scoping review protocol on programs for residents and staff

by Wenting Yan, Carmel L. Montgomery, Liz Dennett, Stephanie A. Chamberlain

Background

The demographic landscape of Western countries has become more diverse. Alongside population aging, linguistic diversity within the aging population of these countries has increased. As people age and their care needs grow, they may not receive optimal care if they do not speak the same language as their caregivers in long-term care (LTC) facilities. Culturally and linguistically responsive (CLR) long-term care services are important to ensure the best care for an aging population, but there is limited evidence in the literature on the scope and practice of these services. The objective of this scoping review is to map the types of CLR programs in LTC settings and examine their core components and target populations.

Methods

The Arksey and O’Malley framework, further developed by Levac and colleagues, will be employed in this scoping review. The research question was framed using the PCC framework. A comprehensive systematic search was developed with an experienced librarian and will be conducted in Scopus, CINAHL, Embase, Medline, PsycINFO, and Academic Search Complete. All primary study designs, including quantitative, qualitative, and mixed methods, will be included. Studies must focus on culturally and linguistically responsive care programs used or implemented in long-term care services. There will be no date or language limitations. Findings will be thematically synthesized to answer the research question.

Conclusion

This protocol provides a transparent account of how the review will be conducted. We aim to contribute to a better understanding of what culturally and linguistically responsive care programs exist, how cultural and linguistic responsiveness is currently addressed across diverse care environments, and what gaps remain in long-term care.

Exploring the drivers of price variation in orthopaedic radical bone tumor resection: A nationwide database study

by Devika A. Shenoy, William C. Cruz, Shamik Bhat, Katelyn Parsons, Aaron D. Therien, Kevin A. Wu, Christian A. Pean, William C. Eward

Background

Radical resection of bone tumors is a clinically effective but costly procedure. Despite the implementation of federal price transparency mandates, little is known about the nationwide variation in negotiated prices for these specialized oncologic surgeries. This study aimed to quantify the variation in negotiated rates for radical resection of the humerus and femur/knee and identify associated hospital, payor, and state-policy drivers.

Methods

This cross-sectional study analyzed hospital-negotiated payor rates from the Turquoise Health database for current procedural terminology (CPT) codes 24150 (humerus resection) and 27365 (femur/knee resection). Multivariate linear regression was used to determine the associations between hospital size and type, payor class, and state-level policies (Medicaid expansion, Certificate of Need [CoN] laws, All-Payer Claims Database [APCD] mandates, and Nurse Practitioner [NP] scope of practice) on negotiated payor rates.

Results

A total of 285,857 negotiated rates were analyzed. Significant price variation was observed across all factors. Large hospitals (>1000 beds) and Critical Access Hospitals (for femur/knee resection only) had significantly higher rates. CoN laws were associated with higher prices for both procedures (+$348.25 and +$667.98, respectively), as were APCD mandates for femur/knee resections (+$1231.24). Medicare Advantage plans paid inconsistently compared to commercial plans, paying more for humerus but substantially less for femur/knee resections.

Discussion

Negotiated prices for radical bone tumor resection are highly variable and influenced by a complex interplay of market dynamics, challenging the assumption that price transparency alone can standardize healthcare costs for specialized care.

Spatial heterogeneity and spatially varying determinants of childhood stunting in Northern Rwanda: A cross-sectional study to inform targeted interventions

by Clarisse Kagoyire, Albert Ndagijimana, Gilbert Nduwayezu, Jean Nepo Utumatwishima, Jean Pierre Mpatswenumugabo, Marie Anne Mukasafari, Diane Rinda, Vedaste Ndahindwa, Kristina Elfving, Gunilla Krantz, Torbjörn Lind, Ali Mansourian, Renée Båge, Ewa Wredle, Elias Nyandwi, Aline Umubyeyi, Jean Baptiste Ndahetuye, Petter Pilesjö

Despite national progress, stunting remains prevalent in specific regions of Rwanda, highlighting the limitations of coarse-resolution data for effective mapping and intervention planning. This study explored the optimal spatial resolution and analytical approach for capturing localised dynamics and the multifactorial nature of stunting. A cross-sectional, population-based study was conducted in the Northern Province of Rwanda, focusing on children aged 1–36 months. Data were collected using structured questionnaires covering socio-demographic, economic, health, childcare, and livestock factors, as well as anthropometric measurements. Environmental characteristics were obtained from national datasets, while household geographic coordinates were captured using a customized mobile geodata platform (emGeo). After data cleaning, predictors were analysed using univariable and multivariable logistic regression as well as geographically weighted logistic regression (GWLR) to account for spatial heterogeneity. Among 601 children, stunting prevalence was 27% (boys 33.8%; girls 20.9%). GWLR improved model fit, increasing adjusted deviance explained from 34% to 39%. Significant predictors included child age (adjusted OR = 2.46; 95% CI: 1.78–3.39), male sex (OR = 2.83; 95% CI: 1.65–4.86), birthweight (OR = 0.71; 95% CI: 0.54–0.94), maternal autonomy (ability to refuse sexual intercourse; OR = 0.48; 95% CI: 0.27–0.86), inconsistent maternal social support (OR = 2.30; 95% CI: 1.20–4.42), household electricity access (OR = 0.48; 95% CI: 0.27–0.84) and handwashing facilities (OR = 0.21; 95% CI: 0.07–0.67). GWLR revealed substantial spatial heterogeneity in these factors, delineating areas where each factor matters most. This household-level, spatially explicit analysis reveals localised risk patterns often masked by aggregated national data. Prioritising context-specific interventions (such as electrification, hygiene promotion, and enhanced maternal social support) can enhance effectiveness.
The proposed analytical workflow provides a model for addressing persistent stunting in other resource-limited settings.

Evaluation of usability and acceptability of a Peruvian telemental health service for early assessment among vulnerable occupational workers: Mixed-method study with a user-centered design approach

by Jimmy Andreyvan Cainamarks-Alejandro, Liliana Cruz-Ausejo, Miguel Angel Burgos-Flores, Jaime Rosales-Rimache, Jonh Astete-Cornejo, David Villarreal-Zegarra

Background

The COVID-19 pandemic marked an increase in depressive, anxiety, and post-traumatic stress disorder symptoms, particularly among healthcare workers, teachers, and police officers. These workers faced external and occupational factors with a significant impact on mental health, including a significant increase in workload, direct exposure to the virus, shortages of personal protective equipment, and instances of abuse, including discrimination. Mental health care in primary care requires a process of early identification and timely referral of complex cases. Telehealth emerges as an effective alternative for addressing challenges in mental health care, although its implementation encounters obstacles.

Objective

To design a telehealth service that facilitates screening, initial management, and timely referral for mental health diagnoses in workers with prior SARS-CoV-2 infection, and to evaluate usability, acceptability, and user satisfaction.

Methods

Mixed-method study with a user-centered design approach involving key external and internal service users in three sequential stages (pre-design, co-design, and post-design). The study phases lasted 6 months, involving a total of 23 participants in the pre-design phase (contextual inquiry and preparation and training), 12 participants in the co-design phase (framing the issue, generative design, and sharing design), and in the post-design phase, 4 participants were involved in service implementation, and 81 participants—drawn from the subgroup of 134 users who received psychoeducation—were included in the efficacy assessment.

Results

The proposal included the development and evaluation of a service model guide and a telehealth software platform. First, the participants took part in a series of workshops (pre-design, co-design) where they provided ideas for meeting the product requirements, based on the Design Thinking methodology framework. The telehealth service model was named TelePsico CENSOPAS. It comprised four processes: a) Service promotion; b) User pre-identification; c) Appointment management; d) Psychoeducation counseling and referral. The telehealth platform was designed through three cycles of an iterative process and integrated a proprietary development platform with third-party service technologies for communication support and information exchange. During post-design, the pilot test involved 698 screened patients; 193 were identified with mental health risks, and 134 of them received psychoeducation sessions. In addition to user acceptance, the usability score of the platform was 86.1 ± 16.9 (SD), and service satisfaction scores were 45.1 ± 7.2 (SD) for satisfaction with care processes and 36.7 ± 5.2 (SD) for satisfaction with psychological care.
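The abstract does not name the usability instrument, but a 0–100 platform score of this kind is typically produced by the System Usability Scale (SUS); purely as an illustration of how such a score is computed under that assumption, a minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (index 0, 2, ...) are positively worded and contribute
    (response - 1); even-numbered items are negatively worded and contribute
    (5 - response). The summed contributions (0-40) are scaled by 2.5 to 0-100.
    Illustrative only: the study's actual instrument is not specified.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(responses)
    )
    return total * 2.5
```

On this scale, scores above roughly 68 are conventionally read as above-average usability, which is consistent with the 86.1 mean reported here.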

Conclusion

The proposal for mental health telehealth services and its supporting platform was successfully developed and accepted by both internal and external users, particularly within well-structured occupational health services in workplaces serving vulnerable occupational groups. In addition, it achieved higher satisfaction and usability scores than Peru’s outpatient care services. These findings support the replicability of user-centered design frameworks—such as design thinking—within the occupational health sphere.

Effects of tacrolimus treatment on the gut microbiota and metabolites in liver transplant recipients

by Guohui Wang, Lu Liu, Hanshu Zhang, Panpan Mao, Saijuan Lu, Xiaofang Zhang, Xingde Li, Cangsang Song

Background

Liver transplantation (LT) is an effective treatment for patients with end-stage liver disease. In recent years, accumulating evidence has supported an association between gut microbiota dysbiosis and the pathogenesis and progression of liver disease.

Methods

The study included 36 patients who received tacrolimus treatment after liver transplantation. Patients were stratified into subgroups according to three key variables: tacrolimus treatment duration, whole-blood tacrolimus concentration, and tacrolimus concentration-to-dose (C/D) ratio. Fecal samples and whole-blood specimens were collected from all participants. The Illumina HiSeq X platform was used to detect the gut metagenome, analyzing the composition and characteristics of the gut microbiota. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) technology was employed to detect metabolites of the gut microbiota, revealing their metabolic profiles.
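The metagenomic analysis characterizes gut microbiota diversity, although the abstract does not state which alpha-diversity metric was used. As one common example, the Shannon index can be computed from per-taxon abundance counts:

```python
import math

def shannon_index(counts):
    """Shannon alpha-diversity, H = -sum(p_i * ln p_i), over taxon counts.

    Higher H means a community that is richer and/or more even. This is an
    illustrative metric choice; the study's exact diversity measure is not
    specified in the abstract.
    """
    total = sum(counts)
    if total == 0:
        raise ValueError("no observations")
    return -sum(
        (c / total) * math.log(c / total)
        for c in counts
        if c > 0  # taxa with zero counts contribute nothing
    )
```

A community of four equally abundant taxa gives H = ln 4 ≈ 1.386, while a single dominant taxon gives H = 0.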

Results

As the duration of tacrolimus use increased, the diversity of the gut microbiota also increased, and the abundance of Escherichia coli_D and Bacteroides stercoris rose, while the abundance of Brunovirus and Uetakevirus tended to decrease. The abundance of gene functions related to chemical carcinogenesis and bacterial invasion of epithelial cells decreased significantly. Among the gut microbiota metabolites, 16 substances, such as Astragaloside A and Acetyl-L-carnitine, increased significantly, while 108 substances, such as Capsaicin and TLK, decreased significantly. Within a certain range, gut microbiota diversity increased with whole-blood tacrolimus concentration, as did the abundance of Phocaeicola and Klebsiella and, among viruses, Peduovirus. However, excessively high concentrations may lead to decreases in gut microbiota diversity and in the abundance of Phocaeicola. With respect to the C/D ratio, increased ratios were linked to significantly higher levels of 57 fecal metabolites (e.g., PC 34:2, 5-Methyl-2’-deoxycytidine), whereas 13 metabolites (e.g., FAHFA 2:0/16:0) showed substantial declines.

Conclusions

Tacrolimus treatment is associated with distinct alterations in gut microbiota and metabolites among LT recipients. These findings provide a preliminary framework for future investigations aimed at optimizing immunosuppressive regimens, although their clinical translational potential requires validation in larger-scale, prospective cohort studies.

Elucidating key targets and mechanisms of diethyl phthalate-induced colorectal cancer through network toxicology and molecular docking

by Zijing Wang, Liyuan Ma, Zhanyuan Sun, Hengyi Lv, Ruxue Ma, Mengqi Ding, Hai Li, Tao Jiang

Background

Diethyl phthalate (DEP), a widely used plasticizer with endocrine-disrupting properties, has raised concerns regarding its potential carcinogenic effects. However, its precise role in colorectal cancer (CRC) development remains poorly understood.

Methods

The chemical structure of DEP was obtained from the PubChem database. Potential targets of DEP were identified through ChEMBL and STITCH databases and intersected with known CRC-related genes to screen for candidate biomarkers. Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analyses were performed to explore the biological functions and signaling pathways involved. Molecular docking was conducted to predict the binding affinities between DEP and core targets. Finally, 200-ns molecular dynamics (MD) simulations using GROMACS were employed to evaluate the binding stability and dynamic behavior of the DEP–target complexes.
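Docking scores like those reported below are free-energy estimates (kcal·mol⁻¹) that can be translated into an implied dissociation constant via ΔG = RT ln K_d. As a rough illustration of what such scores mean, assuming standard temperature (298.15 K):

```python
import math

R_KCAL = 1.987e-3  # gas constant in kcal/(mol*K)

def kd_from_dg(dg_kcal_per_mol, temp_k=298.15):
    """Dissociation constant (in M) implied by a binding free energy.

    Rearranges dG = RT ln(Kd) to Kd = exp(dG / RT). A more negative dG
    means tighter predicted binding (smaller Kd). Illustrative only:
    docking scores are approximate and condition-dependent.
    """
    return math.exp(dg_kcal_per_mol / (R_KCAL * temp_k))
```

Under these assumptions, ΔG = −6.6 kcal·mol⁻¹ corresponds to a K_d on the order of 10 µM, consistent with the "moderate affinity" interpretation in the Results.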

Results

A total of 62 overlapping genes were identified between DEP targets and CRC-associated genes. GO and KEGG enrichment analyses indicated enrichment in epigenetic regulation, chromatin remodeling, and cancer-related signaling pathways, including Notch, TGF-β, and FoxO. Protein–protein interaction analysis identified EP300, EZH2, HDAC1, HDAC2, and KDM1A as key epigenetic regulators. Molecular docking predicted moderate binding affinities between DEP and these targets (−6.6 to −5.7 kcal·mol⁻¹). Subsequent 200-ns MD simulations suggested that DEP formed stable complexes with HDAC1, KDM1A, and EZH2, moderate stability with EP300, and partial dissociation with HDAC2, consistent with hydrophobic and hydrogen-bonding interactions at the binding interfaces.

Conclusion

This study provides a theoretical framework for exploring the molecular mechanisms through which DEP may contribute to CRC development, emphasizing the value of network toxicology in cancer research. These findings may inform future investigations into the risks of DEP exposure and support public health policy and the development of targeted therapeutic strategies.

Association between the endothelial activation and stress index and mortality in critically ill patients with atrial fibrillation in the MIMIC database: A retrospective cohort study

by Peiling Zuo, Huanhuan Zhu, Chunying Sun, Xiaohan Ma, Sheng Chen, Rong Tang, Tong Wu, Ding Zhang, Xiao Tang, Wenquan Lv, Wenzhong Chen, Xiawei Wei, Encun Hou, Minsheng Wu, Minghe Jiang

Background

Evidence indicates that the Endothelial Activation and Stress Index (EASIX) is a predictor of mortality in endothelium-related conditions; however, its association with mortality risk in atrial fibrillation (AF) remains uncertain. Accordingly, this study examines the relationship between EASIX and mortality risk among patients with AF.

Methods

This retrospective analysis utilized data from the Medical Information Mart for Intensive Care IV (MIMIC-IV) database, which includes critically ill patients diagnosed with AF. Kaplan–Meier survival analysis, Cox proportional hazards models, and restricted cubic spline regression were applied to evaluate the relationship between EASIX and all-cause mortality. Subgroup analyses were conducted to explore potential interactions with key patient characteristics, and sensitivity analyses were performed to confirm the robustness of the results.
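EASIX is derived from three routine laboratory values; in the published literature it is computed as lactate dehydrogenase × creatinine / platelets and usually analyzed on a log2 scale. A minimal sketch of that calculation (the exact units and transformation used in this study are assumed, not stated in the abstract):

```python
import math

def easix(ldh_u_l, creatinine_mg_dl, platelets_1e9_l):
    """EASIX = LDH (U/L) x creatinine (mg/dL) / platelets (10^9/L).

    Assumed units follow the common convention in the EASIX literature;
    this study's exact implementation is not given in the abstract.
    """
    if platelets_1e9_l <= 0:
        raise ValueError("platelet count must be positive")
    return ldh_u_l * creatinine_mg_dl / platelets_1e9_l

def log2_easix(ldh_u_l, creatinine_mg_dl, platelets_1e9_l):
    """EASIX on the log2 scale, as it is typically entered into Cox models."""
    return math.log2(easix(ldh_u_l, creatinine_mg_dl, platelets_1e9_l))
```

On the log2 scale, each reported hazard ratio corresponds to a doubling of the raw index.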

Results

A total of 3,193 patients were included in the analysis. Kaplan–Meier survival analysis showed that elevated EASIX levels were associated with a higher risk of both in-hospital and ICU mortality. After adjusting for potential confounders, increased EASIX levels remained significantly associated with in-hospital mortality [HR 1.09 (95% CI 1.03–1.15), P = 0.0002] and ICU mortality [HR 1.10 (95% CI 1.04–1.17), P = 0.0002]. Stratified analyses revealed significant interactions of EASIX with sepsis and with respiratory failure for both in-hospital and ICU mortality. In a sensitivity analysis that additionally adjusted for metoprolol and heparin as covariates, patients in the highest EASIX group continued to show the greatest mortality risk: the HR for in-hospital death was 2.08 (95% CI 1.51–2.85), and the HR for ICU death was 1.83 (95% CI 1.21–2.65).

Conclusion

Elevated EASIX levels correlate with higher mortality, underscoring the index's potential as an accessible tool for identifying high-risk patients and informing clinical decisions. However, further studies are needed to explore the underlying mechanisms and validate its applicability across diverse patient populations.

A live cell biosensor protocol for high-resolution screening of therapy-resistant cancer cells

by Viral D. Oza, Colin S. Williams, Jessica S. Blackburn

The Genetically Encoded Death Indicator (GEDI) is a ratiometric, dual-fluorescence biosensor that enables real-time detection of cell death through calcium influx. Originally developed for use in neurodegeneration models, GEDI can be applied to cancer cells to quantify therapy-induced death at single-cell resolution. This protocol details how to generate GEDI-expressing cancer cell lines, empirically determine stress-induced GEDI thresholds using radiation or chemotherapeutic agents, and perform time-resolved imaging and image analysis to track cell fate. This workflow is optimized for high-throughput drug and radiation screening in heterogeneous populations and is especially useful for identifying chemo- and radio-resistant subclones. Key limitations include the need for empirical GEDI threshold calibration for each treatment condition and careful standardization of imaging parameters. The protocol outputs include GEDI ratio values, single-cell time-of-death annotations, and whole-cell morphological data in parallel, which can be linked to downstream applications such as FACS-based isolation of live or dying subpopulations, transcriptomic profiling of resistant clones, or in vivo validation using xenografts or organotypic slice culture.
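The single-cell analysis step reduces to tracking each cell's ratio of death-reporting to reference fluorescence over time and flagging the first frame at which it crosses the empirically calibrated threshold. A simplified sketch of that per-cell annotation (channel names and the simple first-crossing rule are illustrative assumptions, not the protocol's exact pipeline):

```python
def time_of_death(death_signal, reference_signal, threshold):
    """Return the first frame index where a cell's GEDI ratio crosses threshold.

    death_signal and reference_signal are per-frame fluorescence intensities
    for one tracked cell; the threshold must be calibrated empirically for
    each treatment condition, as the protocol requires. Returns None for
    cells that never cross the threshold (putative resistant survivors).
    """
    for frame, (d, ref) in enumerate(zip(death_signal, reference_signal)):
        if ref > 0 and d / ref >= threshold:
            return frame
    return None
```

Cells returning `None` at the end of imaging are the candidates for downstream isolation and profiling of therapy-resistant subclones.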

Persistence of the hepatic benefits of high-intensity interval training (HIIT) during detraining despite body weight regain in mice

by Renata dos Santos Guarnieri, Guilherme Sá de Oliveira, Kaylaine Marques Ferreira, Aline Penna-de-Carvalho, Vanessa Souza-Mello, Sandra Barbosa-da-Silva

High-intensity interval training (HIIT) is an effective intervention for improving metabolic health and mitigating metabolic dysfunction-associated steatotic liver disease (MASLD). Nonetheless, the stability of these benefits throughout detraining periods and upon weight regain remains inadequately characterized. This study aimed to evaluate whether hepatic improvements induced by HIIT are sustained during detraining, even after body weight regain. Eighty male C57BL/6 mice were fed either a control (10% fat) or a high-fat (HF) diet (50% fat) for 12 weeks. Following this period, the animals were allocated to groups subjected to continuous HIIT or intermittent training cycles (each lasting 3 weeks). The outcomes assessed included body mass (BM), glucose tolerance, lipid profiles, liver enzyme levels (aspartate aminotransferase and alanine aminotransferase), hepatic steatosis, and the expression profiles of genes associated with lipogenesis (Srebf1, Mlxpl, and Fas), β-oxidation (Ppara and Cpt1a), and endoplasmic reticulum (ER) stress (Atf4, Ddit3, and Gadd45). Compared with the sedentary HF-NT condition, continuous HIIT reduced BM and improved glucose tolerance. Intermittent training (HF-TNT, HF-NTN) preserved metabolic benefits and reduced triglyceride and cholesterol levels. Notably, hepatic steatosis was significantly alleviated in all training groups, and this improvement persisted even after detraining. Additionally, HIIT downregulated the expression of lipogenic genes and upregulated the expression of genes involved in β-oxidation. The levels of markers indicating ER stress were attenuated by HIIT, with a sustained reduction during periods of detraining. HIIT-induced metabolic and hepatic improvements persist partially during detraining, despite weight regain. These findings underscore the therapeutic value of continued or periodically repeated physical training in mitigating the adverse effects of an HF diet and preventing the progression of metabolic disorders such as MASLD.

Justifying model complexity: Evaluating transfer learning against classical models for intraoperative nociception monitoring under anesthesia

by Chanseo Lee, Jaihyoung Lee, Kimon-Aristotelis Vogt, Muhammad Munshi

Background

Accurate intraoperative detection of nociceptive events is essential for optimizing analgesic administration and improving postoperative outcomes. Although deep learning approaches promise improved modeling of complex physiologic dynamics, their added computational and operational complexity may not translate into clinically meaningful benefit, particularly in small, high-resolution perioperative datasets.

Methods

We performed a head-to-head evaluation of classical supervised models (L1-regularized logistic regression and 50- and 200-tree Random Forests, with and without drug dosing features) against a Temporal Convolutional Network (TCN) transfer-learning framework for intraoperative nociception detection. Using 101 adult surgical cases with 30 physiologic and 18 drug dosing features sampled in 5-second windows, models were assessed under leave-one-surgery-out cross-validation using AUROC and AUPRC. We further examined probability calibration, multiple ensemble strategies, permutation feature importance, and computational cost in terms of inference operations and memory footprint.
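Leave-one-surgery-out cross-validation is grouped cross-validation in which every fold holds out all windows from one surgical case, so no case contributes to both training and evaluation. A minimal sketch of the split logic (equivalent in spirit to scikit-learn's `LeaveOneGroupOut`; the study's exact implementation is not specified):

```python
def leave_one_surgery_out(case_ids):
    """Yield (train_indices, test_indices) pairs over windowed samples.

    case_ids gives the surgical case each 5-second window belongs to.
    Each fold holds out every window of one case, preventing leakage of
    within-case temporal correlation between training and test sets.
    """
    unique_cases = sorted(set(case_ids))
    for held_out in unique_cases:
        test = [i for i, c in enumerate(case_ids) if c == held_out]
        train = [i for i, c in enumerate(case_ids) if c != held_out]
        yield train, test
```

With 101 cases this yields 101 folds, and per-fold AUROC/AUPRC are then aggregated across held-out cases.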

Results

Drug-aware Random Forests (50 and 200 trees) achieved the highest discrimination (AUROC 0.716; AUPRC 0.399), outperforming the TCN transfer-learning model (AUROC 0.649; AUPRC 0.311). Increasing the TCN's personalization window yielded only inconsistent, modest gains (p > 0.05). Isotonic calibration substantially improved probability calibration but did not affect discrimination. No ensemble method surpassed the standalone Random Forest; the gated network consistently assigned >84% weight to the classical model. Computational analysis revealed that while the TCN had a more compact total memory footprint, inference with the smaller, 50-tree Random Forest required two orders of magnitude fewer operations, with faster training and lower operational complexity.

Conclusions

In this clinically realistic benchmark, interpretable classical models operating on well-engineered features without personalization matched or exceeded the performance of a personalized deep learning approach while remaining computationally cheaper and simpler to deploy. These findings underscore the importance of rigorously justifying model complexity in perioperative machine learning and suggest that, for intraoperative nociception monitoring, classical approaches may offer a more favorable balance of accuracy, interpretability, and operational efficiency.
