by Job Kasule, Julius L. Tonzel, Natalie Burns, Tyler Hamby, Roger Ying, Grace Mirembe, Immaculate Nakabuye, Hannah Kibuuka, Margaret Yacovone, Betty Mwesigwa, Trevor A. Crowell, for the Multinational Observational Cohort of HIV and other Infections (MOCHI) Study Group
Background: People with behavioral vulnerability to HIV face barriers to healthcare engagement that may impede uptake of non-pharmaceutical and other interventions to prevent COVID-19. Understanding COVID-19 knowledge, attitudes, and practices in this population can inform disease prevention efforts during future pandemics.
Materials and methods: From October 2022 to September 2024, we enrolled participants aged 14–55 years without HIV who endorsed recent sexually transmitted infection, injection drug use, transactional sex, condomless sex, and/or anal sex with male partners. At enrollment, we collected socio-behavioral data, including assessments of COVID-19 knowledge, attitudes, and practices. Robust Poisson regression with purposeful variable selection was used to estimate prevalence ratios with 95% confidence intervals for factors associated with COVID-19 preventive practices.
Results: Among 418 participants, 228 (56.9%) were female, the median age was 21 years (interquartile range 19−24), and 362 (84.9%) reported sex work. Knowledge about SARS-CoV-2 transmission routes was high (95.4%) but lower for the consequences of genetic variants (48.5%−69.7%) and the possibility of asymptomatic infection or transmission (66.7%−80.8%). Handwashing was practiced by 90.8% of participants in the preceding month, whereas mask-wearing (76.5%), avoiding symptomatic people (73.7%), and any history of COVID-19 vaccination (46.9%) were less prevalent. Males were more likely to report avoiding symptomatic people (adjusted prevalence ratio 1.16 [95% confidence interval 1.03–1.31]) and COVID-19 vaccination (1.30 [1.05–1.60]). Enrollment during the BQ.1/BQ.1.1 Omicron wave was associated with less mask-wearing (0.81 [0.67–0.99]) but more vaccination (1.59 [1.29–1.95]).
Discussion: We observed variable COVID-19 knowledge and attitudes among Ugandan adolescents and adults with little impact on COVID-19 preventive practices. Efforts to address suboptimal uptake of disease preventive practices during this and future disease outbreaks will require more than just improving knowledge.
by Pieter L. van den Berg, Shane G. Henderson, Hemeng Li, Bridget Dicker, Caroline J. Jagtenberg
Background: Community First Responders (CFRs) are commonly used for out-of-hospital cardiac arrests, and advanced systems send so-called phased alerts: notifications with built-in time delays. The policy that defines these delays affects both response times and volunteer fatigue.
Methods: We compare alert policies by Monte Carlo simulation, estimating patient survival, coverage, number of alerts, and redundant CFR arrivals. In the simulation, acceptance probabilities and response delays are bootstrapped from 29,307 rows of historical data covering all GoodSAM alerts in New Zealand between 1 December 2017 and 30 November 2020. We simulate distances between the patient and CFRs by assuming that CFRs are located uniformly at random in a 1-km circle around the patient, for different CFR densities. Our simulated CFRs travel with a distance-dependent speed that was estimated by linear regression on observed speeds among those responders in the above-mentioned data set who eventually reached the patient.
Results: The alerting policy has a large impact on the four metrics above, and the best choice depends on volunteer density. For each volunteer density, we are able to identify a policy that improves on GoodSAM New Zealand’s current policy on all four metrics. For example, when there are 30 volunteers within 1 km of the patient, sending out alerts to 7 volunteers and replacing each volunteer who rejects with a new one is expected to save 10 additional lives per year compared to the current policy, without increasing volunteer fatigue. Our results also shed light on policies that would improve one metric while worsening another: for example, when there are 10 volunteers within 1 km of the patient, dispatching them all immediately increases our survival estimate by 11% compared to the current policy, with the downside of also increasing redundant arrivals by 137%.
Conclusions: Monte Carlo simulation can help CFR system managers identify a good policy before implementing it in practice. We recommend balancing survival and volunteer fatigue, aiming to ultimately further improve a CFR system’s effectiveness.
by Lucy H. Eddy, Nat K. Merrick, Cara E. Staniforth, Jade L. Jukes, Liam J. B. Hill, Mark Mon-Williams, Farid Bardid, Rebecca Murray
Background: Approximately 5% of children are affected by a neurodevelopmental disorder of their sensorimotor skills. DSM-5 and ICD-10, the two most widely used diagnostic systems, define this diagnostically as ‘Developmental Coordination Disorder’ (DCD) or ‘Specific Developmental Disorder of Motor Function’ (SDDMF), respectively. A diagnosis of DCD has been found to have a detrimental impact on a range of outcomes (e.g., health and education). It is therefore crucial that these children receive timely intervention. This is reliant, however, on effective assessment and support pathways. Research has shown great parental dissatisfaction with these pathways, but there has been limited research exploring clinical and education perspectives. This study therefore aimed to understand barriers and facilitators for clinical and education practitioners in the pathway in a diverse district in the UK (Bradford).
Methods: Semi-structured interviews were completed with stakeholders across the pathway to identify barriers and facilitators to assessing, diagnosing, and supporting children with sensorimotor skill difficulties. Theoretical thematic analysis aligned to the Capability, Opportunity, Motivation model of Behaviour change (COM-B) was used to analyse the qualitative data.
Results: Interviews revealed many barriers in the DCD pathway related to capability (confusing terminology, inconsistent knowledge, inappropriate referrals), opportunity (resource constraints, DCD being considered low priority, and disconnected services), and motivation (overlapping job roles, a desire to consider those with difficulties not eligible for a diagnosis). No facilitators were consistently identified across interviews.
Conclusion: Families face multiple barriers to obtaining a diagnosis for their child through existing clinical pathways for assessment and support. These findings are unlikely to be unique to Bradford, as international research has highlighted the same issues through parental interviews; they may therefore reflect challenges within DCD pathways both nationally and internationally. There is an urgent need for: (i) clear communication across different services (with consistency in terminology), and (ii) a more collaborative and integrated approach to assessment, diagnosis, and support in order to help these children thrive.
by Claire L. Chan, Saskia Eddy, Jennie Hejdenberg, Ben Morgan, Heather M. Morgan, Gillian Lancaster, Clare Robinson, Sandra M. Eldridge
Background: The National Institute for Health and Care Research accepts applications for pilot and feasibility studies to its Research for Patient Benefit (RfPB) programme. There has been limited work describing the design characteristics and funding status of these applications. Knowing some of the qualities that may contribute to a pilot or feasibility study application successfully gaining funding could help researchers improve the quality of their applications. This study therefore describes the protocol for a review of the characteristics of funded and non-funded external pilot trial applications. In particular, the primary objective is to describe the planned sample sizes and sample size justifications.
Methods: The study will be conducted on 100 applications with a randomised feasibility design from Competitions 31–37, identified and made accessible to us by RfPB where the lead applicant has consented. We will screen these applications to identify the external pilot trials, first by title and then by full text. We will then extract data on medical area, study design, objective(s), sample size, sample size justification, and funding outcome at stages one and two. Validation will be performed on 20% of the extracted data; discrepancies will be resolved by discussion, or a third reviewer will decide if there is no consensus. We will summarise quantitative data with descriptive statistics and analyse qualitative data using thematic analysis. Findings will be summarised through discussion with the project contributors to produce a reader-friendly guidance document.
Discussion: This work will provide a more complete picture of RfPB external randomised pilot and feasibility trials. The findings will assist researchers when planning their pilot trials, and could help improve the quality of submitted applications.
Protocol Registration: Open Science Framework protocol registration DOI: https://doi.org/10.17605/OSF.IO/PYKVG.
by Caio R. Monteiro, Victor Augusto de Oliveira, Rabeche Schmith, João Pedro A. Rezende, Tales L. Resende, João A. Negrão, Marina A. C. Danés
This study aimed to evaluate the effects of rumen-protected methionine (RPM) supplementation on productive and physiological responses of primiparous Holstein cows during summer. We hypothesized that RPM supplementation would maintain or improve milk yield and composition through beneficial physiological, redox, and inflammatory responses in cows exposed to summer heat. The trial was conducted in a randomized block design over nine weeks in Brazil using 80 primiparous cows (182 ± 64 DIM; 42.9 ± 4.7 kg/d milk). Cows were blocked by milk yield and DIM and assigned to a control diet (CON; no added RPM) or the same diet supplemented with RPM (Mepron®, Evonik) at 0.75 g/kg diet dry matter, targeting 20 g/cow/day for the average cow (the product contains 62% metabolizable methionine). Milk yield and composition, vaginal temperature, respiratory rate, and plasma samples were collected in weeks 3, 6, and 9. Data were analyzed using mixed models including treatment, week, and their interaction as fixed effects, and block and cow as random effects. Cows were maintained under naturally occurring summer conditions. Environmental monitoring during weeks 3, 6, and 9 indicated elevated temperature–humidity index (THI) values, with values remaining above the heat-stress threshold (THI > 68) for 68.3% of the monitored hours (mean THI = 70.6; range 61.0–84.4). Overall (least squares mean across weeks 3, 6, and 9), RPM increased milk yield by 2.0 kg/d (44.9 vs. 42.9 kg/d), protein yield by 50 g/d (1,464 vs. 1,414 g/d), lactose yield by 108 g/d (2,109 vs. 2,001 g/d), and total solids yield by 176 g/d (5,331 vs. 5,155 g/d). Lactose concentration was lower in RPM (4.71 vs. 4.76%). Fat yield was unaffected, but a treatment × week interaction was observed for fat content. Milk fatty acid (FA) profile was unchanged, although treatment × week interactions were observed for individual fatty acids (C16:0, C18:0, C18:1, and preformed FA).
Plasma glucose was lower, and insulin was higher in RPM than in CON cows (39.3 vs. 43.2 mg/dL and 0.52 vs. 0.35 ng/mL, respectively). Antioxidant capacity improved, with RPM cows having greater ferric reducing antioxidant power (32.9 vs. 28.5 µM) and lower malondialdehyde (2.48 vs. 2.78 nmol/mL). Other biochemical, inflammatory, and immune markers were unaffected. Respiratory rate was slightly higher in RPM than in CON cows (55 vs. 50 breaths/min). Mean vaginal temperature did not differ between treatments; however, a treatment × time × hour interaction was observed. Supplementation with RPM improved milk and solids yield, and enhanced antioxidant capacity and insulin levels, supporting its use to improve metabolic resilience under warm conditions.
by Wenting Yan, Carmel L. Montgomery, Liz Dennett, Stephanie A. Chamberlain
Background: The populations of Western countries have become increasingly diverse, and with population aging, linguistic diversity among older adults has grown as well. As people age and their care needs increase, they may not receive optimal care if they do not speak the same language as their caregivers in long-term care (LTC) facilities. Culturally and linguistically responsive (CLR) long-term care services are important to ensure the best care for an aging population, but there is limited evidence in the literature on the scope and practice of these services. The objective of this scoping review is to map the types of CLR programs in LTC settings and examine their core components and target populations.
Methods: The Arksey and O’Malley framework, further developed by Levac and colleagues, will be employed in this scoping review. The research question was framed using the PCC framework. A comprehensive systematic search was developed with an experienced librarian and will be conducted in Scopus, CINAHL, Embase, Medline, PsycINFO, and Academic Search Complete. All primary study designs, including quantitative, qualitative, and mixed methods, will be included. Studies must focus on culturally and linguistically responsive care programs used or implemented in long-term care services. There will be no date or language limitations. Findings will be thematically synthesized to answer the research question.
Conclusion: This protocol provides a transparent account of how the review will be conducted. We aim to contribute to a better understanding of what culturally and linguistically responsive care programs exist, how cultural and linguistic responsiveness is currently addressed across diverse care environments, and what gaps remain in long-term care.
by Alyssa Howren, Quan L. Tran, Sadaf Sediqi, Saadiya Hawa, Douglas K. Owens, Eleni Linos, Titilola O. Falasinnu, Yashaar Chaichian, Julia F. Simard
Background: Systemic lupus erythematosus (SLE) is a heterogeneous autoimmune rheumatic disease whose epidemiology and clinical prognosis vary by race and sex. Observed disparities in SLE may be partly attributable to cognitive processes in clinical decision-making, which can influence diagnostic accuracy and clinical management. We aimed to examine variation in primary care physicians’ (PCP) diagnosis and management of SLE when all content of a clinical case is identical, apart from race and sex.
Methods: We distributed an online randomized factorial survey from 04/11/2024–06/10/2024 to PCPs across the US. Participants were presented with one of four possible SLE vignettes – Black female, White female, Black male, White male – for which all other clinical content was identical. Block randomization was used to randomly modify the race (Black/White) and sex (female/male) of the SLE “case”. Primary outcomes were correct text-based responses for SLE diagnosis at initial case presentation and after reviewing additional lab results. Secondary outcomes were participants’ review time and planned next steps (treatment, referral, tests) as proxies for cognitive bias and certainty, respectively. We calculated descriptive statistics for all outcomes stratified by assigned randomized factor and used chi-square tests to evaluate between-group differences.
Results: 1031 PCPs (42.7% women, mean age 52.1 ± 12.1 years) completed the case. At initial presentation, 63.9% of participants correctly identified SLE as a differential diagnosis. An initial diagnosis of SLE significantly differed by the race and sex of the case (p
Conclusion: A patient’s race and sex may influence diagnostic accuracy and clinical decision-making for SLE in primary care. The observed variation in diagnostic accuracy, which aligns with the descriptive epidemiology of SLE, highlights the need for targeted interventions to ensure equitable diagnostic processes.
by Liubov Arbeeva, Virginia B. Kraus, Amanda E. Nelson, Maryalice Nocera, Leigh F. Callahan, Richard F. Loeser, Kenneth L. Cameron, Jesse R. Trump, Stephen W. Marshall, Yvonne M. Golightly
Purpose: To investigate the longitudinal relationships between serum biomarkers of joint metabolism, knee injury, and Knee Injury and Osteoarthritis Outcome Score (KOOS) using novel methodologies.
Methods: Data were collected from military officers who enrolled as cadets between 2004 and 2009, with follow-up conducted between 2015 and 2017. Analyses included 234 officers who had no history of knee ligament/meniscal injury at the time of military academy matriculation and had serum biomarker measurements at matriculation and graduation, demographic data, and KOOS assessment at follow-up. Biomarkers included Collagen Type II (C2C) and Type I and II (C1,2C) collagenase-generated cleavage epitopes, C-terminal propeptide of Type II collagen (CPII), and C- and N-terminal telopeptides of type I collagen (CTX and NTX). Angle-based Joint and Individual Variation Explained (AJIVE) was used to determine demographic determinants of biomarker levels and individual modes of variation specific to biomarker levels at matriculation and graduation, stratified by sex.
Results: We confirmed known associations of joint metabolism biomarkers with age in both sexes and with smoking in males. Matriculation biomarker data in males suggested a protective biomarker profile characterized by high cartilage synthesis and low cleavage of type I and II collagen in association with healthy KOOS scores at follow-up. CPII measured at matriculation was negatively associated with incident injuries after adjustment for smoking status (p = 0.03, logistic regression), confirming results from AJIVE.
Conclusion: These exploratory analyses suggest that CPII alone, or in combination with other joint metabolism biomarkers, may help identify individual risk of knee injury.
by John Paul G. Kolcun, Ricky M. Ditzel Jr, Bradley L. Kolb, Ricardo B. V. Fontes, P. B. Raksin
Study design: Retrospective chart review.
Objective: To describe the safety and feasibility of implementing a novel clinical protocol for acute spinal cord injury (SCI) management.
Summary of background data: Spinal cord perfusion pressure (SCPP) has emerged as a promising target for the medical management of SCI patients. We report our early experience implementing a pragmatic SCPP-driven clinical protocol to supplant conventional mean arterial pressure (MAP) monitoring in the setting of acute SCI.
Methods: We retrospectively reviewed charts of all SCI patients managed by our SCPP protocol since its adoption at two clinical sites as of 2/1/2023. The SCPP protocol was applied for all adult SCI patients of any injury grade, at any injury level with cord tissue involvement. Intrathecal pressure (ITP) was transduced by lumbar drain (LD). MAP was determined from invasive blood pressure recordings. SCPP was calculated as the difference between MAP and ITP, with an SCPP goal of >65 mmHg.
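The protocol's core calculation, as stated above, is SCPP = MAP − ITP with a goal of >65 mmHg. A minimal sketch (function names are illustrative, not from the protocol):

```python
# Minimal sketch of the SCPP calculation described in the Methods:
# SCPP = MAP - ITP, with the protocol's goal of SCPP > 65 mmHg.
def scpp(map_mmhg: float, itp_mmhg: float) -> float:
    """Spinal cord perfusion pressure in mmHg."""
    return map_mmhg - itp_mmhg

def below_goal(map_mmhg: float, itp_mmhg: float, goal: float = 65.0) -> bool:
    """True when SCPP is at or below goal, i.e., when the abstract's
    described interventions (vasopressors, CSF drainage) might be considered."""
    return scpp(map_mmhg, itp_mmhg) <= goal
```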
Results: Eighteen patients have been treated since our SCPP protocol was adopted. Patients were predominantly male (77.8%); the average age was 52.0 ± 16.2 years. Most injuries involved the cervical segment (72.2%), all of which manifested clinically as central cord syndrome. The most common presenting injury severity was ASIA Impairment Scale D (44.4%). All patients underwent surgical intervention. There were no complications related to surgery, LD placement, or LD maintenance/ITP transduction during hospitalization. The SCPP protocol was continued for an average of 5.2 ± 1.8 days. Eight patients (44.4%) required vasopressor support during that period, for an average of 3.1 ± 2.1 days. Five patients (27.8%) underwent therapeutic CSF drainage to augment SCPP. All patients maintained an average SCPP above goal for the duration of monitoring.
Conclusions: This study further establishes the safety and feasibility of monitoring SCPP via LD measurement of ITP in acute SCI patients treated by clinical protocol at two clinical sites. There were no complications related to LD placement/maintenance or SCPP monitoring.
by Ernest V. Boiko, Elena V. Samkovich, Irina E. Panova, Alexander A. Ivanov, Sergey B. Shevchenko, Sergey L. Vorobyev, Elizaveta S. Kalashnikova, Victoria G. Gvazava, Elizaveta A. Masian, Alexandra E. Kim
Purpose: To define optimal exposure parameters and the therapeutic window for transscleral photodynamic therapy (TSPDT) with chlorin e6 by evaluating clinical, histological, and thermal effects of subthreshold, therapeutic, and suprathreshold settings in rabbit eyes.
Methods: The study was conducted on 21 healthy rabbits. TSPDT was performed using a 660 nm laser and chlorin e6 (2.5 mg/kg). Transscleral probes (5 mm: 0.1 W, 0.17 W, 0.3 W; 10 mm: 0.3 W, 0.6 W) with integrated thermosensors were used. Enucleation and histological analysis were performed 14 days post-irradiation.
Results: Fundus examination on day 14 revealed distinct treatment zones correlating with laser settings. The therapeutic window was defined as 0.14–0.17 W (5 mm probe; power density: 0.693–0.866 W/cm²; energy density: 415.8–519.6 J/cm²) and 0.48–0.6 W (10 mm probe; 0.611–0.764 W/cm²; 366.6–458.4 J/cm²) with 600 s exposure time, achieving selective choroidal damage without scleral or retinal injury (ΔT ≤ 4.5°C). Suprathreshold settings (≥0.3 W for 5 mm; ≥0.6 W for 10 mm) induced retinal necrosis (up to 50%) and scleral coagulation (ΔT ≥ 8°C) with power densities exceeding 0.866 W/cm² (5 mm) and 0.764 W/cm² (10 mm).
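As a check on the reported dosimetry, assuming power density is taken over a circular spot of the probe's stated diameter, the figures above are mutually consistent: 0.17 W on the 5 mm probe gives about 0.866 W/cm² and, over 600 s, about 519.6 J/cm², matching the upper bound of the therapeutic window; 0.6 W on the 10 mm probe gives about 0.764 W/cm².

```python
# Worked check of the reported dosimetry. Assumption: the emitting area is
# a circle of the probe's stated diameter (not confirmed by the abstract).
import math

def power_density(power_w: float, diameter_mm: float) -> float:
    """Power density in W/cm^2 over a circular spot of the given diameter."""
    r_cm = diameter_mm / 10 / 2
    return power_w / (math.pi * r_cm ** 2)

def energy_density(power_w: float, diameter_mm: float, seconds: float) -> float:
    """Energy density in J/cm^2 delivered over the exposure time."""
    return power_density(power_w, diameter_mm) * seconds
```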
Conclusion: TSPDT with chlorin e6 enables selective targeting of intraocular pathological tissues while preserving scleral and retinal integrity. Defining the therapeutic window and using real-time thermal monitoring enhances treatment safety. These findings lay a foundation for clinical protocols for uveal melanoma and other intraocular tumors.
by Andrea Valdivia-Gago, Patricia J. García, Sherilee L. Harper, Angela Soria, Carol Zavaleta-Cortijo
Peru issued multiple COVID-19 policies for the Amazon, yet how they worked in practice for Indigenous Peoples remains under-documented. We conducted a sequential multi-method qualitative study, reviewing 20 national and regional policy documents (Mar–Dec 2020) and interviewing 12 implementers (regional and local officials from the health sector, n = 8, and the Ministry of Culture, n = 4) plus one central-level culture representative, in Loreto and Junin. Triangulating top-down policy review with bottom-up practitioner accounts across two contrasting regions strengthened validity. Policies frequently lacked explicit intercultural guidance, clear monitoring indicators, and dedicated budgets. Implementers described budget misalignment, omission of specific health networks, delayed supplies, and connectivity barriers that fostered dissatisfaction and a perception that services prioritized data collection over care. Effects were most acute in remote and low-connectivity settings; Indigenous federations’ participation in Loreto sometimes mitigated challenges, while in Junin travel-fund constraints limited participation. Pandemic preparedness must institutionalize intercultural approaches and secure sustainable funding with clear accountability. Co-design with Indigenous organizations, ring-fenced implementation budgets, practical communication strategies, and routine monitoring are essential to protect Indigenous Peoples in future health emergencies.
by Jessica Liu, Sameer Pandya, Andreas Coppi, H. Patrick Young, Harlan M. Krumholz, Wade L. Schulz, Guannan Gong
Background: Near real-time electronic health record (EHR) data offer significant potential for secondary use in research, operations, and clinical care, yet challenges remain in ensuring data quality and stability. While prior studies have assessed retrospective EHR datasets, few have systematically examined the integrity of real-time data for research readiness.
Methods: We developed an automated benchmarking pipeline to evaluate the stability and completeness of real-time EHR data from the Yale New Haven Health clinical data warehouse, transformed into the OMOP common data model. Twenty-nine weekly snapshots of the EHR collected from July to November 2024 and twenty-two daily snapshots collected from April to May 2025 were analyzed. Benchmarks focused on (1) clinical actions such as patient additions, deletions, and merges; (2) changes in demographic variables (date of birth, gender, race, ethnicity); and (3) stability of discharge information (time and status). A synthetic dataset derived from MIMIC-III was used to validate the benchmarking code prior to large-scale analyses.
Results: Benchmarking revealed frequent updates due to clinical actions and demographic corrections across consecutive snapshots. Demographic changes were most frequently related to race and ethnicity, highlighting potential workflow and data entry inconsistencies. Discharge time and status values demonstrated instability for several days post-encounter, typically reaching a stable state within 4–7 days. These findings indicate that while near real-time EHR data provide valuable insights, the timing of data stabilization is critical for accurate secondary use.
Conclusions: This study demonstrates the feasibility of automated benchmarking to assess the integrity of real-time EHR data and identify when such data become analysis ready. Our findings highlight key challenges for secondary use of dynamic clinical data and provide an automated framework that can be applied across health systems to support high-quality research, surveillance, and clinical trial readiness.
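The snapshot-comparison benchmarks described in the Methods amount to diffing consecutive snapshots of a patient table. The sketch below is illustrative only; the field and table names are hypothetical and do not follow the authors' pipeline or the OMOP schema:

```python
# Hedged sketch: diff two snapshot versions of a patient table, counting
# additions, deletions, and per-field demographic changes. Field names
# are illustrative placeholders, not the OMOP common data model.
def diff_snapshots(old: dict, new: dict,
                   fields=("birth_date", "gender", "race", "ethnicity")):
    """old/new map patient_id -> {field: value}; returns summary counts."""
    added = [pid for pid in new if pid not in old]
    deleted = [pid for pid in old if pid not in new]
    changed = {f: 0 for f in fields}
    for pid in old.keys() & new.keys():      # patients present in both
        for f in fields:
            if old[pid].get(f) != new[pid].get(f):
                changed[f] += 1
    return {"added": len(added), "deleted": len(deleted), "changed": changed}
```

Run over each consecutive pair of snapshots, counts like these can be plotted over time to estimate when a field (e.g., discharge status) has stabilized.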
by Sarah L. Brown, Barry J. McDonnell, David McRae, Paul Angel, Imtiaz Khan, Rhiannon Phillips, Britt Hallingberg, Delyth H. James
Using visualisation to conceptualise a chronic condition can encourage accurate illness beliefs and support treatment adherence. Hi-BP is a digital visual intervention to support adherence to antihypertensive medication, co-produced with patients. The aim of this study was to investigate the feasibility and acceptability of Hi-BP and explore the preliminary direction of effects on illness and treatment beliefs, medication adherence and blood pressure (BP). A two-phased mixed-methods non-randomised feasibility study was conducted from April 2021 to March 2022 in eight community pharmacies across one Health Board in South-East Wales, UK. Hi-BP was delivered as a single researcher-led consultation to 69 patients in Phase 1 and by pharmacists to three patients in Phase 2. Feasibility was determined using predefined criteria, with acceptability explored qualitatively using semi-structured interviews. Quantitative outcome measures (illness perceptions, medication beliefs, medication adherence, prescription dispensing and collection data, BP) were recorded at baseline and immediately post-intervention. Follow-up outcome measures were collected at two weeks (medication adherence) and three months (all baseline measures). Hi-BP met feasibility criteria for pharmacist recruitment in both phases, and patient recruitment in Phase 1, but not Phase 2. Hi-BP was acceptable to the sub-sample of 15 patient participants interviewed in Phase 1; insufficient data were available to determine patient acceptability at Phase 2. Hi-BP was acceptable to pharmacists in Phase 1 and partially acceptable at Phase 2, due to competing demands on time for intervention delivery. All outcome measures were considered feasible for use, though a ceiling effect was noted for medication adherence.
A potentially positive directional effect was found for illness perceptions (X2(2)=10.83,n=54,p=0.004), medication beliefs (BMQ-Necessity (X2(2)=11.71,n=54,p=0.003) and BP (Systolic BP Z=-3.91,n=51,p=2(2)= 2.4,n=45,p=0.299). In the Community Pharmacy setting, Hi-BP was well-accepted and has the potential for significant reductions in BP; however, further research is needed to explore pharmacist capacity to support implementation.
by María-Angélica Calderón-Peláez, Myriam L. Velandia-Romero, Jaime E. Castellanos
Zika virus (ZIKV) poses a significant threat to neural tissue, causing substantial damage to unborn children exposed to the virus in utero, with consequences that can manifest even after birth in children born with a normal head circumference. Despite extensive research, the interactions between ZIKV and nervous system cells remain insufficiently understood, particularly regarding how neuronal responses influence broader inflammatory and viral dynamics, especially in postnatal stages of development. This study evaluated the susceptibility to ZIKV infection, viral replication, immune response, and survival of neurons, astrocytes and microglial cells during postnatal developmental stages, using both in vivo and in vitro mouse models. In vivo, a non-lethal but extensive infection of neurons and microglia was observed. The infection caused a robust but controlled immune response with elevated levels of MCP-1, TNF-α, and IL-6 that prevented severe neuronal damage. In vitro, neurons exhibited high susceptibility to ZIKV, with elevated levels of pro-inflammatory cytokines and IFN-β, indicating a strong inflammatory response. In contrast, astrocytes and microglia displayed varied responses, contributing to a pro-inflammatory feedback loop. These findings offer critical insights into the cellular dynamics of ZIKV infection, enhancing our understanding of its effects during postnatal nervous system development. By clarifying the interactions between ZIKV and neuronal cell types, this study deepens the comprehension of the virus’s pathophysiology and its broader implications for neurodevelopmental outcomes, extending beyond the well-documented association with microcephaly.
by Amy Wermert, Theodore M. Brasky, Alison M. Newton, Alice Hinton, Hayley Curran, Amy K. Ferketich, Matthew J. Carpenter, Peter G. Shields, Patrick Tomko, Theodore L. Wagener, Brittney Keller-Hamilton
Background: With the highest cancer incidence and mortality rates in the country, rural Appalachia has experienced a decades-long health decline, due in part to high smoking rates. Cigarette smoking prevalence exceeds 30% in much of the region. Oral nicotine pouches (ONPs), which contain nicotine but no tobacco, present an unexplored opportunity to reduce cigarette smoking and cancer incidence.
Objectives: We outline the protocol for the Appalachian Research to Impact Smoking’s Effects (ARISE) study, a randomized controlled trial to determine whether ONPs affect cigarette smoking patterns short- and long-term, and to evaluate their abuse liability versus nicotine replacement therapy (NRT) in a large sample of Appalachian smokers (clinicaltrials.gov: NCT06763536).
Methods: Between 2025 and 2029, we will recruit 1,000 adult smokers living in rural Appalachian counties across 11 states. Participants will be identified via media outreach, mobile cancer screening, community events, and respondent-driven sampling, then randomized to ONP or NRT, and will complete four study phases: Baseline, Sampling, Switch, and Observation. In the Sampling phase, participants will receive varied flavors and nicotine strengths of their assigned product and select preferred options for use. During the Switch phase, they will attempt to quit smoking and switch completely to their assigned product. The Observation phase will monitor tobacco use after discontinuation of study products. Study procedures will be conducted online and by mail, including surveys, expired carbon monoxide verification, and product delivery. The primary outcome is 7-day biochemically verified cigarette abstinence at the end of the Switch phase. Secondary outcomes include switching rates, product appeal, craving, withdrawal, dependence, and purchases during the Observation phase. An intention-to-treat log-binomial regression model will estimate the effect of intervention assignment on cigarette abstinence.
Conclusions: Results will inform whether and how ONPs should be regulated, approached clinically, and used in public health interventions to reduce the burdens of cigarette smoking in Appalachia.
by Robin A. Pollini, Catherine E. Paquette, Brandon Irvin, Jennifer L. Syvertsen, Christa L. Lilly
Drug use is a highly stigmatized behavior, and drug-related stigma is a key driver of behavioral risk, lower health care utilization, and associated adverse health outcomes among people who inject drugs (PWID). While instruments exist for measuring drug-related stigma, their applicability to community-based PWID across multiple stigma types (enacted, anticipated, internalized) and settings (health care, society, family) is limited, as most were developed using treatment-based samples and all were developed in urban populations. This study sought to develop a Drug Use Stigma Scale (DUSS) that addresses these limitations. We developed an initial list of 39 items based on a literature review, qualitative interviews (N = 27), and three focus groups (N = 28) with PWID recruited from syringe services programs and via peer referral in two predominantly rural West Virginia counties. The scale items were administered in a survey to 336 PWID recruited from the same two counties, divided into development and validation samples. Responses to the 39-item scale went through a multidimensional refinement process, including examination of internal consistency, Confirmatory Factor Analysis (CFA), and a three-factor CFA based on stigma setting. Next, a set of final measurement CFAs were conducted. Finally, the resulting scale was examined for criterion-related concurrent validation. The final DUSS consisted of 16 items with excellent fit statistics for the development sample: SRMR: 0.03, RMSEA: 0.09, GFI: 0.92, CFI: 0.96, NFI: 0.94. Fit attenuated but remained satisfactory for the validation sample. DUSS scores were significantly associated with increased odds of not seeking healthcare when needed (OR: 1.47, p = 0.001; OR: 1.61, p
by Emily Dunlap, Yanbing Zhou, Manny M.Y. Kwok, Billy C.L. So, Hirofumi Tanaka
ObjectiveTo evaluate the effects of aquatic exercise compared with non-exercise controls and land-based exercise on arterial stiffness and endothelial function.
DesignSystematic review and meta-analyses of randomized controlled trials assessed using the Cochrane risk-of-bias tool and Grading of Recommendations Assessment, Development and Evaluation.
Data sourcesPubMed/MEDLINE, CINAHL Plus, SPORTDiscus, and reference lists, searched from database inception to April 16, 2025.
Eligibility criteriaStudies evaluating chronic aquatic exercise (multi-session interventions) compared with land-based exercise or non-exercise comparison groups in adults, measuring arterial stiffness via pulse wave velocity (PWV) or endothelial function via flow-mediated dilation (FMD).
ResultsThis review includes 18 randomized controlled trials with 845 participants (mean age 65 ± 7 years). Studies compared aquatic exercise with non-exercise controls (8 studies), land-based exercise (6 studies), or both (4 studies). Exercise sessions averaged 50 minutes, 3 times weekly for 11 weeks. Most studies (17 out of 18) implemented moderate-to-vigorous intensity protocols. Aquatic exercise resulted in improvements in arterial stiffness compared with non-exercise controls (7 studies; SMD = –2.37, 95% CI: –4.46 to –0.29; I2 = 98%; low certainty), with most evidence reflecting systemic and peripheral PWV. Changes in arterial stiffness did not differ from those observed after land-based exercise (6 studies; SMD = –0.07, 95% CI: –0.34 to 0.20; I2 = 0%; moderate certainty). For endothelial function, aquatic exercise may improve outcomes versus non-exercise controls (6 studies; SMD = 0.91, 95% CI: 0.39 to 1.43; I2 = 68%; low certainty) and may lead to greater improvements than land-based exercise (7 studies; SMD = 0.55, 95% CI: 0.05 to 1.06; I2 = 75%; low certainty).
ConclusionAquatic exercise improves systemic and peripheral arterial stiffness as well as endothelial function compared with non-exercising controls. Changes in arterial stiffness do not differ from those observed after land-based exercise. Aquatic exercise may provide greater improvement in endothelial function than land-based exercise, though this is supported by low-certainty evidence, and substantial heterogeneity limits confidence in the generalizability of this finding.
PROSPERO registrationCRD42025642087.
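Pooled SMDs with I2 statistics like those reported above typically come from random-effects meta-analysis. As a hedged illustration, here is a DerSimonian-Laird pooling on invented effect sizes and variances (not the review's data):

```python
# Hedged illustration (invented numbers, not the review's data):
# DerSimonian-Laird random-effects pooling of standardized mean
# differences (SMDs), the kind of model behind pooled estimates with I2.
import numpy as np

def dersimonian_laird(yi, vi):
    """Pool effect sizes yi with within-study variances vi."""
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    w = 1.0 / vi                                   # fixed-effect weights
    y_fe = np.sum(w * yi) / np.sum(w)              # fixed-effect pooled mean
    q = np.sum(w * (yi - y_fe) ** 2)               # Cochran's Q
    df = len(yi) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_re = 1.0 / (vi + tau2)                       # random-effects weights
    pooled = np.sum(w_re * yi) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return pooled, se, i2

smds = [0.9, 1.2, 0.4, 1.5, 0.7, 0.8]            # hypothetical per-study SMDs
variances = [0.05, 0.08, 0.04, 0.10, 0.06, 0.05]
pooled, se, i2 = dersimonian_laird(smds, variances)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"SMD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f}); I2 = {i2:.0f}%")
```

The tau-squared term widens the pooled confidence interval when studies disagree, which is why high-I2 comparisons (such as the 98% reported above) carry wide intervals and lower certainty ratings.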
by Chamberline E. Ozigbu, Zhenlong Li, Bankole Olatosi, James W. Hardin, Nicole L. Hair
While prior studies have identified sociodemographic correlates of zero-dose status within populations in sub-Saharan Africa (SSA), few have applied spatial regression techniques to explore geographic variability in these relationships. We aimed to address this gap using data from Demographic and Health Surveys conducted in SSA between 2010 and 2020. Our sample comprised children aged 12–59 months in 33 countries and 329 survey regions. Data were aggregated to the first-level administrative unit prior to analysis. First, using ordinary least squares regression, we documented global relationships between theoretically important sociodemographic characteristics and zero-dose prevalence. Next, we identified patterns, i.e., geographic clustering, of zero-dose prevalence. Finally, using multiscale geographically weighted regression, we described spatial variability in relationships between sociodemographic characteristics and zero-dose prevalence. We detected 27 regions with higher-than-expected concentrations of zero-dose children. All but one of these hot spots were observed in 7 Western and Central African countries; only 1 was located in an Eastern African country. Regions with higher proportions of mothers with no antenatal care visits were consistently found to have higher rates of zero-dose children. In contrast, relationships between zero-dose prevalence and indicators of religious affiliation, delivery site, maternal age, maternal education, and maternal employment were found to vary locally in terms of their strength and/or direction. Study findings underscore spatial disparities in zero-dose prevalence within SSA and, further, highlight the importance of geographically informed strategies to effectively address immunization gaps. Implementing targeted interventions based on regional sociodemographic dynamics is crucial for achieving comprehensive vaccination coverage in SSA.
by Xin Xu, Ghada Homsi, Sherry T. Liu, Jennifer M. Gaber, Naa A. Inyang, Brian L. Rostron, Caryn F. Nagler, James Nonnemaker
BackgroundIn 2022, 3.7% of U.S. adults currently smoked cigars. This study assesses cigar-smoking-attributable fractions in U.S. healthcare expenditures and associated annual healthcare expenditures overall and by payer, including publicly funded healthcare programs.
MethodsData were obtained from the 2000, 2005, 2010, and 2015–2017 National Health Interview Survey linked with corresponding panels from the Medical Expenditure Panel Survey data through 2018. The final sample (n = 53,733) was restricted to adults aged 25+. Estimates from four-part models and data from the Personal Health Care component of the 2001–2018 National Health Expenditures Accounts were combined to estimate fractions of and annual healthcare expenditures attributable to cigar smoking. All models controlled for sociodemographic characteristics and health-related behaviors.
ResultsDuring 2001–2018, an estimated 1.8% (95% CI = 0.9%–3.4%) or $29.7 billion annually of U.S. healthcare expenditures could be attributed to cigar smoking. Most of this was funded by other third-party health insurance programs, a mix of private and public payers (e.g., Department of Veterans Affairs).
ConclusionsCigar smoking creates a preventable financial burden on the U.S. healthcare system. Health consequences associated with cigar smoking may remain after successful quitting. The findings underscore the importance of preventing initiation of cigar smoking and providing evidence-based cessation methods to reduce the health and economic burden of cigar smoking.
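As a quick plausibility check on how the two headline figures relate (my own arithmetic, not the paper's four-part models), an attributable fraction multiplied by total expenditures gives the attributable dollar burden, so the two reported numbers jointly imply the size of the expenditure base:

```python
# Back-of-envelope check (my own arithmetic, not the paper's models):
# attributable burden = attributable fraction x total expenditures,
# so the reported 1.8% and $29.7 billion imply the total base.
attributable_fraction = 0.018        # 1.8% of U.S. healthcare expenditures
attributable_billions = 29.7         # reported annual attributable burden
implied_total_trillions = attributable_billions / attributable_fraction / 1000
print(f"Implied total annual expenditures: ~${implied_total_trillions:.2f} trillion")
```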
by Carly E. Hawkins, Thomas P. Hahn, Jessica L. Malisch, Gail L. Patricelli
Males in socially monogamous species can achieve reproductive success through multiple tactics: by defending paternity within the social nest, by siring extra-pair offspring, or both. Previous studies have found that sperm morphology may differentially affect fertilization success in extra-pair compared to within-pair matings; therefore, we explored whether sperm morphological traits can predict success within these components of reproductive success. Here, we measured sperm component traits (head length and flagellum length) and derived traits (total length and flagellum:head ratio) in free-living Mountain White-crowned Sparrows (Zonotrichia leucophrys oriantha) and examined how these morphological traits relate to the extra-pair and within-pair components of reproductive success. We found no evidence for correlations between sperm morphology and total seasonal reproductive success. However, sperm morphology did appear to be associated with whether a male was successful at acquiring extra-pair offspring or defending his own paternity within his nest: males that achieved extra-pair success had longer flagella and greater total sperm length compared to males that did not sire offspring outside their social nest. In contrast, males that successfully defended all paternity within their social nest tended to have shorter heads and larger flagellum:head ratios compared to males that lost paternity in their social nest. While these patterns suggest that different sperm traits may be linked to success in different components of reproductive success, they should be interpreted with caution given the exploratory nature of this study and limited sample size, and further investigation is warranted.