by Andrea C. Aplasca, Peter B. Johantgen, Christopher Madden, Kilmer Soares, Randall E. Junge, Vanessa L. Hale, Mark Flint
Amphibian skin is integral to normal physiological processes and supports both innate and adaptive immunity against pathogens. The amphibian skin microbiota comprises a complex assemblage of microbes and is shaped by internal host characteristics and external influences. Skin disease is a significant source of morbidity and mortality in amphibians, and a growing body of research has shown that the skin microbiota is an important component of host health. The Eastern hellbender (Cryptobranchus alleganiensis alleganiensis) is a giant salamander declining across much of its range, and captive-rearing programs are important to hellbender recovery efforts. Survival rates of juvenile hellbenders in captive-rearing programs are highly variable, and causes of mortality are poorly understood. Deceased juvenile hellbenders often present with low body condition and skin abnormalities. To investigate potential links between the skin microbiota and body condition, we collected skin swab samples from 116 juvenile hellbenders and water samples from two holding tanks in a captive-rearing program. We used 16S rRNA gene sequencing to characterize the skin and water microbiota and observed significant differences in the skin microbiota by weight class and tank. The skin microbiota of hellbenders housed in tanks in close proximity were generally more similar than those of hellbenders housed farther apart. A single taxon, Parcubacteria, was differentially abundant by weight class only and was observed at higher abundance in low-weight hellbenders. These results suggest a specific association between this taxon and low-weight hellbenders. Additional research is needed to investigate how husbandry factors and potentially pathogenic organisms, such as Parcubacteria, impact the skin microbiota of hellbenders and ultimately morbidity and mortality in the species.
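The community-level comparison described above is commonly run as a PERMANOVA on Bray-Curtis distances. Below is a minimal sketch of that kind of analysis in Python with scikit-bio; the file names and metadata columns (asv_counts.csv, weight_class, tank) are hypothetical stand-ins, not the authors' pipeline.

```python
# Minimal sketch (hypothetical files/columns, not the authors' pipeline):
# testing whether skin microbiota composition differs by weight class and
# tank using Bray-Curtis distances and PERMANOVA on 16S count data.
import pandas as pd
from skbio.diversity import beta_diversity
from skbio.stats.distance import permanova

counts = pd.read_csv("asv_counts.csv", index_col=0)         # samples x ASVs
metadata = pd.read_csv("sample_metadata.csv", index_col=0)  # weight_class, tank

# Pairwise Bray-Curtis dissimilarities between samples
dm = beta_diversity("braycurtis", counts.values, ids=list(counts.index))

for factor in ("weight_class", "tank"):
    result = permanova(dm, metadata.loc[counts.index, factor].tolist(),
                       permutations=999)
    print(factor, result["test statistic"], result["p-value"])
```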
by Carol Kotliar, Lisandro Olmos, Martín Koretzky, Ricardo Jauregui, Tomás Delía, Oscar Cingolani
ObjectiveTo evaluate the effectiveness of the Mental Training Tech 24.5 (MTT24.5) cognitive stimulation program, designed to enhance cognitive performance and neuroplasticity in healthy adults.
BackgroundCognitive decline is a significant concern in aging populations, with research suggesting that neuroplasticity and cognitive reserve can be enhanced through targeted cognitive training. The MTT24.5 program aims to stimulate brain function through a combination of new knowledge acquisition (DATA) and learning techniques (TECHS), organized into a systematic algorithm. This approach may offer a novel way to prevent or mitigate age-related cognitive decline.
DesignPilot clinical study, active-controlled, open randomization.
SettingAdults from the general population with no clinical cognitive deterioration, recruited from three sites within the Autonomous City of Buenos Aires and its metropolitan area.
Participants120 volunteers were enrolled, of whom 76 (56 in the intervention group, 20 in the control group) met the study requirements and selected the site closest to their residence.
MethodsThe MTT24.5 program consists of 12 weekly in-person sessions (totaling 24.5 hours), during which participants learned 40 knowledge units (DATA) and 100 learning techniques (TECHS). These were organized into binomials, where each unit of DATA was paired with 3–4 TECHS. Pre- and post-intervention assessments included medical history, lifestyle factors, cognitive reserve scale, Addenbrooke’s Cognitive Examination-Revised (ACE-R), and Mini-Mental State Examination (MMSE).
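As a rough illustration of the pre/post design described above, the sketch below simulates change scores matching the reported group means and compares them with a Welch two-sample t-test; the numbers are simulated, not the study data.

```python
# Minimal sketch (simulated numbers, not the study's analysis code):
# comparing pre/post change in global cognitive score between the
# intervention and control groups with a Welch t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
change_intervention = rng.normal(loc=4.6, scale=5.0, size=56)  # hypothetical SD
change_control = rng.normal(loc=-0.5, scale=5.0, size=20)

t, p = stats.ttest_ind(change_intervention, change_control, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3g}")
```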
ResultsThe mean age was 59 years in both groups. Baseline ACE-R scores were comparable (91.3). The global cognitive score increased by 4.6 points (5%) in the intervention group compared with a decrease of 0.5 points in the control group (p
ConclusionsThe MTT24.5 program, based on a systematic algorithm for acquiring new knowledge and skills, significantly enhances cognitive reserve and overall cognitive performance, particularly in individuals with lower baseline cognitive scores. These findings suggest that structured cognitive stimulation could play a critical role in preventing cognitive decline and promoting cognitive health in healthy adults. Given the promising results, future studies involving larger populations and long-term follow-up are essential to validate these effects and explore the potential for mitigating age-related cognitive decline and enhancing quality of life.
RegistrationThe study was registered in accordance with local regulations at the National Council for Scientific and Technological Research (CONICET) – Institute of Biomedical Research (BIOMED), and also in the National Ethics Committee, and at clinicaltrials.gov (NCT06549517).
by Mireia Solé Pi, Luz A. Espino, Péter Szenczi, Marcos Rosetti, Oxána Bánszegi
A long-standing question in the study of quantity discrimination is which stimulus properties control choice. While some species have been found to discriminate based on the total amount of the stimuli, without using numerical information, others rely on numerosity rather than any continuous magnitude. Here, we tested cats, dogs, and humans using a simple two-way spontaneous choice paradigm (involving food for the first two species and images for humans) to see whether numerosity or total surface area has a greater influence on their decisions. We found that cats showed a preference for the larger amount of food when the ratio between the stimuli was 0.5, but not when it was 0.67; dogs did not differentiate between stimuli presenting the two options (smaller vs. larger amount of food) regardless of the ratio between them, but humans did so almost perfectly. When faced with two stimuli of the same area but different shapes, dogs and humans exhibited a preference for certain shapes, particularly the circle, while cats' choices appeared to be at chance level. Furthermore, cats' and dogs' reaction times were equal across conditions, while humans were quicker when choosing between stimuli in trials where the shape was the same but the surface area was different, and even more so when asked to choose between two differently sized circles. These results suggest that there is no universal rule for processing quantity; rather, quantity estimation seems to be tied to the ecological context of each species. Future work should focus on testing quantity estimation in different contexts and with different sources of motivation.
by Linlin Bao, Haibo Zhao, Haiyue Ren, Chong Wang, Su Fang
Hair follicle stem cells (HFSCs) play critical roles in adult hair regeneration, owing to their self-renewal and multipotent differentiation properties. Emerging evidence has shown that long noncoding RNAs (lncRNAs) are implicated in biological processes such as proliferation, differentiation, and apoptosis. However, the specific role of lncRNA RP11-818O24.3 in regulating HFSCs remains unclear. To explore the effect of lncRNA RP11-818O24.3 on HFSCs, stable lncRNA RP11-818O24.3 overexpression and knockdown HFSC lines were established using a lentiviral vector system. The effect of lncRNA RP11-818O24.3 on proliferation was evaluated by Cell Counting Kit-8 (CCK-8) and EdU incorporation assays. The differentiation of HFSCs into neurons and keratinocyte stem cells was detected by immunofluorescence staining. We showed that lncRNA RP11-818O24.3 overexpression promoted proliferation and inhibited apoptosis in HFSCs. High levels of lncRNA RP11-818O24.3 promoted the differentiation of HFSCs into CD34+K15+ keratinocyte progenitors and CD34+Nestin+neuron-specific enolase (NSE)+ neural stem cells. Additionally, lncRNA RP11-818O24.3 increased fibroblast growth factor 2 (FGF2) expression and the subsequent activation of the PI3K/AKT signaling pathway. These data demonstrate that lncRNA RP11-818O24.3 promotes self-renewal and differentiation and inhibits apoptosis of HFSCs via the FGF2-mediated PI3K/AKT signaling pathway, highlighting its potential as a therapeutic target for treating hair loss diseases.
by Sishir Poudel, Laxman Wagle, Tara Prasad Aryal, Binay Adhikari, Sushan Pokharel, Dipendra Adhikari, Kshitiz Bhandari, Kshitiz Rijal, Jyoti Bastola Paudel
BackgroundMultidrug-resistant tuberculosis (MDR-TB) continues to be a major public health concern, especially in high-burden countries like Nepal. While individual risk factors are known, the cumulative impact of cardiometabolic factors on MDR-TB is not well understood.
MethodsA health-facility-based, age- and sex-matched 1:2 case-control study was conducted at MDR-TB treatment centers in Gandaki Province, Nepal. MDR-TB patients (cases) and drug-sensitive tuberculosis (DS-TB) patients (controls) were enrolled. Cases were defined as adults (≥18 years) with confirmed MDR-TB; controls were adults with sputum-positive DS-TB. Data on sociodemographics, cardiometabolic risk factors (alcohol, tobacco, abnormal body mass index, hypertension, diabetes), TB literacy, and treatment history were collected using a structured, pretested questionnaire by trained medical officers. Data were analyzed using Stata v13.0. Binary logistic regression was used to assess associations between risk factors and MDR-TB. Ethical approval was obtained from the Nepal Health Research Council and written informed consent was obtained from all participants.
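The analysis described above (binary logistic regression yielding adjusted odds ratios with 95% CIs) was run in Stata; a minimal Python equivalent is sketched below. The file and variable names (mdr_tb_case_control.csv, mdr_tb, unemployed, tb_contact, and so on) are hypothetical.

```python
# Minimal sketch (hypothetical file and variable names, not the authors'
# Stata code): binary logistic regression for MDR-TB, with adjusted odds
# ratios obtained by exponentiating coefficients and CI bounds.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mdr_tb_case_control.csv")  # mdr_tb (0/1) plus covariates

fit = smf.logit(
    "mdr_tb ~ unemployed + tb_contact + alcohol + tobacco + diabetes",
    data=df,
).fit()

aor = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
aor.columns = ["AOR", "2.5%", "97.5%"]
print(aor)
```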
ResultsA total of 183 participants (61 cases, 122 controls) were included. The mean age of participants was 42.5 years (SD = 18.5); 73.8% were male. Most participants were from urban areas (74.9%), and 66.7% were unemployed. Cardiometabolic risk factors were present in 79.2% of participants. Alcohol and tobacco use were reported by 59.6% and 45.9%, respectively; 9.8% had diabetes and 7.1% had hypertension. Known TB contact and prior TB history were reported by 26.8% and 31.1%, respectively. In multivariate analysis, unemployment (AOR: 5.24, 95% CI: 1.33–20.64) and known TB contact (AOR: 8.89, 95% CI: 2.46–32.15) were significantly associated with MDR-TB. Cardiometabolic risk factors were not significantly associated.
ConclusionKnown TB contact and unemployment were significantly associated with MDR-TB, while the cumulative effect of cardiometabolic risk factors showed no significant impact, indicating that interventions should prioritize established TB-related risk factors.
by Simon Knobloch, Philipp Haul, Saskia Rusche, Heiko Paland, Darius Zokai, Moritz Haaf, Jonas Rauh, Christoph Mulert, Gregor Leicht
When confronted with dichotically presented syllables, right-handed healthy individuals tend to consciously perceive syllables presented to the right ear more often. This phenomenon, known as the right-ear advantage, is driven by delayed processing of information from the left ear in the left temporal auditory cortex due to its indirect relay through the corpus callosum. In contrast, less is known about the corresponding mechanisms for stimuli processed in the right temporal cortex. In this study, we developed a melody-based dichotic listening paradigm designed to induce a left-ear advantage. This novel paradigm, alongside a classical syllable-based paradigm, was tested in 40 healthy right-handed participants. We also examined the influence of musical education on the lateralization of auditory processing. Our results revealed a significant left-ear advantage for the perception of dichotically presented melodies and replicated established findings of a right-ear advantage for syllables. No group differences emerged between participants with and without current or past musical practice. However, among those with musical training, a greater number of years of practice was associated with a reduced right-ear advantage for syllables and an increased report of melodies presented to the left ear. These findings suggest that the left-ear advantage in dichotic perception of melodies reflects right-hemispheric processing of musical stimuli. Moreover, monitoring of the left ear seems to be altered by musical practice. Future research using neuroimaging techniques will be necessary to confirm this finding.
by Muluken Chanie Agimas, Mekuriaw Nibret Aweke, Berhanu Mengistu, Lemlem Daniel Baffa, Elsa Awoke Fentie, Ever Siyoum Shewarega, Aysheshim Kassahun Belew, Esmael Ali Muhammad
IntroductionMalaria is a global public health problem, particularly in sub-Saharan African countries, which account for roughly 90% of all malaria deaths worldwide. To reduce the impact and complications associated with delayed treatment of malaria among children under five, comprehensive evidence on the magnitude and determinants of delayed treatment is needed. However, no national-level studies are available in the Horn of Africa to inform decision-makers.
ObjectiveTo assess the prevalence and associated factors of delay in seeking malaria treatment among under-five children in the Horn of Africa.
MethodPublished and unpublished papers were searched for on Google, Google Scholar, PubMed/Medline, EMBASE, SCOPUS, and in the reference lists of published articles. The search strategy was built using Medical Subject Heading (MeSH) terms combining the key terms of the title. The Joanna Briggs Institute critical appraisal checklist was used to assess the quality of articles. A sensitivity test was conducted to evaluate the heterogeneity of the studies. Visual funnel plots and Egger's and Begg's tests under the random-effects model were used to evaluate publication bias and small-study effects. The I² statistic was used to quantify the amount of heterogeneity between the included studies.
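For readers unfamiliar with the pooling step, the sketch below implements a DerSimonian-Laird random-effects pool of study prevalences with Cochran's Q and the I² heterogeneity statistic named above; the input prevalences and sample sizes are hypothetical.

```python
# Minimal sketch (hypothetical inputs): DerSimonian-Laird random-effects
# pooling of study prevalences with Cochran's Q and I^2.
import numpy as np

p = np.array([0.35, 0.52, 0.61, 0.44])   # hypothetical study prevalences
n = np.array([420, 310, 150, 510])       # hypothetical sample sizes

var = p * (1 - p) / n                    # within-study variance
w = 1 / var                              # fixed-effect weights
p_fixed = np.sum(w * p) / np.sum(w)

Q = np.sum(w * (p - p_fixed) ** 2)       # Cochran's Q
df = len(p) - 1
I2 = max(0.0, (Q - df) / Q) * 100        # I^2 (%)

# DerSimonian-Laird between-study variance, then random-effects weights
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (var + tau2)
p_re = np.sum(w_re * p) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled = {p_re:.2f} "
      f"(95% CI {p_re - 1.96*se_re:.2f}-{p_re + 1.96*se_re:.2f}), I^2 = {I2:.0f}%")
```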
ResultsThe pooled prevalence of delayed treatment for malaria among under-five children in the Horn of Africa was 48% (95% CI: 34%–63%). History of child death (OR = 2.5, 95% CI: 1.73–3.59), distance >3,000 meters (OR = 2.59, 95% CI: 2.03–3.3), drug side effects (OR = 2.94, 95% CI: 1.86–4.67), formal education (OR = 0.69, 95% CI: 0.49–0.96), middle income (OR = 0.42, 95% CI: 0.28–0.63), expensiveness of transport (OR = 4.39, 95% CI: 2.49–7.76), and affordability of transport cost (OR = 2.13, 95% CI: 1.41–3.2) were factors associated with malaria treatment delay among children.
Conclusion and recommendationsAbout one in two parents in the Horn of Africa delayed seeking malaria treatment for their children. High transportation costs, long travel distances (greater than 3,000 meters) to medical facilities, and concern about drug side effects were major risk factors contributing to this delay. On the other hand, a middle-class income was found to be protective against treatment delay. These results highlight how crucial it is to improve access to healthcare services, both financially and physically, to minimize delays in treating malaria in the area's children.
by Tadesse Tarik Tamir, Berhan Tekeba, Alebachew Ferede Zegeye, Deresse Abebe Gebrehana, Mulugeta Wassie, Gebreeyesus Abera Zeleke, Enyew Getaneh Mekonen
IntroductionSolitary childbirth—giving birth without any form of assistance—remains a serious global public health issue, especially in low-resource settings. It is associated with preventable maternal complications such as hemorrhage and sepsis, and poses significant risks to newborns, including birth asphyxia, infection, and early neonatal death. In Ethiopia, where many births occur outside health facilities, understanding the spatial and socio-demographic patterns of solitary childbirth is vital for informing targeted interventions to improve maternal and child health outcomes. This study aims to identify and map the spatial distribution of solitary childbirth across Ethiopia and to analyze its determinants using data from the 2019 national Interim Demographic and Health Survey.
MethodWe analyzed data from the 2019 Interim Ethiopian Demographic and Health Survey to determine the spatial distribution and determinants of solitary birth in Ethiopia. A total weighted sample of 3,884 women was included in the analysis. Spatial analysis was used to determine the regional distribution of solitary birth, and multilevel logistic regression was employed to identify its determinants. ArcGIS 10.8 was used for the spatial analysis, and Stata 17 was used for the multilevel analysis. Fixed effects were reported as adjusted odds ratios with 95% confidence intervals.
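A minimal sketch of the regression step follows. It fits a logistic model for solitary birth and exponentiates coefficients into adjusted odds ratios; for simplicity it uses cluster-robust standard errors as a stand-in for the multilevel (random-intercept) model the authors describe, and all file and variable names are hypothetical.

```python
# Minimal sketch (hypothetical variables; cluster-robust SEs used as a
# simpler stand-in for the multilevel model described above).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical columns: solitary (0/1), education, anc_visit, region_type, cluster
df = pd.read_csv("edhs_2019.csv")

m = smf.logit("solitary ~ C(education) + C(anc_visit) + C(region_type)", data=df)
res = m.fit(cov_type="cluster", cov_kwds={"groups": df["cluster"]})

aor = np.exp(pd.concat([res.params, res.conf_int()], axis=1))
aor.columns = ["AOR", "2.5%", "97.5%"]
print(aor)
```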
ResultThe prevalence of solitary childbirths in Ethiopia was 12.73%, with a 95% confidence interval spanning from 11.71% to 13.81%. The western and southern parts of Oromia, all of Benishangul-Gumuz, most parts of the SNNPR, and the west of Amhara regions were hotspot areas for solitary birth. Having no formal education, not attending ANC visits, and residing in pastoral regions were significantly associated with higher odds of solitary birth in Ethiopia.
ConclusionA notable proportion of women experience childbirth alone, a significant aspect of maternal health in the country that reflects both challenges and improvements in childbirth practices. The distribution of solitary births exhibited spatial clustering, with hotspot areas located in the western and southern parts of Oromia, all of Benishangul-Gumuz, most parts of the SNNPR, and the west of the Amhara region. Lack of formal education, not attending ANC visits, and residing in pastoral regions were significant determinants of solitary birth. The implementation of maternal and child health strategies in Ethiopia could benefit from considering the hotspot areas and determinants of solitary birth.
by Ismat Tasnim, Md. Asif Iqbal, Ismat Ara Begum, Mohammad Jahangir Alam, Morten Graversgaard, Paresh Kumar Sarma, Kiril Manevski
Food insecurity and agricultural vulnerability in South Asia, including Bangladesh, pose significant threats to the well-being and livelihoods of its people. Building adaptive capacities and resilient food systems is crucial for sustainable livelihoods. This study employs the Resilience Index Measurement and Analysis II (RIMA-II) framework to construct a Resilience Capacity Index (RCI) and analyze its relationship with food security using data from the Bangladesh Integrated Household Survey 2018. The study applies Exploratory Factor Analysis and Structural Equation Modeling to examine the impact of key resilience components, such as Access to Basic Services, Adaptive Capacity, and Assets, on household resilience. The findings reveal that access to basic services, land assets, and farm equipment positively influence households' resilience capacity. However, livestock assets have a negative impact, potentially due to market volatility, climate vulnerability, and disease outbreaks. Additionally, adaptive capacity has a positive but insignificant influence on the RCI, suggesting that without enhanced economic opportunities, institutional support, and inclusive development strategies, adaptive capacity may not be enough to foster resilience. Resilience capacity, in turn, enhances food security metrics such as the Food Consumption Score and food expenditure. These findings underscore the importance of policies that focus on increasing and maintaining access to basic services, promoting sustainable land management practices, and strengthening social safety nets. The study also emphasizes the importance of supporting livestock assets, by stabilizing the livestock market, improving veterinary services, and providing subsidies to reduce maintenance costs, to ensure their sustainability.
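The first stage of the RIMA-II-style approach, deriving latent resilience pillars from observed household indicators, can be sketched with an exploratory factor analysis as below; the indicator columns and file name are hypothetical stand-ins for the BIHS 2018 variables.

```python
# Minimal sketch (hypothetical indicators, not the authors' model):
# exploratory factor analysis over household indicators to derive latent
# resilience pillars, in the spirit of the RIMA-II two-stage approach.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

df = pd.read_csv("bihs_2018_indicators.csv")              # hypothetical input
pillars = ["abs1", "abs2", "ast1", "ast2", "ac1", "ac2"]  # hypothetical indicators

fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
scores = fa.fit_transform(df[pillars])        # latent pillar scores per household

loadings = pd.DataFrame(fa.components_.T, index=pillars)
print(loadings.round(2))
```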
by Denis Sereno, Tahar Kernif, Renato Leon, Kholoud Kahime, Souad Guernaoui, Chaymaa Harkat, Mario J. Grijalva, Omar Hamarsheh, Anita G. Villacis, Bachir Medrouh, Thiago Vasconcelos Dos Santos, Razika Beniklef, Naouel Eddaikra, Phlippe Holzmuller
IntroductionLeishmaniases are vector-borne parasitic diseases with diverse clinical manifestations involving multiple Leishmania species and animal hosts. While most leishmaniasis cases are caused by a few well-characterized Leishmania species, reports describe infections by unconventional or emerging Leishmania taxa, atypical clinical presentations of classical species, and occurrences of atypical Leishmania in animal hosts. These underrecognized infections present diagnostic and therapeutic challenges and are rarely reflected in surveillance systems or clinical guidelines. A systematic mapping of this evolving landscape is needed to guide future diagnostics, policy, and research priorities.
Methods and analysisFollowing the Joanna Briggs Institute (JBI) methodology and PRISMA-ScR guidelines, we will search PubMed, Embase, Cochrane Library (CENTRAL), PROSPERO, Web of Science, and Global Index Medicus, as well as relevant grey literature. Eligible studies will include human cases with clinical presentations that diverge from those typically associated with well-characterized Leishmania species, reports involving unconventional or emerging Leishmania species, and animal cases of veterinary relevance caused by non-classical species, regardless of study design. Dual independent screening of records and data extraction using a standardized charting form will be conducted. Discrepancies between reviewers will be resolved by consensus. Data will be summarized descriptively through tables, figures, and thematic synthesis. Research gaps will be identified to inform future studies and public health strategies.
DisseminationThis review will use data from published sources and findings will be disseminated through publication in a peer-reviewed journal, presentations at scientific conferences, and sharing with relevant stakeholders. The results are intended to inform clinicians, researchers, and policymakers about the evolving landscape of leishmaniasis and to highlight priorities for future research and surveillance.
by Changseok Lee, Liam Redden, Vivian Eng, Brennan Eadie
PurposeTo investigate the luminance capacity and achievable threshold levels of commercially employed virtual reality (VR) devices for visual field testing.
MethodsThis two-part study included (1) a literature review of VR headsets used for perimetry, with luminance data extracted from technical specifications in publications and from manufacturers; and (2) an empirical evaluation of the three most commonly employed VR headsets in the literature using a custom virtual testing environment.
ResultsThe three most commonly employed VR devices for visual field testing were the Pico Neo, Oculus Quest, and HTC Vive. The maximum reported luminance was 250 cd/m² for the HTC Vive Pro. Information on how luminance was measured was not consistently available; where reported, only handheld luminance meters were used. Empirical measurements show that handheld luminance meters significantly overestimate luminance compared with standard spectroradiometers. Measured luminance varies significantly with aperture size and decreases for peripheral stimuli out to 30 degrees of eccentricity. Assuming a conventional background of 10 cd/m², the best performance, with the lowest achievable threshold, was the HTC Vive at 16 dB, corresponding to a luminance of 80 cd/m² centrally. The Oculus Quest 2 and Pico Neo 3 had a minimum threshold of 20 dB.
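The link between the reported luminances and dB thresholds follows the conventional perimetric decibel scale, in which 0 dB corresponds to a maximum stimulus of 10,000 apostilbs (about 3,183 cd/m²) and the threshold is the attenuation, 10·log10 of the max-to-stimulus ratio. A small sketch, assuming that convention:

```python
# Minimal sketch assuming the conventional Humphrey perimetric scale:
# 0 dB stimulus = 10,000 apostilbs; dB = attenuation from that maximum.
import math

MAX_ASB = 10_000.0

def cdm2_to_asb(l_cdm2: float) -> float:
    """Convert luminance from cd/m^2 to apostilbs (1 cd/m^2 = pi asb)."""
    return l_cdm2 * math.pi

def luminance_to_db(l_cdm2: float) -> float:
    """Perimetric attenuation in dB relative to the 10,000 asb maximum."""
    return 10 * math.log10(MAX_ASB / cdm2_to_asb(l_cdm2))

# 80 cd/m^2 (reported HTC Vive maximum central stimulus) -> ~16 dB
print(round(luminance_to_db(80), 1))   # 16.0
# 10 cd/m^2 conventional background, for reference: ~25 dB
print(round(luminance_to_db(10), 1))   # 25.0
```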
ConclusionCommercially available VR devices do not meet the luminance requirements or threshold sensitivities for visual field testing. Current VR technology is not designed for, nor has the capacity for, thresholding at mid-to-low dB ranges, which limits accuracy in diagnosing and monitoring defects seen in glaucoma. Translational Relevance: This study highlights the technical limitations of current commercially available VR devices for visual field testing and significant sources of variability in evaluating the luminance performance of these devices.
by Mario Gómez-Martínez, Greta Arias-Merino, Juan Benito-Lozano, Ana Villaverde-Hueso, Renata Linertová, Verónica Alonso-Ferreira
Inherited Epidermolysis Bullosa (EB) is a group of rare genetic skin diseases characterized by extreme fragility of the skin and mucous membranes, leading to blistering and wounds in response to minimal trauma or friction. These clinical manifestations significantly reduce health-related quality of life (HRQoL). The objective of this protocol article is to describe the methods planned for assessing the measurement properties of HRQoL instruments specifically developed for EB patients of all age groups through a systematic review and meta-analysis. The protocol follows the Preferred Reporting Items for Systematic Reviews and Meta-analyses Protocols (PRISMA-P) guideline. The literature search will be conducted in PubMed, Web of Science (WOS) and EMBASE, using terminology that aligns with the four key elements of the COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) research question (construct, target population, measurement properties, and type of PROM), as well as the terminology proposed by COSMIN for measurement properties. Studies that include information on measurement properties (specifically, validity and/or reliability) in a sample of patients with inherited EB will be selected. Both title and abstract screening and full-text review will be conducted by two independent reviewers using the Rayyan tool. In addition, the risk of bias will be assessed using the COSMIN Risk of Bias checklist. The data from each study and each measurement property will be summarized in accordance with the COSMIN guidelines. The evidence gathered will be used to appraise the measurement properties of HRQoL instruments used in EB patients, and the limitations of the future systematic review will be discussed. Ultimately, the results of the future systematic review will help develop more personalized guidelines for the assessment of HRQoL in EB patients of all age groups. The protocol is registered in OSF with registration number vrm87: https://osf.io/vrm87/
by Drew Gorenz, Norbert Schwarz
People hear jokes, live and pre-recorded, in a variety of settings, from comedy clubs, bars, outdoor venues, and cafes to their own home or car. While much research has analyzed the significance of the content of jokes, we know less about the significance of the setting in which one hears them. Some settings have interfering background noise or poor acoustics, reducing an audience's ease of processing the jokes they hear. Would this affect how funny the jokes seem? Two experiments with audio clips of stand-up comedy performances show that participants found jokes less funny when background noise interfered with their listening.
by Juliana Rodrigues Tovar Garbin, Franciéle Marabotti Costa Leite, Ana Paula Brioschi dos Santos, Larissa Soares Dell'Antonio, Cristiano Soares da Silva Dell'Antonio, Luís Carlos Lopes-Júnior
A comprehensive understanding of the factors influencing the epidemiological dynamics of COVID-19 across the pandemic waves, particularly in terms of disease severity and mortality, is critical for optimizing healthcare services and prioritizing high-risk populations. Here we aimed to analyze the factors associated with short-term and prolonged hospitalization for COVID-19 during the first three pandemic waves. We conducted a retrospective observational study using data from individuals reported in the e-SUS-VS system who were hospitalized for COVID-19 in a southeastern state of Brazil. Hospitalization duration was classified as short or prolonged based on a 7-day cutoff, corresponding to the median length of hospital stay during the second pandemic wave. Bivariate analyses were performed using the chi-square test for heterogeneity. Logistic regression models were used to estimate odds ratios (ORs) and their respective 95% confidence intervals (CIs), with statistical significance set at 5%. When analyzing hospitalization duration across the three waves, we found that 51.1% (95% CI: 49.3–53) of hospitalizations in the first wave were prolonged. In contrast, short-duration hospitalizations predominated in the second (54.7%; 95% CI: 52.4–57.0) and third (51.7%; 95% CI: 50.2–53.2) waves. Factors associated with prolonged hospitalization varied by wave. During the first wave, older adults (≥60 years) (OR = 1.67; 95% CI: 1.35–2.06), individuals with ≥10 symptoms (OR = 2.03; 95% CI: 1.04–3.94), obese individuals (OR = 2.0; 95% CI: 1.53–2.74), and those with ≥2 comorbidities (OR = 2.22; 95% CI: 1.71–2.89) were more likely to experience prolonged hospitalization. In the second wave, the likelihood of extended hospital stays was higher among individuals aged ≥60 years (OR = 2.04; 95% CI: 1.58–2.62) and those with ≥2 comorbidities (OR = 1.77; 95% CI: 1.29–2.41). In the third wave, prolonged hospitalization was more frequent among older adults (OR = 1.89; 95% CI: 1.65–2.17), individuals with 5–9 symptoms (OR = 1.52; 95% CI: 1.20–1.92), obese individuals (OR = 2.2; 95% CI: 1.78–2.73), and those with comorbidities (OR = 1.45; 95% CI: 1.22–1.72 and OR = 2.0; 95% CI: 1.69–2.45). In conclusion, we identified variations in hospitalization patterns across the pandemic waves, although the differences were relatively subtle. These variations likely reflect gradual shifts in the risk factors associated with prolonged hospital stays. Our findings highlight the importance of implementing targeted public health interventions, particularly those designed to reduce disease severity and improve clinical outcomes among vulnerable populations at greater risk of extended hospitalization.
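The bivariate step described above (a chi-square test of heterogeneity across waves) can be sketched as follows; the counts are hypothetical, chosen only to echo the reported short/prolonged proportions.

```python
# Minimal sketch (hypothetical counts): chi-square test of heterogeneity
# in short vs. prolonged hospitalization across the three pandemic waves.
from scipy.stats import chi2_contingency

#          short  prolonged
table = [[1000, 1045],   # wave 1 (~51.1% prolonged)
         [1094,  906],   # wave 2 (~54.7% short)
         [1034,  966]]   # wave 3 (~51.7% short)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```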
by Esther Ba-Iredire, James Atampiiga Avoka, Luke Abanga, Abigail Awaitey Darkie, Emmanuel Junior Attombo, Eric Agboli
IntroductionThe alarming global rate of drug-resistant tuberculosis (DR-TB) is a threat to treatment success among positive tuberculosis (TB) cases. Studies determining the prevalence and trend of DR-TB and the socio-demographic and clinical risk factors contributing to DR-TB in the four regions of Ghana are currently lacking. This study sought to determine the prevalence and trend of DR-TB, identify socio-demographic and clinical risk factors that influence DR-TB, and analyse the relationships between underweight status, adverse drug reactions, and treatment outcomes among DR-TB patients in four regions of Ghana.
MethodThis was a retrospective review covering five years, from January 2018 to the end of December 2022. The data were retrieved from the DR-TB registers and folders at the Directly Observed Treatment (DOT) centres in the four regions. Analysis of the data was conducted using STATA version 17.
ResultsThe prevalence of DR-TB for the year 2022 was 10.1% in Ashanti, 5.3% in Eastern, 27.8% in Central, and 2.7% in the Upper West region. The overall prevalence of DR-TB for the period 2018–2022 was 13.8%. The socio-demographic and clinical risk factors that influence DR-TB in the four regions are: age, marital status (aOR 3.58, P-value
ConclusionThe study shows that the prevalence of DR-TB in Ghana is low, probably not because cases have decreased but because of inadequate GeneXpert machines to detect them. Age, marital status, education, alcohol intake, previously treated TB, adverse drug reactions, underweight status, and treatment outcome are factors influencing the development of DR-TB. Therefore, interventions aimed at improving the nutritional status of DR-TB cases and minimising adverse drug reactions will improve treatment outcomes.
by Cheyenne R. Wagi, Renee McDowell, Anyssa Wright, Kathleen L. Egan, Christina S. Meade, April M. Young, Madison N. Enderle, Angela T. Estadt, Kathryn E. Lancaster
BackgroundHepatitis C virus (HCV) infection and injection drug use among young women are rising dramatically in the rural United States. From 2004 to 2017, heroin use among non-pregnant women increased 22.4% biennially, mirroring increases in HCV cases, especially among younger populations. Young women who inject drugs (YWID, ages 18–35) face elevated HCV risk due to biological, behavioral, and socio-cultural factors. Barriers to HCV testing and treatment services further delay diagnoses, fuel transmission, and limit access to harm reduction services. This study applies the Theoretical Domains Framework (TDF) to identify factors influencing HCV testing and treatment among YWID in rural Appalachian Ohio.
MethodsWe conducted in-depth interviews with YWID (n = 30) in 2023 to understand their HCV testing and treatment experiences in rural Appalachian Ohio. Interviews were transcribed, inductively coded, and analyzed using grounded theory. Identified themes were mapped onto the TDF domains.
ResultsKey TDF domains influencing HCV care included knowledge, beliefs about consequences, and intentions. While YWID knew where to get tested, they expressed uncertainty about treatment value and access while actively using drugs. Social influences, stigma, and mistreatment by healthcare providers created barriers to treatment. Environmental context and resources, such as transportation, also influenced access to care.
ConclusionsYWID in rural Appalachia face barriers to HCV care, such as gaps in knowledge about HCV treatment, which are compounded by gendered stigma and logistical challenges. Rapidly changing treatment restrictions led to misinformation about treatment access. These gaps highlight the need for interventions specifically designed to address the lived experiences of YWID.
by Juan P. Hernández, Fredy Mesa, Andre J. Riveros
Honey bees (Apis mellifera) are essential pollinators threatened by sublethal effects of pesticides such as imidacloprid, a widely used neonicotinoid that disrupts the central nervous system. However, many of its systemic effects are poorly understood, especially those on the physiological homeostasis of the honey bee. We evaluated the effects of oral administration of imidacloprid and the flavonol rutin on the properties of the extracellular fluid (ECF) in Apis mellifera. We measured water content, evaporation rate, electrical impedance, and ion mobility of the ECF. Our results show impacts of imidacloprid consumption, including decreased water content, slowed evaporation, and altered electrical characteristics of the thorax segment, all of which suggest disruption of osmotic and electrochemical balance. Notably, rutin consumption partially mitigated the effects of imidacloprid in a dose-dependent manner, consistent with enhanced detoxification. Our results indicate that imidacloprid alters ionic and osmotic homeostasis beyond neural targets and that rutin may protect against these disruptions through physiological mechanisms beyond neuroprotection. These findings highlight new alternatives for protecting pollinators via dietary strategies.
by Faith Morley, Lauren Mount, Anjile An, Erica Phillips, Rulla M. Tamimi, Kevin H. Kensler
The rising prevalence of individuals reporting extreme stress has major public health implications, as it increases vulnerability to accelerated premature biological aging and thus the risk of chronic disease. To examine the impact of stress on premature biological aging, we assessed the association between exposure to increased stress, quantified by the Perceived Stress Scale, and odds of high allostatic load (AL). To illuminate previously unexplored socio-contextual factors, we controlled for self-reported individual and neighborhood social determinants of health, including discrimination, loneliness, food insecurity, neighborhood disorder, and neighborhood social cohesion. We used a cross-sectional design to examine the association between perceived stress and AL among 7,415 participants ages 18–65 in the All of Us Research Program, who enrolled from 2017–2022. We used logistic regression to evaluate the association between stress and high AL, controlling for sociodemographic factors and self-reported social determinants of health. Participants who were younger, receiving Medicaid, or Hispanic had an increased prevalence of high stress. High stress was associated with elevated odds of high AL in age- and sex-adjusted models (OR = 2.18, 95% CI = 1.78, 2.66, high stress vs. low), an association that remained significant after adjusting for social determinants of health (OR = 1.29, 95% CI = 1.01, 1.65). Using restricted cubic splines, high stress was significantly associated with increased odds of high AL, even after controlling for upstream individual and neighborhood-level determinants of health. While individuals living below the median poverty-to-income ratio showed little appreciable association between high stress and increased odds of high allostatic load, those living above the median poverty-to-income ratio who reported increased stress appeared to have increased odds of high allostatic load. By addressing the upstream factors causing undue burdens of stress, which particularly affect marginalized communities and younger generations, we can begin to address premature biological aging and the comorbid conditions it accompanies.
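The restricted-cubic-spline model mentioned above can be sketched in Python using patsy's natural cubic spline basis inside a statsmodels logit formula; the data file and variable names (allostatic_load.csv, high_al, stress, age, sex) are hypothetical.

```python
# Minimal sketch (hypothetical file and variable names): logistic model of
# high allostatic load with a restricted (natural) cubic spline for the
# perceived stress score, via patsy's cr() inside a statsmodels formula.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("allostatic_load.csv")  # high_al (0/1), stress, age, sex

fit = smf.logit("high_al ~ cr(stress, df=4) + age + C(sex)", data=df).fit()
print(fit.summary())

# Predicted probability of high AL across the observed stress range
grid = pd.DataFrame({
    "stress": np.linspace(df["stress"].min(), df["stress"].max(), 25),
    "age": df["age"].median(),
    "sex": df["sex"].mode()[0],
})
print(fit.predict(grid))
```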
by Esther Mofiyinfoluwa Ola, Temitope Helen Balogun, Rasheed Olayinka Isijola, Oluwaremilekun Grace Ajakaye
Parasitic infections are a major cause of morbidity and mortality in Nigeria, with malaria and schistosomiasis having the highest burden. This study investigated the prevalence of malaria, urogenital schistosomiasis, and co-infections, and their impact on the nutritional status of schoolchildren in two communities in Ondo State. A total of 185 participants from Ipogun and Oke Igbo were screened for malaria and schistosomiasis using the ParaHit malaria rapid diagnostic test kit and urine microscopy. Anthropometric measurements were used to assess the nutritional status of the participants. A higher prevalence of malaria was recorded in Oke Igbo, with 36 individuals (57.1%), compared with 60 individuals (49.2%) in Ipogun. Urogenital schistosomiasis was also more prevalent in Oke Igbo, affecting 18 individuals (28.6%), while only 5 individuals (4.1%) were affected in Ipogun. Co-infection with both diseases was more common in Oke Igbo, with 13 cases (20.6%), compared with 4 cases (3.3%) in Ipogun. However, malnutrition rates were similar between the two communities, with 60 cases (77.9%) in Ipogun and 28 cases (75.5%) in Oke Igbo. Notably, participants with either malaria or urogenital schistosomiasis, as well as those co-infected, exhibited a higher frequency of chronic malnutrition. The likelihood of co-infection was significantly associated with gender and locality: individuals in Oke Igbo were 0.78 times less likely to be co-infected (P = 0.00; CI = 0.09–0.49), while males were 2.19 times more likely to have co-infections (P = 0.02; CI = 1.13–4.28). This study emphasised the significant health burden posed by malaria and urogenital schistosomiasis co-infections among schoolchildren in Ondo State, highlighting the need for comprehensive health and nutritional interventions to address the challenges associated with these parasitic diseases.
by Okkeun Jung, Angelene Soto, Andrew L. Wolfe, Shahana S. Mahajan
KRAS mutations, which induce proliferative signaling driving many human cancers, are detectable in a small subset of osteosarcoma patients. The recently developed pan-KRAS inhibitor daraxonrasib, also known as RMC-6236, is capable of targeting a wide array of KRAS mutations and shows promise against pancreatic and lung cancers. However, the efficacy and mechanisms of action of daraxonrasib in osteosarcoma (OS) remain unclear. We evaluated the effects of daraxonrasib on the viability, proliferation, and metastatic potential of wild-type and KRAS-mutant OS cells. We assayed the effects of treatment on downstream targets using qPCR, immunoblotting, and activity assays to explore the underlying mechanism by which daraxonrasib selectively suppresses the metastatic potential of KRAS-mutant osteosarcoma. Finally, we investigated how an increased abundance of GTP-bound KRAS enhances the sensitivity of KRAS wild-type osteosarcoma cells to daraxonrasib, using siRNA targeting RASA1. Daraxonrasib selectively attenuated the proliferation and migratory ability of KRAS-mutant HOS-143B cells without affecting KRAS wild-type controls. Additionally, daraxonrasib suppressed the expression of the matrix metalloproteases MMP9 and MMP1, which promote cell motility and metastasis. Daraxonrasib selectively inhibited the AKT/ETS1 pathway in HOS-143B cells, whereas no such effect was observed in HOS cells. HOS cells were sensitized to daraxonrasib by knocking down the GTPase-activating protein RASA1. In osteosarcoma, KRAS inhibition decreased MMP1, MMP9, and AKT/ETS1 signaling. Daraxonrasib is a promising agent for treating osteosarcoma with KRAS mutations.