FreshRSS

PLOS ONE Medicine & Health

Contemporary national outcomes of hyperbaric oxygen therapy in necrotizing soft tissue infections

by William Toppen, Nam Yong Cho, Sohail Sareh, Anders Kjellberg, Anthony Medak, Peyman Benharash, Peter Lindholm

Background

The role of hyperbaric oxygen therapy (HBOT) in necrotizing soft tissue infections (NSTI) is based mainly on small retrospective studies. A previous study using the 1998–2009 National Inpatient Sample (NIS) found HBOT to be associated with decreased mortality in NSTI. Given advancements in critical care since then, we aimed to investigate the continued role of HBOT in NSTI.

Methods

The 2012–2020 National Inpatient Sample (NIS) was queried for NSTI admissions undergoing surgery. A total of 60,481 patients were included, 600 (

Results

Age, gender, and comorbidities were similar between the two groups. On bivariate comparison, the HBOT group had a lower mortality rate (

Conclusions

After correction for differences, HBOT was associated with decreased mortality, amputations, and non-home discharges in NSTI, with the tradeoff of increased costs and length of stay.

Retinoid orphan receptor gamma t (rorγt) promotes inflammatory eosinophilia but is dispensable for innate immune-mediated colitis

by Alvaro Torres-Huerta, Katelyn Ruley-Haase, Theodore Reed, Antonia Boger-May, Derek Rubadeux, Lauren Mayer, Arpitha Mysore Rajashekara, Morgan Hiller, Madeleine Frech, Connor Roncagli, Cameron Pedersen, Mary Catherine Camacho, Lauren Hollmer, Lauren English, Grace Kane, David L. Boone

Inflammatory bowel diseases (IBD) result from uncontrolled inflammation in the intestinal mucosa leading to damage and loss of function. Both innate and adaptive immunity contribute to the inflammation of IBD, and innate and adaptive immune cells reciprocally activate each other in a forward feedback loop. To better understand innate immune contributions to IBD, we developed a model of spontaneous, 100% penetrant, early-onset colitis that occurs in the absence of adaptive immunity by crossing villin-TNFAIP3 mice to RAG1-/- mice (TRAG mice). This model is driven by microbes and features increased levels of innate lymphoid cells in the intestinal mucosa. To investigate the role of type 3 innate lymphoid cells (ILC3) in the innate colitis of TRAG mice, we crossed them to retinoid orphan receptor gamma t deficient (Rorγt-/-) mice. Rorγt-/- x TRAG mice exhibited markedly reduced eosinophilia in the colonic mucosa, but colitis persisted in these mice. Colitis in Rorγt-/- x TRAG mice was characterized by increased infiltration of the intestinal mucosa by neutrophils, inflammatory monocytes, macrophages and other innate cells. RNA and cellular profiles of Rorγt-/- x TRAG mice were consistent with a lack of ILC3 and ILC3-derived cytokines, reduced antimicrobial factors, increased activation of epithelial repair processes and reduced activation of epithelial cell STAT3. The colitis in Rorγt-/- x TRAG mice was ameliorated by antibiotic treatment, indicating that microbes contribute to the ILC3-independent colitis of these mice. Together, these gene expression and cell signaling signatures reflect the double-edged sword of ILC3 in the intestine, inducing both proinflammatory and antimicrobial protective responses. Thus, Rorγt promotes eosinophilia, but Rorγt and Rorγt-dependent ILC3 are dispensable for the innate colitis in TRAG mice.

Perceptions of diabetes risk and prevention in Nairobi, Kenya: A qualitative and theory of change development study

by Anthony Muchai Manyara, Elizabeth Mwaniki, Jason M. R. Gill, Cindy M. Gray

Background

Type 2 diabetes is increasing in Kenya, especially in urban settings, and prevention interventions based on local evidence and context are urgently needed. Therefore, this study aimed to explore diabetes risk and co-create a diabetes prevention theory of change in two socioeconomically distinct communities to inform future diabetes prevention interventions.

Methods

In-depth interviews were conducted with middle-aged residents in two communities in Nairobi (one low-income (n = 15), one middle-income (n = 14)), and thematically analysed. The theory of change for diabetes prevention was informed by analysis of the in-depth interviews and the Behaviour Change Wheel framework, and reviewed by a sub-set (n = 13) of interviewees.

Results

The key factors that influenced diabetes preventive practices in both communities included knowledge and skills for diabetes prevention, understanding of the benefits/consequences of (un)healthy lifestyle, social influences (e.g., upbringing, societal perceptions), and environmental contexts (e.g., access to (un)healthy foods and physical activity facilities). The proposed strategies for diabetes prevention included: increasing knowledge and understanding about diabetes risk and preventive measures particularly in the low-income community; supporting lifestyle modification (e.g., upskilling, goal setting, action planning) in both communities; identifying people at high risk of diabetes through screening in both communities; and creating social and physical environments for lifestyle modification (e.g., positive social influences on healthy living, access to healthy foods and physical activity infrastructure) particularly in the low-income community. Residents from both communities agreed that the strategies were broadly feasible for diabetes prevention but proposed the addition of door-to-door campaigns and community theatre for health education. However, residents from the low-income community were concerned about the lack of government prioritisation for implementing population-level interventions, e.g., improving access to healthy foods and physical activity facilities/infrastructure.

Conclusion

Diabetes prevention initiatives in Kenya should involve multicomponent interventions for lifestyle modification, including increased education and upskilling at the individual level and the promotion of social and physical environments that support healthy living at the population level; such interventions are particularly needed in low-income communities.

Biomechanical comparison of two surgical methods for Hallux Valgus deformity: Exploring the use of artificial neural networks as a decision-making tool for orthopedists

by Katarzyna Kaczmarczyk, Maria Zakynthinaki, Gabor Barton, Mateusz Baran, Andrzej Wit

Hallux Valgus foot deformity affects gait performance. Common treatment options include distal oblique metatarsal osteotomy and chevron osteotomy. Nonetheless, the current process of selecting the appropriate osteotomy method poses potential biases and risks, due to its reliance on subjective human judgment and interpretation. The inherent variability among clinicians, the potential influence of individual clinical experiences, or inherent measurement limitations may contribute to inconsistent evaluations. To address this, incorporating objective tools like neural networks, renowned for effective classification and decision-making support, holds promise in identifying optimal surgical approaches. The objective of this cross-sectional study was twofold. Firstly, it aimed to investigate the feasibility of classifying patients based on the type of surgery. Secondly, it sought to explore the development of a decision-making tool to assist orthopedists in selecting the optimal surgical approach. To achieve this, gait parameters of twenty-three women with moderate to severe Hallux Valgus were analyzed. These patients underwent either distal oblique metatarsal osteotomy or chevron osteotomy. The parameters exhibiting differences in preoperative and postoperative values were identified through various statistical tests such as normalization, Shapiro-Wilk, non-parametric Wilcoxon, Student t, and paired difference tests. Two artificial neural networks were constructed for patient classification based on the type of surgery and to simulate an optimal surgery type considering postoperative walking speed. The results of the analysis demonstrated a strong correlation between surgery type and postoperative gait parameters, with the first neural network achieving a remarkable 100% accuracy in classification. Additionally, cases were identified where there was a mismatch with the surgeon’s decision. Our findings highlight the potential of artificial neural networks as a complementary tool for surgeons in making informed decisions. Addressing the study’s limitations, future research may investigate a wider range of orthopedic procedures, examine additional gait parameters and use more diverse and extensive datasets to enhance statistical robustness.
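
For readers curious how such a small-cohort classifier might be set up, here is a minimal, purely illustrative sketch (not the authors' network): a feed-forward model trained on synthetic gait features and scored with leave-one-out cross-validation. All feature values, labels, and hyperparameters below are invented placeholders.

```python
# Illustrative only: synthetic stand-in for 23 patients x a handful of gait parameters.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(23, 6))      # e.g., walking speed, cadence, stance time (placeholders)
y = rng.integers(0, 2, size=23)   # 0 = distal oblique metatarsal osteotomy, 1 = chevron

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
# Leave-one-out cross-validation is a common choice for very small cohorts.
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.2f}")
```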

Deep learning-based correction of cataract-induced influence on macular pigment optical density measurement by autofluorescence spectroscopy

by Akira Obana, Kibo Ote, Yuko Gohto, Hidenao Yamada, Fumio Hashimoto, Shigetoshi Okazaki, Ryo Asaoka

Purpose

Measurements of macular pigment optical density (MPOD) using autofluorescence spectroscopy yield underestimations of actual values in eyes with cataracts. Previously, we proposed a correction method for this error using deep learning (DL); however, the correction performance was validated only through internal cross-validation. This cross-sectional study aimed to validate this approach using an external validation dataset.

Methods

MPODs at 0.25°, 0.5°, 1°, and 2° eccentricities and macular pigment optical volume (MPOV) within 9° eccentricity were measured using SPECTRALIS (Heidelberg Engineering, Heidelberg, Germany) in 197 eyes (training dataset, inherited from our previous study) and 157 eyes (validation dataset) before and after cataract surgery. A DL model was trained to predict the corrected value from the pre-operative value using the training dataset, and we measured the discrepancy between the corrected value and the actual postoperative value. Subsequently, the prediction performance was validated using the validation dataset.
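
As a rough sketch of the training/external-validation split described above (and not the authors' actual deep-learning model), one could fit a regressor that maps pre-operative measurements to post-operative values and report the mean absolute error on the external set; the data, network size, and simulated cataract effect below are assumptions.

```python
# Synthetic illustration: map pre-operative MPOD values to "true" post-operative values.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
pre_train = rng.uniform(0.1, 0.9, size=(197, 4))                     # 4 eccentricities, training eyes
post_train = pre_train * 1.2 + rng.normal(0, 0.02, pre_train.shape)  # simulated post-surgery values

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(pre_train, post_train)

pre_val = rng.uniform(0.1, 0.9, size=(157, 4))                       # external validation eyes
post_val = pre_val * 1.2 + rng.normal(0, 0.02, pre_val.shape)
print("MAE:", mean_absolute_error(post_val, model.predict(pre_val)))
```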

Results

Using the validation dataset, the mean absolute values of errors for MPOD and MPOV corrected using DL ranged from 8.2 to 12.4%, which were lower than values with no correction (P

Conclusion

The usefulness of the DL correction method was validated. Deep learning reduced the error for images of relatively good autofluorescence image quality; poor-quality images were not corrected.

Institutional capacity assessment in the lens of implementation research: Capacity of the local institutions in delivering WASH services at Cox’s Bazar district, Bangladesh

by Mahbubur Rahman, Mahbub-Ul Alam, Sharmin Khan Luies, Sharika Ferdous, Zahidul Mamun, Musarrat Jabeen Rahman, Debashish Biswas, Tazrina Ananya, Asadullah, Abul Kamal, Ritthick Chowdhury, Eheteshamul Russel Khan, Dara Johnston, Martin Worth, Umme Farwa Daisy, Tanvir Ahmed

Background

The influx of Forcibly Displaced Myanmar Nationals (FDMNs) has left the southeastern coastal district of Cox’s Bazar with one of the greatest contemporary humanitarian crises, stressing the existing water, sanitation, and hygiene (WASH) resources and services. This study aimed to assess the existing capacity of local institutions involved in delivering WASH services and identify relevant recommendations for intervention strategies.

Methods

We used a qualitative approach, including interviews and capacity assessment workshops with institutions engaged in WASH service delivery. We conducted five key informant interviews (KIIs) with sub-district level officials of the Department of Public Health Engineering (DPHE), Directorate General of Health Services (DGHS), Directorate of Primary Education (DPE) and Bangladesh Rural Advancement Committee (BRAC) to gain a general understanding of WASH service mechanisms. Seven capacity assessment workshops were organized with the relevant district and sub-district level stakeholders from August 2019 to September 2019. These workshops covered three key areas: i) knowledge of policy, organizational strategy, guidelines, and framework; ii) institutional arrangements for service delivery such as planning, implementation, coordination, monitoring, and reporting; and iii) availability and management of human, financial and infrastructural resources. Data were categorized using thematic content analysis.

Results

The majority of stakeholders lacked awareness of national WASH policies. Furthermore, top-down planning approaches resulted in activities that were not context-specific, and a lack of coordination between multiple institutions compromised optimal WASH service delivery at the local level. WASH stakeholders also identified shortages of human resources for delivering sustainable WASH services, inadequate supervision, and insufficient evaluation of activities as areas requiring improvement.

Conclusion

Research evidence suggests that decision-makers, donors, and development partners should consider learning from the WASH implementers and stakeholders about their existing capacity, gaps, and opportunities before planning for any WASH intervention in any particular area.

Inflammatory markers in World Trade Center workers with asthma: Associations with post-traumatic stress disorder

by Juan P. Wisnivesky, Nikita Agrawal, Jyoti Ankam, Adam Gonzalez, Alex Federman, Steven B. Markowitz, Janette M. Birmingham, Paula J. Busse

Background

Post-traumatic stress disorder (PTSD) is associated with worse asthma outcomes in individuals exposed to the World Trade Center (WTC) site.

Research question

Do WTC workers with coexisting PTSD and asthma have a specific inflammatory pattern that underlies the relationship with increased asthma morbidity?

Study design and methods

We collected data on a cohort of WTC workers with asthma recruited from the WTC Health Program. Diagnosis of PTSD was ascertained with a Structured Clinical Interview for DSM-5 (Diagnostic and Statistical Manual of Mental Disorders), and the severity of PTSD symptoms was assessed with the PTSD Checklist 5 (PCL-5). We obtained blood and sputum samples to measure cytokine levels in study participants.

Results

Of the 232 WTC workers with a diagnosis of asthma in the study, 75 (32%) had PTSD. PTSD was significantly associated with worse asthma control (p = 0.002) and increased resource utilization (p = 0.0002). There was no significant association (p>0.05) between most blood or sputum cytokines and PTSD diagnosis or PCL-5 scores in either unadjusted or adjusted analyses.

Interpretation

Our results suggest that PTSD is not associated with blood and sputum inflammatory markers in WTC workers with asthma. These findings suggest that other mechanisms likely explain the association between PTSD and asthma control in WTC exposed individuals.

College preparation for a medical career in the United States

by Madelyn Malvitz, Noreen Khan, Lewis B. Morgenstern

Purpose

A college degree is required to enter medical school in the United States. A remarkably high percentage of students entering college have pre-medical aspirations but relatively few end up as medical students. As an “applied science”, education about medicine is usually thought to be beyond the purview of a liberal arts curriculum. Students therefore receive little education about a medical career, or information about the many alternative careers in health science. Instead, they take courses for Medical College Admission Test (MCAT) preparation and medical school application prerequisites in biology, chemistry, physics, and math. These classes give them little insight into a real medical career. The current report considers this mismatch between student needs in health science and available resources in colleges across the United States.

Methods

A Collective Case Series framework was used to obtain qualitative data. Key informant interviews were requested from a convenience sample of representatives from 20 colleges, with six colleges providing extensive data. Three institutions collected data specifically on students who entered college interested in a career as a physician.

Results

At these schools, between one-quarter and one-half of students who said they were interested in medicine at the beginning of college ended up not applying to medical school. At each of the six schools, we saw a wide range of generally sparse academic and professional advising involvement and a very limited number of classes that discussed concepts directly related to careers in health science.

Conclusions

Based on these data, we provide a novel conceptual model as a potential, testable solution to the problem of an underexposed and unprepared student population interested in medicine. This includes a brief series of courses intended to inform students about what a career in medicine would fully entail and to help foster the core competencies of empathy, compassion and resilience.

Caffeinated beverages intake and risk of deep vein thrombosis: A Mendelian randomization study

by Tong Lin, Haiyan Mao, Yuhong Jin

This study aimed to explore the potential link between coffee and tea consumption and the risk of deep vein thrombosis (DVT) through Mendelian randomization (MR) analysis. We identified 33 single nucleotide polymorphisms (SNPs) as instrumental variables (IVs) for coffee intake and 38 SNPs for tea intake. The investigation employed the inverse-variance weighted (IVW) method to evaluate the causal impact of beverage consumption on DVT risk. Additionally, MR-Egger and MR-PRESSO tests were conducted to assess pleiotropy, while Cochran’s Q test gauged heterogeneity. Robustness was assessed through a leave-one-out analysis. The MR analysis uncovered a significant association between coffee intake and an increased risk of DVT (odds ratio [OR] 1.008, 95% confidence interval [CI] = 1.001–1.015, P = 0.025). Conversely, no substantial causal effect of tea consumption on DVT was observed (OR 1.001, 95% CI = 0.995–1.007, P = 0.735). Importantly, no significant levels of heterogeneity, pleiotropy, or bias were detected in the instrumental variables used. In summary, our findings suggest a modestly heightened risk of DVT associated with coffee intake, while tea consumption did not exhibit a significant impact on DVT risk.
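
For orientation, the fixed-effect inverse-variance weighted estimator combines per-SNP Wald ratios weighted by the precision of the SNP-outcome associations. The sketch below uses entirely synthetic summary statistics and is not the authors' pipeline; in practice dedicated packages such as TwoSampleMR in R are typically used.

```python
# Minimal fixed-effect IVW estimate from synthetic SNP-level summary statistics.
import numpy as np

rng = np.random.default_rng(2)
beta_exposure = rng.normal(0.05, 0.01, 33)                       # SNP -> coffee intake effects
beta_outcome = 0.008 * beta_exposure + rng.normal(0, 0.002, 33)  # SNP -> DVT effects (log-odds)
se_outcome = np.full(33, 0.002)                                  # SEs of the SNP-outcome betas

weights = beta_exposure**2 / se_outcome**2   # inverse-variance weights for the Wald ratios
ratios = beta_outcome / beta_exposure        # per-SNP Wald ratio estimates
beta_ivw = np.sum(weights * ratios) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))
print(f"IVW estimate: {beta_ivw:.4f} (SE {se_ivw:.4f})")
```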

Potential efficacy of caffeine ingestion on balance and mobility in patients with multiple sclerosis: Preliminary evidence from a single-arm pilot clinical trial

by Afsoon Dadvar, Melika Jameie, Mehdi Azizmohammad Looha, Mohammadamin Parsaei, Meysam Zeynali Bujani, Mobina Amanollahi, Mahsa Babaei, Alireza Khosravi, Hamed Amirifard

Objectives

Caffeine’s potential benefits on multiple sclerosis (MS), as well as on the ambulatory performance of non-MS populations, prompted us to evaluate its potential effects on balance, mobility, and health-related quality of life (HR-QoL) of persons with MS (PwMS).

Methods

This single-arm pilot clinical trial consisted of a 2-week placebo run-in and a 12-week caffeine treatment (200 mg/day) stage. The changes in outcome measures during the study period (weeks 0, 2, 4, 8, and 12) were evaluated using the Generalized Estimation Equation (GEE). The outcome measures were the 12-item Multiple Sclerosis Walking Scale (MSWS-12) for self-reported ambulatory disability, Berg Balance Scale (BBS) for static and dynamic balance, Timed Up and Go (TUG) for dynamic balance and functional mobility, Multiple Sclerosis Impact Scale (MSIS-29) for patient’s perspective on MS-related QoL (MS-QoL), and Patients’ Global Impression of Change (PGIC) for subjective assessment of treatment efficacy. GEE was also used to evaluate age and sex effect on the outcome measures over time. (Iranian Registry of Clinical Trials, IRCT2017012332142N1).
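
A hedged sketch of what a GEE fit over the repeated visits might look like using the statsmodels formula interface; the simulated data, covariates, outcome name (bbs), and exchangeable working correlation below are illustrative assumptions, not the trial's actual specification.

```python
# Simulated repeated-measures data: 30 subjects observed at weeks 0, 2, 4, 8, 12.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.genmod.cov_struct import Exchangeable

rng = np.random.default_rng(3)
n, visits = 30, [0, 2, 4, 8, 12]
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), len(visits)),
    "week": np.tile(visits, n),
    "age": np.repeat(rng.normal(39, 10, n), len(visits)),
    "female": np.repeat(rng.integers(0, 2, n), len(visits)),
})
df["bbs"] = 45 + 0.3 * df["week"] + rng.normal(0, 2, len(df))   # placeholder balance score

model = smf.gee("bbs ~ week + age + female", groups="subject", data=df,
                cov_struct=Exchangeable(), family=sm.families.Gaussian())
print(model.fit().summary())
```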

Results

Thirty PwMS were included (age: 38.89 ± 9.85, female: 76.7%). Daily caffeine consumption significantly improved the objective measures of balance and functional mobility (BBS; P-value

Conclusions

Caffeine may enhance balance, functional mobility, and QoL in PwMS. Being male was associated with a sharper increase in self-reported ambulatory disability over time. The effects of aging on balance get more pronounced over time.

Trial registration

This study was registered with the Iranian Registry of Clinical Trials (Registration number: IRCT2017012332142N1), a Primary Registry in the WHO Registry Network.

Landscape use by large grazers in a grassland is restructured by wildfire

by Aishwarya Subramanian, Rachel M. Germain

Animals navigate landscapes based on perceived risks vs. rewards, as inferred from features of the landscape. In the wild, how strongly animal movement is directed by landscape features is difficult to ascertain, but widespread disturbances such as wildfires can serve as natural experiments. We tested the hypothesis that wildfires homogenize the risk/reward landscape, causing movement to become less directed, given that fires reduce landscape complexity as habitat structures (e.g., tree cover, dense brush) are burned. We used satellite imagery of a research reserve in Northern California to count and categorize paths made primarily by mule deer (Odocoileus hemionus) in grasslands. Specifically, we compared pre-wildfire (August 2014) and post-wildfire (September 2018) image history layers among locations that were or were not impacted by wildfire (i.e., a Before/After Control/Impact design). Wildfire significantly altered spatial patterns of deer movement: more new paths were gained and more old paths were lost in areas of the reserve that were impacted by wildfire, and movement patterns became less directed in response to fire, suggesting that the risk/reward landscape became more homogenous, as hypothesized. Our evidence suggests that wildfire affects deer populations at spatial scales beyond the scale of direct impact and raises the interesting possibility that deer perceive risks and rewards at different spatial scales. In conclusion, our study provides an example of how animals integrate spatial information from the environment to make movement decisions, setting the stage for future work on the broader ecological implications for populations, communities, and ecosystems, an emerging interest in ecology.

Physical activity and cognitive function in adults born very preterm or with very low birth weight–an individual participant data meta-analysis

by Kristina Anna Djupvik Aakvik, Silje Dahl Benum, Marjaana Tikanmäki, Petteri Hovi, Katri Räikkönen, Sarah L. Harris, Lianne J. Woodward, Brian A. Darlow, Marit S. Indredavik, Stian Lydersen, Paul Jarle Mork, Eero Kajantie, Kari Anne I. Evensen

Objective

Individuals born very preterm (

Study design

Cohorts with data on physical activity and cognitive function in adults born very preterm/very low birth weight and term-born controls were recruited from the Research on European Children and Adults Born Preterm, and the Adults Born Preterm International Collaboration Consortia. A systematic literature search was performed in PubMed and Embase.

Results

Five cohorts with 1644 participants aged 22–28 years (595 very preterm/very low birth weight and 1049 controls) were included. Adults born very preterm/very low birth weight reported 1.11 (95% CI: 0.68 to 1.54) hours less moderate to vigorous physical activity per week than controls, adjusted for cohort, age and sex. The difference between individuals born very preterm/very low birth weight and controls was larger among women than among men. Neither intelligence quotient nor self-reported executive function mediated the association between very preterm/very low birth weight and moderate to vigorous physical activity. Results were essentially the same when we excluded individuals with neurosensory impairments.
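
To make the adjusted group comparison concrete, here is a simplified sketch on simulated data (not the consortium's individual-participant-data model): an ordinary least squares fit of weekly moderate-to-vigorous physical activity on group, cohort, age and sex. Variable names and effect sizes are placeholders.

```python
# Simulated adjusted comparison of weekly MVPA hours between groups.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1644
df = pd.DataFrame({
    "preterm": rng.integers(0, 2, n),    # 1 = very preterm / very low birth weight
    "cohort": rng.integers(0, 5, n),
    "age": rng.uniform(22, 28, n),
    "female": rng.integers(0, 2, n),
})
df["mvpa_hours"] = 6 - 1.1 * df["preterm"] + rng.normal(0, 3, n)

fit = smf.ols("mvpa_hours ~ preterm + C(cohort) + age + female", data=df).fit()
print(fit.params["preterm"], fit.conf_int().loc["preterm"].tolist())   # estimate and 95% CI
```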

Conclusion

Adults born very preterm/very low birth weight, especially women, reported less moderate to vigorous physical activity than their term-born peers. Cognitive function did not mediate this association. Considering the risk of adverse health outcomes among individuals born preterm, physical activity could be a target for intervention.

Developing a PRogram to Educate and Sensitize Caregivers to Reduce the Inappropriate Prescription Burden in the Elderly with Alzheimer’s Disease (D-PRESCRIBE-AD): Trial protocol and rationale of an open-label, pragmatic, prospective randomized controlled trial

by Sonal Singh, Noelle M. Cocoros, Xiaojuan Li, Kathleen M. Mazor, Mary T. Antonelli, Lauren Parlett, Mark Paullin, Thomas P. Harkins, Yunping Zhou, Paula A. Rochon, Richard Platt, Inna Dashevsky, Carly Massino, Cassandra Saphirak, Sybil L. Crawford, Jerry H. Gurwitz

Context

Potentially inappropriate prescribing of medications in older adults, particularly those with dementia, can lead to adverse drug events including falls and fractures, worsening cognitive impairment, emergency department visits, and hospitalizations. Educational mailings from health plans to patients and their providers to encourage deprescribing conversations may represent an effective, low-cost, “light touch” approach to reducing the burden of potentially inappropriate prescription use in older adults with dementia.

Objectives

The objective of the Developing a PRogram to Educate and Sensitize Caregivers to Reduce the Inappropriate Prescription Burden in Elderly with Alzheimer’s Disease (D-PRESCRIBE-AD) trial is to evaluate the effect of a health plan-based, multifaceted educational outreach intervention for community-dwelling patients with dementia who are currently prescribed sedative/hypnotics, antipsychotics, or strong anticholinergics.

Methods

D-PRESCRIBE-AD is an open-label, pragmatic, prospective randomized controlled trial (RCT) comparing three arms: 1) educational mailing to both the health plan patient and their prescribing physician (patient plus physician arm, n = 4814); 2) educational mailing to the prescribing physician only (physician-only arm, n = 4814); and 3) usual care (n = 4814) among patients with dementia enrolled in two large United States-based health plans. The primary outcome is the absence of any dispensing of the targeted potentially inappropriate prescription during the 6-month study observation period after a 3-month blackout period following the mailing. Secondary outcomes include dose reduction, polypharmacy, healthcare utilization, mortality and therapeutic switching within targeted drug classes.

Conclusion

This large pragmatic RCT will contribute to the evidence base on promoting deprescribing of potentially inappropriate medications among older adults with dementia. If successful, such light-touch, inexpensive and highly scalable interventions have the potential to reduce the burden of potentially inappropriate prescribing for patients with dementia. ClinicalTrials.gov Identifier: NCT05147428.

Knowledge, attitudes, and practices regarding antibiotic use in Bangladesh: Findings from a cross-sectional study

by Md. Abu Raihan, Md. Saiful Islam, Shariful Islam, A. F. M. Mahmudul Islam, Khandaker Tanveer Ahmed, Tania Ahmed, Md. Nahidul Islam, Shamsunnahar Ahmed, Mysha Samiha Chowdhury, Dipto Kumar Sarker, Anika Bushra Lamisa

Background

Escalating antibiotic resistance presents a notable worldwide dilemma, with substantial involvement of the general population. The objective of this study was to assess knowledge, attitudes, and practices regarding the utilization of antibiotics among Bangladeshi residents.

Methods

A cross-sectional study, conducted from January 01 to April 25, 2022, included 1,947 Bangladeshi adults with a history of antibiotic use, via online surveys and face-to-face interviews using a pretested semi-structured questionnaire. Descriptive statistics, Chi-square tests, and multivariate linear regression models were employed.
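
As a small illustration of the multivariable linear regression mentioned here (coefficients appear in the Results below), a sketch on simulated data with a couple of the reported predictors follows; the data, coding of education, and variable names are assumptions rather than the study's model.

```python
# Simulated knowledge-score regression with a few illustrative predictors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1947
df = pd.DataFrame({
    "unmarried": rng.integers(0, 2, n),
    "edu_college": rng.integers(0, 2, n),
    "edu_bachelor": rng.integers(0, 2, n),
})
df["knowledge"] = (6.0 + 0.10 * df["unmarried"] + 0.09 * df["edu_college"]
                   + 0.22 * df["edu_bachelor"] + rng.normal(0, 1.2, n))

fit = smf.ols("knowledge ~ unmarried + edu_college + edu_bachelor", data=df).fit()
print(fit.params)   # standardized betas would additionally require scaling the variables
```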

Results

Mean scores for knowledge, attitudes, and practices were 6.59±1.20, 8.34±1.19, and 12.74±2.59, with correct rates of 73.22%, 92.67%, and 57.91%. Positive predictors for knowledge included being unmarried (β = 0.10, p = 0.001), higher education (College: β = 0.09, p = 0.025; Bachelor: β = 0.22, p

Conclusions

Participants exhibited adequate knowledge and positive attitudes but lagged behind in proper practice of antibiotic use. Proper initiatives should be tailored to enhance prudent antibiotic use and mitigate the risk of antimicrobial resistance.

Time to tuberculosis development and its predictors among HIV-positive patients: A retrospective cohort study

by Abraham Teka Ajema, Yilkal Simachew, Meiraf Daniel Meshesha, Taye Gari

Objectives

To assess the incidence and predictors of time to tuberculosis (TB) development among human immunodeficiency virus (HIV)-positive patients attending follow-up care in health facilities of Hawassa, Ethiopia.

Methods

We conducted a retrospective cohort study from April 1–30, 2023. A total of 422 participants were selected using a simple random sampling method. Data were collected from the medical records of patients enrolled between January 1, 2018 and December 31, 2022, using the Kobo Toolbox. We used the Statistical Package for the Social Sciences (SPSS) version 26.0 for data analysis. To estimate the duration of TB-free survival, we applied the Kaplan-Meier survival function and fitted Cox proportional hazards models to identify the predictors of time to TB development. Adjusted hazard ratios (AHR) with 95% confidence intervals were calculated, and statistical significance was declared at a P-value below 0.05.
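
For readers who want to see the survival workflow in code, here is an illustrative sketch on simulated data using the Python lifelines package (the study itself used SPSS): a Kaplan-Meier estimate of TB-free time and a Cox proportional hazards model. Column names and values are placeholders.

```python
# Simulated cohort: TB-free time in months, event indicator, and two binary predictors.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(6)
n = 422
df = pd.DataFrame({
    "time_months": rng.exponential(30, n).clip(1, 60),
    "tb_event": rng.integers(0, 2, n),
    "tpt_completed": rng.integers(0, 2, n),
    "art_over_12m": rng.integers(0, 2, n),
})

kmf = KaplanMeierFitter().fit(df["time_months"], df["tb_event"])
print("Median TB-free time:", kmf.median_survival_time_)

cph = CoxPHFitter().fit(df, duration_col="time_months", event_col="tb_event")
cph.print_summary()   # hazard ratios for TPT completion and ART duration (placeholders)
```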

Results

The overall incidence rate of TB among HIV-positive patients was 6.26 (95% CI: 4.79–8.17) per 100 person-years (PYs). Patients who did not complete TB Preventive Therapy (TPT) were more likely to have TB than those who did (AHR = 6.2, 95% CI: 2.34–16.34). In comparison to those who began antiretroviral therapy (ART) within a week, those who began after a week of linkage had a lower risk of TB development (AHR = 0.44, 95% CI: 0.21–0.89). Patients who received ART for six to twelve months (AHR = 0.18, 95% CI: 0.05–0.61) and for twelve months or longer (AHR = 0.004, 95% CI: 0.001–0.02) exhibited a decreased risk of TB development in comparison to those who had ART for less than six months.

Conclusion

The incidence of TB among HIV-positive patients is still high. To alleviate this burden, special attention should be given to regimen optimization and the provision of adherence support for better completion of TPT, sufficient patient preparation, thorough clinical evaluation for major opportunistic infections (OIs) prior to starting ART, and ensuring retention on ART.

Allogeneic limbo-deep anterior lamellar keratoplasty (Limbo-DALK)—A novel surgical technique in corneal stromal disease and limbal stem cell deficiency

by Verena Schöneberger, Volkan Tahmaz, Mario Matthaei, Sigrid Roters, Simona L. Schlereth, Friederike Schaub, Claus Cursiefen, Björn O. Bachmann

Purpose

To describe a novel corneal surgical technique combining Deep Anterior Lamellar Keratoplasty (DALK) with grafting of allogeneic limbus (Limbo-DALK) for the treatment of eyes with corneal stromal pathology and limbal stem cell deficiency (LSCD).

Methods

Clinical records of six Limbo-DALKs performed in five patients diagnosed with LSCD and corneal stromal pathology requiring keratoplasty were retrospectively reviewed. All patients were diagnosed with LSCD due to various pathologies, including thermal and chemical burns, congenital aniridia or chronic inflammatory ocular surface disease. Parameters analysed included demographics, diagnoses, clinical history, thickness measurements using anterior segment OCT, visual acuity, and epithelial status. Regular follow-up visits were scheduled at 6 weeks and at 3, 6, 9, 12, and 18 months postoperatively. Main outcome measures were time to graft epithelialisation and the occurrence of corneal endothelial decompensation.

Results

Two grafts showed complete epithelial closure at 2 days, two at 14 days. In one eye, complete epithelial closure was not achieved after the first Limbo-DALK, but was achieved one month after the second Limbo-DALK. No endothelial decompensation occurred except in one patient with silicone oil associated keratopathy. Endothelial graft rejection was not observed in any of the grafts.

Conclusion

Based on the data from this pilot series, limbo-DALK appears to be a viable surgical approach for eyes with severe LSCD and corneal stromal pathology, suitable for emergency situations (e.g. corneal ulceration with impending corneal perforation), while minimising the risk of corneal endothelial decompensation.

Mental health and risk of death and hospitalization in COVID–19 patients. Results from a large-scale population-based study in Spain

by Aida Moreno-Juste, Beatriz Poblador-Plou, Cristina Ortega-Larrodé, Clara Laguna-Berna, Francisca González-Rubio, Mercedes Aza-Pascual-Salcedo, Kevin Bliek-Bueno, María Padilla, Concepción de-la-Cámara, Alexandra Prados-Torres, Luis A. Gimeno-Feliú, Antonio Gimeno-Miguel

The COVID–19 pandemic has created unprecedented challenges for health care systems globally. This study aimed to explore the presence of mental illness in a Spanish cohort of COVID-19-infected population and to evaluate the association between the presence of specific mental health conditions and the risk of death and hospitalization. This is a retrospective cohort study including all individuals with confirmed infection by SARS-CoV-2 from the PRECOVID (Prediction in COVID–19) Study (Aragon, Spain). Mental health illness was defined as the presence of schizophrenia and other psychotic disorders, anxiety, cognitive disorders, depression and mood disorders, substance abuse, and personality and eating disorders. Multivariable logistic regression models were used to examine the likelihood of 30-day all-cause mortality and COVID–19 related hospitalization based on baseline demographic and clinical variables, including the presence of specific mental conditions, by gender. We included 144,957 individuals with confirmed COVID–19 from the PRECOVID Study (Aragon, Spain). The most frequent diagnosis in this cohort was anxiety. However, some differences were observed by sex: substance abuse, personality disorders and schizophrenia were more frequently diagnosed in men, while eating disorders, depression and mood, anxiety and cognitive disorders were more common among women. The presence of mental illness, specifically schizophrenia spectrum and cognitive disorders in men, and depression and mood disorders, substance abuse, anxiety and cognitive and personality disorders in women, increased the risk of mortality or hospitalization after COVID–19, in addition to other well-known risk factors such as age, morbidity and treatment burden. Identifying vulnerable patient profiles at risk of serious outcomes after COVID–19 based on their mental health status will be crucial to improve their access to the healthcare system and the establishment of public health prevention measures for future outbreaks.
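
A minimal, sex-stratified logistic-regression sketch in the spirit of the analysis described, fitted to simulated data; the outcome and covariate names are invented for illustration and this is not the PRECOVID model.

```python
# Simulated 30-day mortality models, fitted separately for men and women.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 5000
df = pd.DataFrame({
    "died_30d": rng.integers(0, 2, n),
    "age": rng.normal(60, 15, n),
    "anxiety": rng.integers(0, 2, n),
    "depression": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})

for sex, sub in df.groupby("female"):
    fit = smf.logit("died_30d ~ age + anxiety + depression", data=sub).fit(disp=False)
    print("women:" if sex else "men:", np.exp(fit.params).round(3))   # odds ratios
```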

Factors affecting the efficiency of equine embryo transfer (EET) in polo mares under subtropical conditions of Pakistan

by Khalid Mahmood, Aijaz Ali Channa, Aamir Ghafoor, Amjad Riaz

Equine embryo transfer (EET) is a prominent technology in the equine breeding industry, and its efficacy is affected by a number of factors. The current study aimed to determine the effects of the breed of donor/recipient mares, estrus/ovulation induction treatment, cooled transportation of embryos, and synchrony between donor and recipient mares on the efficiency of EET under subtropical conditions of Pakistan. A total of eighty-four (n = 84) polo-playing donor mares (Argentino-polo = 41 and Anglo-Arab = 43) and seventy (n = 70) recipient mares (light breed = 26 and heavy breed = 44) were used for EET. Donor mares exhibiting natural estrus (n = 28) were detected by a teaser stallion, and mares with a corpus luteum (CL) (n = 56) were treated with prostaglandin (150 μg of Cloprostenol) for estrus induction. The mares’ follicular growth was monitored through ultrasonography until the dominant follicle’s size reached 35 mm or more with a moderate to obvious uterine edema score. Afterward, the mares were treated either with GnRH, i.e., 50 μg of Lecirelin acetate (n = 41), or Ovusyn, i.e., 1500 IU hCG (n = 43). Insemination with chilled semen was performed 24 hours later. Embryos were collected non-surgically, 7 or 8 days after ovulation, from the donor mares. The collected embryos were transferred into well-synchronized recipient mares as fresh (n = 44) or chilled (n = 26) embryos. Pregnancy after ET was checked through ultrasonography. Statistical analysis revealed that the embryo recovery rate (ERR) remained significantly higher (P0.05) affect the ERR. There was no significant effect of the type (fresh vs chilled), classification, or stage of development of the embryo on pregnancy outcomes (P>0.05). ET pregnancy rate was significantly affected by the breed of recipient mares and ovulation synchrony between donor and recipient mares (P

Ceragenin-mediated disruption of Pseudomonas aeruginosa biofilms

by Urszula Wnorowska, Dawid Łysik, Ewelina Piktel, Magdalena Zakrzewska, Sławomir Okła, Agata Lesiak, Jakub Spałek, Joanna Mystkowska, Paul B. Savage, Paul Janmey, Krzysztof Fiedoruk, Robert Bucki

Background

Microbial biofilms, as a hallmark of cystic fibrosis (CF) lung disease and other chronic infections, remain a desirable target for antimicrobial therapy. These biopolymer-based viscoelastic structures protect pathogenic organisms from immune responses and antibiotics. Consequently, treatments directed at disrupting biofilms represent a promising strategy for combating biofilm-associated infections. In CF patients, the viscoelasticity of biofilms is determined mainly by their polymicrobial nature and species-specific traits, such as Pseudomonas aeruginosa filamentous (Pf) bacteriophages. Therefore, we examined the impact of microbicidal ceragenins (CSAs), supported by the mucolytic agents DNase I and poly-aspartic acid (pASP), on the viability and viscoelasticity of mono- and bispecies biofilms formed by Pf-positive and Pf-negative P. aeruginosa strains co-cultured with Staphylococcus aureus or Candida albicans.

Methods

The in vitro antimicrobial activity of ceragenins against P. aeruginosa in mono- and dual-species cultures was assessed by determining the minimum inhibitory concentration (MIC) and minimum bactericidal/fungicidal concentration (MBC/MFC). Inhibition of P. aeruginosa mono- and dual-species biofilm formation by ceragenins alone and in combination with DNase I or poly-aspartic acid (pASP) was estimated by the crystal violet assay. Additionally, the viability of the biofilms was measured by colony-forming unit (CFU) counting. Finally, the biofilms’ viscoelastic properties, characterized by the shear storage (G′) and loss (G″) moduli, were analyzed with a rotational rheometer.
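
For context on the rheological readouts named above: in small-amplitude oscillatory shear, the storage and loss moduli are the real and imaginary parts of the complex shear modulus, and their ratio gives the loss tangent. These are standard definitions, not results specific to this study.

```latex
G^{*}(\omega) = G'(\omega) + i\,G''(\omega),
\qquad
\tan\delta = \frac{G''(\omega)}{G'(\omega)}
```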

Results

Our results demonstrated that ceragenin CSA-13 inhibits biofilm formation and increases its fluidity regardless of the Pf-profile and species composition; however, the Pf-positive biofilms are characterized by elevated viscosity and elasticity parameters.

Conclusion

Due to its microbicidal and viscoelasticity-modifying properties, CSA-13 displays therapeutic potential in biofilm-associated infections, especially when combined with mucolytic agents.

Infiltration of CD3+ and CD8+ lymphocytes in association with inflammation and survival in pancreatic cancer

by Gerik W. Tushoski-Alemán, Kelly M. Herremans, Patrick W. Underwood, Ashwin Akki, Andrea N. Riner, Jose G. Trevino, Song Han, Steven J. Hughes

Background

Pancreatic ductal adenocarcinomas (PDAC) have heterogeneous tumor microenvironments relatively devoid of infiltrating immune cells. We aimed to quantitatively assess infiltrating CD3+ and CD8+ lymphocytes in a treatment-naïve patient cohort and assess associations with overall survival and microenvironment inflammatory proteins.

Methods

Tissue microarrays were immunohistochemically stained for CD3+ and CD8+ lymphocytes and quantitatively assessed using QuPath. Levels of inflammation-associated proteins were quantified by multiplexed, enzyme-linked immunosorbent assay panels on matching tumor and tissue samples.

Results

Our findings revealed a significant increase in both CD3+ and CD8+ lymphocyte populations in PDAC compared with non-PDAC tissue, except when comparing CD8+ percentages in PDAC versus intraductal papillary mucinous neoplasms (IPMN) (p = 0.5012). Patients with quantitatively assessed CD3+ low tumors (lower 50%) had shorter survival (median 273 days) compared to CD3+ high tumors (upper 50%), with a median overall survival of 642.5 days (p = 0.2184). Patients with quantitatively assessed CD8+ low tumors had significantly shorter survival (median 240 days) compared to CD8+ high tumors, with a median overall survival of 1059 days (p = 0.0003). Of 41 proteins assessed in the inflammation assay, higher levels of IL-1B and IL-2 were significantly associated with decreased CD3+ infiltration (r = -0.3704, p = 0.0187, and r = -0.4275, p = 0.0074, respectively). Higher levels of IL-1B were also significantly associated with decreased CD8+ infiltration (r = -0.4299, p = 0.0045), but IL-2 was not (r = -0.0078, p = 0.9616). Principal component analysis of the inflammatory analytes showed diverse inflammatory responses in PDAC.
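
As a small illustration of how a marker-infiltration correlation like those above is computed (the abstract does not state whether a Pearson or rank-based coefficient was used; Spearman is shown as one plausible choice), on entirely synthetic values:

```python
# Synthetic example: correlate a cytokine level with CD3+ density across samples.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
il1b = rng.lognormal(0, 1, 40)                                     # placeholder cytokine levels
cd3_density = 500 - 80 * np.log(il1b + 1) + rng.normal(0, 50, 40)  # placeholder cell densities

r, p = spearmanr(il1b, cd3_density)
print(f"r = {r:.3f}, p = {p:.4f}")
```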

Conclusion

In this work, we found a marked heterogeneity in infiltrating CD3+ and CD8+ lymphocytes and individual inflammatory responses in PDAC. Future mechanistic studies should explore personalized therapeutic strategies to target the immune and inflammatory components of the tumor microenvironment.
