FreshRSS


Transcriptome profiling indicates varied gene responses to Pasteurella multocida mutant infections in cattle

by Hao Ma, Fred M. Tatum, Robert E. Briggs, Rohana P. Dassanayake, Tasia M. Kendrick, Eduardo Casas

Pasteurella multocida is a pathogen that causes bovine respiratory disease, and the development of an effective vaccine is important for improving animal health. Live-attenuated vaccines induce a long-lasting immune response with minimal side effects. The objective of this study was to evaluate potential live vaccine candidates from three P. multocida mutants, produced by separately disrupting the genes for filamentous hemagglutinin 2 (fhaB2), the hydrogenase-1 operon (hyaE), and n-acylneuraminate-9-phosphatase (nanP) in a serogroup 3 strain (P1062, WT), using clinical testing and transcriptome analysis. Challenge with WT and the three mutants conferred protection against P. multocida, with fewer lung lesions (4.7–6.2%) compared with 22.4% in the sham group. Transcriptome analysis identified 807 differentially expressed protein-coding transcripts (DETs) in blood and 6473 DETs in liver across comparisons among the sham, WT, and mutant groups. In total, 15 and 64 differentially expressed microRNAs (DEmiRNAs) and 12 and 74 differentially expressed long non-coding RNAs (DElncRNAs) were identified in blood and liver, respectively. The DEmiRNAs were not significantly associated with the DETs within each comparison. DElncRNAs were associated with 12 and 170 DETs in blood and liver, respectively. The greatest number of unique DETs was found between the hyaE and sham groups in the liver, which agreed with the low colonization rate in the nares and palatine tonsils. For the DETs between sham and WT, the under-enriched gene ontology terms in blood were all included among those in liver for the DETs identified by WT vs. sham, nanP vs. sham, and hyaE vs. sham, and were related to signaling pathways, stimuli, and sensory perception in biological processes, with the molecular function of olfactory receptor activity. The number of identified DETs, decreased percentage of lung lesions, and colonization rates indicate that fhaB2 could be a promising vaccine candidate.

Prevalence and determinants of assistive device use among older adults in India: a cross-sectional analysis of a nationally representative survey

By: Ravi R., Olickal J. J., Adoor A., Sireesha V. N., Devasia J., Thankappan K. R.
Objectives

To estimate the prevalence and identify the determinants of assistive device usage in daily life among older adults in India.

Design

Cross-sectional analysis of nationally representative survey data.

Setting

India

Participants

A total of 66 316 adults aged ≥45 years with complete information on assistive device use from Wave 1 of the Longitudinal Ageing Study in India, 2017–2018.

Primary and secondary outcome measures

The primary outcome was self-reported use of any assistive device, including visual, hearing, mobility or other assistive devices. There were no predefined secondary outcome measures. Sociodemographic and health-related variables were analysed as covariates to assess factors associated with assistive device use.

Results

The prevalence of assistive device use was 38.61% (95% CI: 37.73% to 39.50%). Use increased with age, from 34.48% among adults aged 45–59 years to 52.07% among those aged ≥75 years (adjusted prevalence ratios (aPR) 1.30; 95% CI: 1.25 to 1.35). Prevalence was higher among men (40.94%) than women (37.51%) (aPR 1.06; 95% CI: 1.03 to 1.09), among individuals with education above primary level (54.28%) compared with those with up to primary education (28.35%) (aPR 1.42; 95% CI: 1.36 to 1.48), and among urban residents (53.88%) vs rural residents (31.16%) (aPR 1.18; 95% CI: 1.14 to 1.22). A clear socioeconomic gradient was observed, with prevalence increasing from 27.65% in the poorest to 50.66% in the richest wealth quintile (aPR 1.32; 95% CI: 1.25 to 1.39). Assistive device use was higher among participants with chronic conditions (47.30%) than those without (28.16%) (aPR 1.15; 95% CI: 1.11 to 1.19) and was markedly higher among those with a prior eye or vision diagnosis (64.93%) compared with those without (14.61%) (aPR 3.94; 95% CI: 3.78 to 4.11). Among users, spectacles or contact lenses were most common (89.26%), followed by walking sticks or walkers (11.62%) and dentures (6.15%). State-level prevalence varied widely, ranging from 71.27% in Goa to 13.44% in Arunachal Pradesh.

Conclusion

Assistive device use was reported by less than half of Indian adults aged ≥45 years. The findings reveal clear socioeconomic and geographic inequities in access to assistive devices, with substantially lower use among older adults with less education, those in poorer wealth quintiles and rural residents. These disparities highlight the need for equity-focused interventions that improve accessibility to assistive devices, particularly for socially and economically disadvantaged groups and individuals with chronic conditions.

Transscleral photodynamic therapy with a chlorin e6: An experimental study of exposure parameters and therapeutic window

by Ernest V. Boiko, Elena V. Samkovich, Irina E. Panova, Alexander A. Ivanov, Sergey B. Shevchenko, Sergey L. Vorobyev, Elizaveta S. Kalashnikova, Victoria G. Gvazava, Elizaveta A. Masian, Alexandra E. Kim

Purpose

To define optimal exposure parameters and the therapeutic window for transscleral photodynamic therapy (TSPDT) with chlorin e6 by evaluating clinical, histological, and thermal effects of subthreshold, therapeutic, and suprathreshold settings in rabbit eyes.

Methods

The study was conducted on 21 healthy rabbits. TSPDT was performed using a 660 nm laser and chlorin e6 (2.5 mg/kg). Transscleral probes (5 mm: 0.1 W, 0.17 W, 0.3 W; 10 mm: 0.3 W, 0.6 W) with integrated thermosensors were used. Enucleation and histological analysis were performed 14 days post-irradiation.

Results

Fundus examination on day 14 revealed distinct treatment zones correlating with laser settings. The therapeutic window was defined as 0.14–0.17 W (5 mm probe; power density: 0.693–0.866 W/cm²; energy density: 415.8–519.6 J/cm²) and 0.48–0.6 W (10 mm probe; 0.611–0.764 W/cm²; 366.6–458.4 J/cm²) with 600 s exposure time, achieving selective choroidal damage without scleral or retinal injury (ΔT ≤ 4.5°C). Suprathreshold settings (≥0.3 W for 5 mm; ≥0.6 W for 10 mm) induced retinal necrosis (up to 50%) and scleral coagulation (ΔT ≥ 8°C) with power densities exceeding 0.866 W/cm² (5 mm) and 0.764 W/cm² (10 mm).
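The dose parameters above follow the standard radiometric relationships: power density is laser power divided by irradiated area, and energy density is power density multiplied by exposure time. A minimal sketch that checks the reported therapeutic-window figures; note the spot area is back-calculated from the reported irradiance and is an assumption, not a parameter stated in the study:

```python
# Radiometric relationships behind the reported TSPDT dose parameters:
#   power density (W/cm^2)  = laser power (W) / irradiated spot area (cm^2)
#   energy density (J/cm^2) = power density (W/cm^2) * exposure time (s)

def power_density(power_w: float, area_cm2: float) -> float:
    """Irradiance delivered to the tissue surface."""
    return power_w / area_cm2

def energy_density(pd_w_cm2: float, exposure_s: float) -> float:
    """Total fluence for a continuous exposure."""
    return pd_w_cm2 * exposure_s

# Reproduce the 5 mm probe figures at 0.17 W with 600 s exposure.
# Spot area (~0.196 cm^2) is inferred from the reported 0.866 W/cm^2.
area = 0.17 / 0.866
pd = power_density(0.17, area)   # ~0.866 W/cm^2
ed = energy_density(pd, 600)     # ~519.6 J/cm^2
```

The same arithmetic reproduces the lower bound of the window: 0.693 W/cm² × 600 s = 415.8 J/cm².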

Conclusion

TSPDT with chlorin e6 enables selective targeting of intraocular pathological tissues while preserving scleral and retinal integrity. Defining the therapeutic window and using real-time thermal monitoring enhances treatment safety. These findings lay a foundation for clinical protocols for uveal melanoma and other intraocular tumors.

Co-developing SHELTER (Safe, Healthy Environments and Local Transformation for Equity and Resilience) with families with lived experience of homelessness in the New York City shelter system: A community needs assessment and data collection protocol

by Diana Margot Rosenthal, Kate Guastaferro, Jasia Kubik, Melody Goodman

In January 2025, the nightly census revealed that over 120,000 people were staying in New York City (NYC) shelters, including more than 41,000 children, of whom almost half were aged 0–5 years. Children under five years old (under-5s) experiencing homelessness are especially vulnerable because the first five years of life are a critical period for child growth, including approximately 90% of brain development. Furthermore, under-5s experiencing homelessness have a higher risk for multiple adverse childhood experiences, developing chronic health conditions, and recurrent homelessness across the life course. Data available for under-5s experiencing homelessness is generally lacking, and what is available is of notably poor quality in the United States, leaving a wide evidence gap and an inability to determine the actual needs of this population. This proposed protocol employs community-based participatory research and was co-developed with families with under-5s who have lived experience of homelessness in NYC shelters. The aim is to determine what barriers exist in the physical and social environments to optimizing health and wellbeing (e.g., milestones, child mental health, parental mental health, safety) among under-5s living in NYC shelters. Using a sequential mixed-methods design, we propose to address a gap in the current literature by conducting an assets- and deficits-based health needs assessment comprising a quantitative survey and qualitative semi-structured interviews. In the long term, our objective is to enhance the quality and quantity of data for this vulnerable population, thereby laying the groundwork for the future co-development of a comprehensive, optimized intervention addressing the needs of under-5s experiencing homelessness.

The Omission of Nursing Care in Emergency Departments: A Conceptual Analysis Using Walker & Avant's Methodology

ABSTRACT

Aim(s)

To analyse the dimensions of the omission of nursing care in emergency departments, including its attributes, antecedents, and consequences, using Walker & Avant's concept analysis method.

Design

Concept analysis

Methods

Walker and Avant's eight-step method was used to define the attributes, antecedents, and consequences of the omission of nursing care in emergency departments.

Data Sources

A comprehensive literature review was conducted using CINAHL, MEDLINE, Embase, Health Management Database, and Cochrane Library, covering publications from 2001 to 2024. The search was conducted in August 2024.

Results

Key attributes were delayed, incomplete, or interrupted care, mostly due to insufficient staffing or unpredictable patient volumes. Antecedents included high workloads, inadequate skill mixes, and understaffing. Consequences were increased patient morbidity and mortality, nurse burnout, and job dissatisfaction. A research gap exists in paediatric-specific measurement tools.

Conclusion

Identifying dimensions of omitted nursing care in emergency departments informs interventions to improve patient safety and care quality. Developing paediatric-specific measurement tools is essential.

Implications for the Profession and/or Patient Care

The findings emphasise the need for improved staffing and resource allocation policies, reducing risks to patients and enhancing nurse satisfaction.

Impact

This study addressed the gap in understanding omitted nursing care specifically in emergency departments. Findings highlight systemic issues impacting patient outcomes and nurse well-being. The results will guide organisational improvements and future research globally.

Reporting Method

This study adhered to EQUATOR guidelines, following Walker and Avant's method for concept analysis.

Patient or Public Contribution

This study did not include patient or public involvement.

Impact Statement

This study underscores the critical impact of the omission of nursing care (ONC) in emergency departments (EDs) on patient safety, nurse well-being, and healthcare efficiency. ONC contributes to increased morbidity, mortality, and adverse events, highlighting the urgent need for improved staffing models and resource allocation. Training programmes should equip emergency nurses with prioritisation strategies to mitigate care omissions. Policymakers must recognise ONC as a key quality indicator, ensuring adequate workforce support. Additionally, this study identifies a gap in measuring ONC in paediatric EDs, calling for the development of tailored assessment tools and further research on intervention strategies.

Factor Structure and Longitudinal Invariance of the Cancer Behaviour Inventory: Assessing Cancer‐Coping Self‐Efficacy in Patients With Moderate‐to‐High Symptoms

ABSTRACT

Background

The Cancer Behaviour Inventory–Brief Version was designed to assess cancer-coping self-efficacy in clinical and research settings where minimising patient burden is essential. However, although it is widely used in cancer research, the lack of evidence for longitudinal invariance undermines its validity in studies spanning multiple time points. Establishing longitudinal invariance enables valid comparisons over time, enhancing confidence in applying the instrument in longitudinal research.

Aim

To examine the factor structure of the measure and test its longitudinal invariance across four time points in cancer patients experiencing moderate-to-high symptoms during curative cancer treatment.

Design

A longitudinal psychometric evaluation.

Methods

This is a secondary data analysis of a randomised controlled trial in patients with moderate-to-high symptoms undergoing cancer treatment (N = 534). We conducted longitudinal invariance tests for the measurement using four time points. Other psychometric tests included confirmatory factor analysis, reliability analyses and correlations.

Results

Our confirmatory factor analysis supported the four-factor, 12-item structure of the Cancer Behaviour Inventory–Brief Version. Items 1 and 6 were found to be moderately correlated. The resulting 12-item measure demonstrated good internal consistency, with convergent and divergent validity supported by correlations with selected instruments. Finally, longitudinal invariance testing supported strict measurement invariance across the four time points (CFI = 0.930, RMSEA = 0.045, SRMR = 0.056).

Conclusion

We found that the factor structure of the Cancer Behaviour Inventory–Brief Version remained stable over four time points in a sample of patients with moderate-to-high symptoms undergoing cancer treatment. This supports its suitability for examining changes in cancer-coping self-efficacy among cancer patients over time in longitudinal studies.

Implications

This study confirms that the Cancer Behaviour Inventory–Brief Version has adequate internal consistency and demonstrates evidence of construct validity. Our finding of strict longitudinal invariance supports its use for repeated assessment of cancer-coping self-efficacy to evaluate patient outcomes and intervention processes over time in clinical and research settings.

Patient or Public Contribution

No patient or public contribution.

Assessing the utility of fractional excretion of urea in distinguishing intrinsic and prerenal acute kidney injury in hospitalised patients: a systematic review and meta-analysis

By: Pan H.-C., Jiang Z.-H., Chen H.-Y., Liu J.-H., Chen Y.-W., Peng K.-Y., Wu V.-C., Hsiao C.-C.
Objective

Acute kidney injury (AKI) is a significant challenge in hospital settings, and accurately differentiating between intrinsic and prerenal AKI is crucial for effective management. The fractional excretion of urea (FEUN) has been proposed as a potential biomarker for this purpose, offering an alternative to traditional markers such as fractional excretion of sodium. This study aimed to assess the diagnostic accuracy of FEUN for differentiating intrinsic from prerenal AKI in hospitalised patients.

Design

We conducted a systematic review and bivariate random effects meta-analysis of diagnostic accuracy studies. The study followed the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach.

Data sources

PubMed, Embase and Cochrane databases were searched from inception to 1 November 2023.

Eligibility criteria for selecting studies

We included observational studies that focused on patients with AKI and reported FEUN data sufficient to reconstruct a complete 2×2 contingency table (true positives, true negatives, false positives and false negatives) for evaluating its diagnostic accuracy.
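All of the pooled accuracy measures in this review derive from the 2×2 contingency table each included study had to report. A minimal sketch of those derivations, using made-up counts rather than any study's data:

```python
# Diagnostic accuracy measures recoverable from a 2x2 contingency table
# (test result vs. reference diagnosis). Counts here are illustrative only.

def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)        # true positive rate
    spec = tn / (tn + fp)        # true negative rate
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec,
            "ppv": ppv, "npv": npv, "lr+": lr_pos, "lr-": lr_neg}

# Hypothetical study: 30 true positives, 10 false positives,
# 10 false negatives, 50 true negatives.
m = diagnostic_accuracy(tp=30, fp=10, fn=10, tn=50)
# sensitivity = 30/40 = 0.75; specificity = 50/60 ≈ 0.833
```

The bivariate random-effects model then pools these per-study sensitivities and specificities jointly, which is why each study needed a complete table.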

Data extraction and synthesis

Two reviewers extracted data, assessed risk of bias with Quality Assessment of Diagnostic Accuracy Studies-2 and graded certainty of evidence using the GRADE approach. Pooled sensitivity, specificity, positive and negative likelihood ratios, and the area under the summary receiver operating characteristic curve (SROC) were calculated; heterogeneity was measured with I². A prespecified subgroup restricted to patients receiving diuretics served as a sensitivity analysis.

Results

12 studies involving 1240 patients were included, with an overall occurrence rate of intrinsic AKI of 38.8%. FEUN had a pooled sensitivity of 0.74 (95% CI 0.60 to 0.84) and specificity of 0.78 (95% CI 0.66 to 0.87), with positive predictive value and negative predictive value of 0.76 (95% CI 0.68 to 0.83) and 0.74 (95% CI 0.66 to 0.81), respectively. The SROC curve showed a pooled diagnostic accuracy of 0.83. Heterogeneity was substantial (I²>90%) for sensitivity and specificity. In a diuretic-only subgroup (six studies), specificity rose to 0.87 and heterogeneity declined (I²=56%). Overall certainty of evidence was low owing to inconsistency.

Conclusions

FEUN is a biomarker with moderate diagnostic accuracy for differentiating between intrinsic and prerenal AKI in hospitalised patients. Its application could enhance AKI management; however, the high heterogeneity observed in our study highlights the need for further research to evaluate its utility across diverse patient populations and clinical settings.

PROSPERO registration number

CRD42024496083.

Client‐as‐Partner Care: A Grounded Theory Study of Formal Care Service Providers for Persons With Early‐Onset Dementia

ABSTRACT

Aims

To develop a grounded theory that explains how formal care service providers experience caring for and supporting persons with early-onset dementia (EOD).

Design

A grounded theory approach.

Methods

Thirty formal care service providers of persons with EOD were recruited from community-based dementia care facilities in northern and central Taiwan between August 2021 and February 2022 using purposive and theoretical sampling. Transcribed face-to-face, semi-structured interview data were analysed with constant comparative analysis. A theoretical framework was constructed from the data to describe the experience of being a formal care service provider for persons with EOD.

Results

The core category of ‘client-as-partner care’ was the theoretical framework that explained the experience of formal care service providers and described how participants met the needs of persons with EOD. Five categories described the components of the process: (1) identifying clients' characteristics; (2) establishing a personal relationship; (3) enhancing self-esteem; (4) maintaining dignity; and (5) the influence of family members and community members. The first four categories were interactive and key to delivering client-as-partner care; the fifth category could alter any key component and reduce or improve the quality of care. Reflections shared by participants offered a window into the outcomes of successful client-as-partner care: quality of life improved for clients and job satisfaction increased for providers.

Conclusion

The client-as-partner care model for persons with EOD required knowledge of the client's unique characteristics, a strong provider-client relationship, offering strategies tailored to the client's abilities and interests, and fostering independence.

Practice Implications

Client-as-partner care provides a person-centred approach that enhances support quality for persons with EOD and increases job satisfaction for formal care providers. Successful strategies can inform case management, strengthen support for this population and indirectly improve family caregivers' competencies.

Patient or Public Contribution

No patient or public contribution.

Reporting Method

COREQ (COnsolidated criteria for REporting Qualitative research).

Health and lifestyle in the Iron Age Italian community of Pontecagnano (Campania, Italy, 7th-6th century BCE)

by Roberto Germano, Owen Alexander Higgins, Emanuela Cristiani, Alessia Galbusera, Carmen Esposito, Dulce Neves, Carmine Pellegrino, Alessandra Sperduti, Giorgio Manzi, Luca Bondioli, Alessia Nava

This study investigates health, dental development, diet, and human-environment interactions in individuals buried in the necropolises of Pontecagnano (Campania, Italy, 7th-6th century BCE), using an integrated approach merging dental histomorphometry and calculus micro-residue analysis. The sample consists of 30 permanent teeth (canines, first and second molars) from 10 individuals. Histomorphometric analysis of dental thin sections allowed the estimation of crown formation times, initial cusp formation, crown completion, and enamel extension rates. The prevalence of Accentuated Lines, marking physiological stress events, was analyzed chronologically across tooth classes. Dental calculus analysis was performed on five individuals, identifying plant micro-remains and fungal spores. Crown formation times varied by tooth class, with canines forming the longest (mean = 1,977 ± 295 days), followed by second molars (mean = 1,176 ± 179 days) and first molars (mean = 1,094 ± 154 days). Initial cusp formation values, estimated through chronological overlap between teeth, allowed for a more accurate reconstruction of crown completion timing. Accentuated Lines prevalence peaked at 12 and 44 months, likely reflecting early childhood dietary transitions and the differential recording of stress events across different crown regions. Calculus analysis identified starch granules from cereals (Triticeae) and legumes (Fabaceae), fungal spores (Saccharomyces), and plant fibers, indicating diverse dietary practices, food processing, and extra-masticatory activities. This interdisciplinary approach reinforces the validity of combining histomorphometric and micro-residue analyses to reconstruct childhood health, adult diet, and lifestyle. Our findings align with previous research while emphasizing population-specific variations. This study enhances understanding of Iron Age biocultural adaptations, offering insights into developmental and dietary behaviors in this ancient Italian community.

Economic evaluation protocol for the PRevention Of sudden cardiac death aFter myocardial Infarction by Defibrillator implantation: the PROFID EHRA trial

By: Qian Y., Roque C. R., Woods B., Iglesias Urrutia C. P., Gc V. S., Gur Arie M., Fischer D., Dagres N., Hindricks G., Manca A.
Introduction

The implantable cardioverter defibrillator (ICD) is a cardiac device recommended for use to prevent the occurrence of sudden cardiac death (SCD) in post-myocardial infarction (MI) patients with reduced left ventricular ejection fraction (LVEF). The evidence informing this guidance comes from landmark trials that are now more than 20 years old. The risk-benefit profile of ICD for the contemporary target population may have changed substantially since then, which raises the question of whether there is evidence for sparing patients a procedure associated with potentially severe complications and high healthcare costs. A main part of the PRevention Of sudden cardiac death aFter myocardial Infarction by Defibrillator implantation (PROFID) project is the PROFID EHRA trial, which is supported by the European Heart Rhythm Association. PROFID EHRA is a European Union-funded, prospective, randomised, multi-centre, non-inferiority study designed to compare optimal medical therapy (OMT) alone with ICD implantation plus OMT in post-MI patients with reduced LVEF. This paper describes the economic evaluation methods used to quantify the cost and health implications of using OMT alone in place of ICD implantation plus OMT in this group of patients.

Methods and analysis

The economic evaluation has been designed to conduct a pre-trial cost-effectiveness analysis (CEA) prior to the availability of trial data, followed by a within-trial cost-consequences analysis (CCA) and a long-term post-trial CEA, conducted from the National Health Service and Personal Social Service perspective in England. The pre-trial CEA uses simulation modelling informed by available evidence to assess the lifetime costs and quality-adjusted life years of OMT alone and ICD+OMT in post-MI patients with reduced LVEF at risk of SCD, as defined in the PROFID EHRA trial. The within-trial CCA is intended to summarise the health-related quality of life (HRQoL), healthcare resource use and associated costs observed during the PROFID EHRA trial follow-up period. The post-trial CEA updates the pre-trial model by incorporating contemporary evidence about the HRQoL and costs observed during the trial and the occurrence of those events and outcomes accruing during the trial follow-up period and projecting them into the expected lifetime of the patients. Sensitivity analyses are performed to assess the robustness of the CEA results with respect to both model assumptions and uncertainty in the value of the model input parameters. Finally, a value of information analysis will identify the key drivers of uncertainty surrounding the model conclusions regarding the optimal treatment strategy, establishing if further research may be required.

Ethics and dissemination

The PROFID EHRA trial, under the legal sponsorship of Charité—Universitätsmedizin Berlin, Germany, received its first ethics approval from the Medicine Research Ethics Committee of La Paz University Hospital in Madrid, Spain (reference number LHS-2019-0209). Before patients can be included, each participating study centre must obtain the required local, central and/or national ethical approval. As of 13 November 2025, at least one participating study centre in each of the following countries has received ethical approval from the relevant ethics committee: Austria, Belgium, Czech Republic, Denmark, France, Germany, Great Britain, Hungary, Israel, the Netherlands, Poland and Spain. Results will be shared with the general public through various media channels and additionally with healthcare professionals and the scientific community through scientific meetings, conferences and publications.

Trial registration number

NCT05665608.

Association between pneumoconiosis and cataract risk: a nationwide retrospective cohort study in Taiwan

By: Cheng J.-S., Lin Y.-S., Lin C.-L., Hsia N.-Y., Shen T.-C., Cho D.-Y.
Objectives

To investigate whether pneumoconiosis increases the risk of cataract.

Design

Nationwide population-based retrospective cohort study.

Setting

Taiwan’s National Health Insurance database, which covers >99% of the population.

Participants

The study included 19 841 adults newly diagnosed with pneumoconiosis between 2001 and 2020 and 79 364 age-matched and sex-matched individuals without pneumoconiosis. Participants with a prior history of cataract were excluded.

Outcome measures

The primary outcome was incident cataract identified through International Classification of Diseases diagnostic codes. Subgroup analyses were performed to evaluate cataract risk across different strata of age, sex and comorbidity. In addition, among patients with pneumoconiosis, we conducted a secondary analysis evaluating the association between systemic corticosteroid use and cataract development.

Results

During follow-up, the incidence of cataract was significantly higher in the pneumoconiosis cohort (38.9 vs 35.3 per 1000 person-years). Patients with pneumoconiosis had an increased risk of cataract after adjustment for age, sex and comorbidities (adjusted HR (aHR)=1.22, 95% CI 1.18 to 1.26). Elevated risks were observed in both men (aHR=1.22, 95% CI 1.18 to 1.26) and women (aHR=1.20, 95% CI 1.13 to 1.29). All age groups showed increased risks, with the highest estimate observed among patients aged ≥75 years (aHR=1.24, 95% CI 1.19 to 1.30). Subgroup analyses showed an increased risk in patients with pneumoconiosis who had no comorbidities (aHR=1.12, 95% CI 1.07 to 1.18). In a secondary analysis, systemic corticosteroid exposure was not significantly associated with cataract development (adjusted OR=0.65, 95% CI 0.39 to 1.09).

Conclusions

Pneumoconiosis is associated with an increased risk of cataract. Routine ophthalmologic surveillance should be considered in pneumoconiosis management.

The impact of Pasifikas in Medicine on Pacific Islander medical student experiences

by Devon Hori Harvey, Micah Ngatuvai, Siale Vaitohi, Paige E. Faasuamalie, Maegan Tupinio, Lisa H. Smith

Background

Pacific Islanders experience significant health disparities. One contributing factor is the lack of racial concordance, as Pacific Islanders are underrepresented in the U.S. physician workforce. Several factors contribute to this underrepresentation, including a lack of support systems for Pacific Islander premed and medical students. Pasifikas in Medicine (PiM) is a recently established national student organization founded to provide support for Pacific Islander premed students, medical students, residents, fellows and attending physicians. This study seeks to understand the impact of PiM on medical student experiences.

Methods

An anonymous survey was distributed to the PiM listserv and to Diversity, Equity and Inclusion offices of allopathic and osteopathic medical schools across the U.S. The survey included seven questions for demographic data, ten 5-point ordinal questions to evaluate the impact of PiM on medical student experiences, and three free text questions.

Results

A total of 34 individuals participated in the study, with 21 completing the evaluative portion of the survey. Of 28 who responded, 27 (96.4%) were the first in their family to attend medical school, and 25 (89.2%) planned to serve Pacific Islander patient populations in their medical career. For the 10 evaluative questions, 7 scored ≥ 4.0 out of 5.0. Identifying Mentors, Faculty Networking, and Research Opportunities scored less well. Qualitative data were favorable toward PiM and demonstrated significant camaraderie, community, and connection to other Pacific Islander physicians and medical students.

Conclusion

Pasifikas in Medicine fills an unmet need by creating a space dedicated to addressing the challenges unique to Pacific Islander students, separate from other minority groups. Improvements to PiM should begin with creating more mentorship opportunities, faculty networking and research opportunities. Additionally, increasing PiM presence nationally and locally within medical schools could further strengthen Pacific Islander medical student experience.

Clinical validation of a frailty management mHealth tool in a cohort of community-dwelling older adults: the Geras Fit-Frailty App

By: Kennedy C. C., Ioannidis G., Rockwood K., Relan A., Adachi J., Papaioannou A., and the Fit-Frailty App Working Group (Fisher, Park, Hewston, Lee, McArthur, Marr, Misiaszek, Woo, Patterson, Wang, Sidhu, Theou, Vinson)
Objectives

This study describes the prototype testing and clinical validation of the Fit-Frailty App, a fully guided, interactive mobile health (mHealth) app to assess frailty and sarcopenia. This multi-dimensional tool is freely available on the App Store and considers medical history, physical performance, cognition, nutrition, daily function and psychosocial domains. To guide management, a total frailty score and clinical summary of underlying "risk flags" are provided. Our objectives were to examine usability, feasibility, criterion and construct validity.

Design

Cross-sectional

Setting

Outpatient geriatric medicine clinic

Participants

Community-dwelling older adults, age 65 years or older

Methods

The primary outcome of the clinical validation study was criterion validity. A research nurse administered the Fit-Frailty App during a routine clinic appointment. Clinicians simultaneously completed a paper-based frailty index (FI) tool with similar items from a comprehensive geriatric assessment (FI-CGA). Total scores for both assessments were computed using the cumulative deficits frailty index scoring method. Intraclass and Pearson correlation coefficients and 95% CIs were calculated to examine criterion validity. Secondary outcomes were construct validity, feasibility (eg, completion rates, safety occurrences, resources) and usability (eg, ratings on ease of use, time to complete the app).
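The cumulative-deficits frailty index mentioned above has a simple scoring rule: the index is the number of deficits present divided by the number of deficits assessed, with each deficit coded between 0 (absent) and 1 (fully present). A minimal sketch of that rule, with illustrative item codings that are not taken from the Fit-Frailty App itself:

```python
# Cumulative-deficits frailty index: FI = sum of deficit scores / items
# assessed. Each deficit is coded on [0, 1]; fractional values represent
# partial deficits. Item set and codings below are illustrative only.

def frailty_index(deficits: list[float]) -> float:
    """Mean deficit score across all assessed items; higher = more frail."""
    if not deficits:
        raise ValueError("no deficits assessed")
    if any(not 0 <= d <= 1 for d in deficits):
        raise ValueError("deficit scores must lie in [0, 1]")
    return sum(deficits) / len(deficits)

# Example: 10 items assessed, 3 fully present and 1 partially present (0.5)
fi = frailty_index([1, 1, 1, 0.5] + [0] * 6)  # -> 0.35
```

Under this scoring, the study's mean app score of 0.33 corresponds to roughly a third of assessed deficits being present, consistent with a largely frail sample.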

Results

In the clinical validation study (n=75, mean age 79.2, SD=7.0, 53% female), the mean total Fit-Frailty App score was 0.33 (SD=0.13), with 73% of our sample considered frail or severely frail. The app presented comparable results to FI-CGA (moderate to good validity; ICC=0.65, 95% CI 0.50–0.76) with a strong association between the measures (r=0.74, 95% CI 0.62–0.83). In our prototype and clinical cohorts, the app had a 100% completion rate with no safety occurrences and had high usability ratings.

Conclusions

The Fit-Frailty App is a feasible and valid tool that non-geriatricians can use in research and clinical settings to comprehensively assess frailty and sarcopenia, and it could assist with developing targeted interventions.

Adaptation and evaluation of a digital dialectical behaviour therapy for youth at clinical high risk for psychosis: A protocol for a feasibility randomized controlled trial

by Thea Lynne Hedemann, Yun Lu, Sofia Campitelli, Lisa D. Hawke, Nelson Shen, Sarah Saperia, Brett D. M. Jones, Gillian Strudwick, Chelsey R. Wilks, Wei Wang, Marco Solmi, Michael Grossman, Muhammad Ishrat Husain, Nicole Kozloff, George Foussias, Muhammad Omair Husain

Background

Youth at clinical high risk (CHR) for psychosis often experience emotional dysregulation, psychiatric symptoms, substance use, suicidality, and functional impairment. Dialectical behaviour therapy (DBT) is an evidence-based intervention that improves emotion regulation, clinical outcomes, and functioning across psychiatric populations. Digital adaptations (d-DBT) may enhance accessibility and engagement for CHR youth, but acceptability and potential benefits in this group are unknown.

Objective

To adapt d-DBT for CHR youth and evaluate the acceptability of delivering it to this population, as well as the feasibility of a larger-scale clinical trial.

Methods

This mixed-methods clinical trial has two phases. In Phase 1, d-DBT will be adapted for CHR youth in collaboration with a lived-experience youth advisory group. In Phase 2, an assessor-masked randomized controlled trial will compare d-DBT (n = 30) with treatment as usual (n = 30). The intervention consists of eight weekly modules, with primary outcomes assessing acceptability, usability, and trial feasibility. Secondary outcomes include changes in emotional dysregulation, psychiatric symptoms, substance use, suicidality, and functioning.

Conclusions

We anticipate that d-DBT will be acceptable to CHR youth and that conducting a larger trial will be feasible. Preliminary findings may demonstrate improvements in emotion regulation, psychiatric symptoms, suicidality, and functioning. Results will guide further refinement of the intervention and inform the design of a confirmatory clinical trial.

Trial registration

ClinicalTrials.gov #NCT06928935

Exploring a panel of serum biomarkers for cancer risk in patients with non-specific symptoms: a comparative analysis of feature selection methods

By: Monroy-Iglesias, M. J., Santaolalla, A., Martin, S., North, B., Moss, C., Haire, K., Jones, G., Steward, L., Cargaleiro, C., Bruno, F., Millwaters, J., Basyal, C., Weild, S., Russell, B., Van Hemelrijck, M., Dolly, S.
Objectives

Delays in cancer diagnosis for patients with non-specific symptoms (NSSs) lead to poorer outcomes. Rapid Diagnostic Clinics (RDCs) expedite care, but most NSS patients do not have cancer, highlighting the need for better risk stratification. This study aimed to develop biomarker-based clinical prediction scores to differentiate high-risk and low-risk NSS patients, enabling more targeted diagnostics.

Design

Retrospective and prospective cohort study.

Setting

Secondary care RDC in London.

Participants

Adult patients attending an RDC between December 2016 and September 2023 were included. External validation used data from another RDC.

Outcome measures

The primary outcome was a cancer diagnosis. Biomarker-based risk scores were developed using Latent Class Analysis (LCA) and Least Absolute Shrinkage and Selection Operator (LASSO). Model performance was assessed using logistic regression, receiver operating characteristic curves (AUROC) and decision curve analysis.
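The AUROC used here to assess discrimination has a direct probabilistic reading: it is the probability that a randomly chosen patient with cancer receives a higher risk score than a randomly chosen patient without cancer, with tied pairs counting half. A small self-contained sketch using made-up scores and outcomes, not study data:

```python
def auroc(scores, labels):
    """Rank-based AUROC: fraction of case/non-case pairs in which the
    case has the higher score; tied pairs contribute 0.5."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    concordant = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                concordant += 1.0
            elif p == n:
                concordant += 0.5
    return concordant / (len(pos) * len(neg))

# Illustrative point-based risk scores (e.g., an LCA- or LASSO-derived
# score) and outcomes (1 = cancer diagnosed, 0 = no cancer):
scores = [4, 3, 2, 2, 1, 0]
labels = [1, 1, 1, 0, 0, 0]

value = auroc(scores, labels)  # 8.5 of 9 pairs concordant ≈ 0.944
```

This O(n²) pairwise form is only for exposition; on cohorts the size of this study, a rank-statistic or library implementation would be used instead.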

Results

Among 5821 RDC patients, LCA identified high white cell count, low haemoglobin, low albumin, high serum lambda light chain, high neutrophil-to-lymphocyte ratio, high serum kappa light chain (SKLC), high erythrocyte sedimentation rate (ESR), high C-reactive protein (CRP) and high neutrophils as cancer risk markers. LASSO selected high platelets, ESR, CRP, SKLC, alkaline phosphatase and lactate dehydrogenase. Each one-point increase in score predicted higher odds of cancer (LCA: AOR 1.19, 95% CI 1.16 to 1.23; LASSO: AOR 1.29, 95% CI 1.25 to 1.34). Scores ≥2 predicted significantly higher cancer odds (LCA: AOR 3.79, 95% CI 2.91 to 4.95; LASSO: AOR 3.44, 95% CI 2.66 to 4.44). Discrimination was good (AUROC: LCA 0.74; LASSO 0.73). External validation in 573 patients confirmed the predicted increase in cancer risk per one-point rise in the LASSO score (AOR 1.28, 95% CI 1.15 to 1.42), with a borderline increase for the LCA score (AOR 1.16, 95% CI 1.06 to 1.27).

Conclusion

Biomarker-based scores effectively identified NSS patients at higher cancer risk. LCA captured a broader biomarker range, offering higher sensitivity, while LASSO achieved higher specificity with fewer markers. These scores may also help detect severe benign conditions, improving RDC triage. Further validation is needed before broader clinical implementation.

Towards interprofessional medication safety risk management: a qualitative interview study for physicians in primary and secondary care

By: Saavalainen, A., Sirenius, H., Linden-Lahti, C., Laukkanen, E., Hosia, H., Holmström, A.-R.
Objectives

To investigate interprofessional medication safety risk management from the perspective of physicians in healthcare settings.

Design

Qualitative, semistructured interview study. Data analysed with an inductive content analysis.

Setting

Wellbeing Services County in Central Finland.

Participants

17 physicians working in different healthcare settings or specialties.

Results

Physicians’ overall perception of interprofessional medication safety risk management was generally positive. They considered their own responsibility for medication safety as both comprehensive, encompassing the safety of the entire unit, and limited, focused primarily on prescribing the correct medication. Organisational barriers to participating in medication safety promotion comprised insufficient healthcare resources and an unclear distribution of tasks and responsibilities. Personal barriers included prioritisation of clinical work, viewing medication safety as an administrative task, and experiencing the process as slow and complex. Strong leadership, increased visibility of medication safety, framing the topic positively, targeted education and fostering physicians’ intrinsic motivation were identified as means to increase physicians’ participation in medication safety risk management.

Conclusions

This study emphasises the importance of integrating physicians into interprofessional, systems-based medication safety risk management as a core element of high-quality care. Despite recognising their broad role, physicians face barriers such as organisational constraints and limited identification with medication safety advocacy. Addressing these challenges requires enhancing their understanding of the medication management and use process and fostering shared responsibility through time allocation and interprofessional leadership structures.

A Late Pleistocene archaic human tooth from Gua Dagang (Trader’s Cave), Niah national park, Sarawak (Malaysia)

by Darren Curnoe, Mohammed S. Sauffi, Hsiao Mei Goh, Xue-feng Sun, Roshan Peiris

The rarity of Late Pleistocene hominin remains from Insular Southeast Asia (ISEA) has hampered our ability to understand a crucial episode of human evolutionary history, namely, the global dispersal of Homo sapiens from Africa. Moreover, recent discoveries indicate a surprising level of taxic diversity during this time, with at least two species—H. floresiensis and H. luzonensis—endemic to the region when H. sapiens first arrived. A third hominin, dubbed the ‘Denisovans’, is shown from DNA evidence to have interbred with the ancestors of contemporary Indigenous populations across ISEA, New Guinea and Australia. Yet the Denisovans have not been identified from the fossil record of the area, despite recent breakthroughs in this regard on mainland East Asia. New excavations by our team at the Trader’s Cave in the Niah National Park (‘Niah Caves’), northern Borneo, have yielded an isolated hominin upper central permanent incisor dated, by optically stimulated luminescence dating of sediments, to about 52–55 thousand years ago. Specimen SMD-TC-AA210 has a crown that is massive both absolutely and relative to its root size; the crown is wide (mesiodistally) and relatively short (labiolingually). Morphologically, it exhibits a very strong degree of labial convexity, pronounced shovelling, and a bulging basal eminence with several upward finger-like projections. Labial enamel wrinkling on the enamel-dentine junction is expressed as two large ridges exhibiting numerous spine-like projections, and the lingual extensions on the enamel surface of the basal eminence are expressed as six extensions. This combination of crown size and morphological traits is not normally found in H. sapiens and instead characterises archaic members of Homo such as H. erectus, H. neanderthalensis and Middle Pleistocene hominins sharing a clade with H. heidelbergensis.
The Trader’s Cave tooth suggests that an archaic hominin population inhabited northern Borneo just prior to or coincident with the arrival of H. sapiens as documented at the nearby West Mouth of the Niah Great Cave.

Differential contribution of α2δ auxiliary subunits of voltage-gated calcium channels in mouse models of pain and itch

by Joao M. Braz, Madison Jewell, Karnika Bhardwaj, Sian Rodriguez-Rosado, Veronica Craik, Allan I. Basbaum

Voltage-gated calcium channels (VGCCs) are multimeric proteins composed of alpha 1, β and γ subunits, as well as one of four auxiliary α2δ subunits. Although there is considerable preclinical and clinical evidence for a contribution of VGCCs to nociceptive processing, notably via the gabapentin-targeted α2δ-1 subunit, the extent to which other α2δ subunits contribute to baseline or injury-altered pain and itch processing remains unclear. Here, we investigated the anatomical and behavioral consequences of deleting α2δ-2, α2δ-3 or α2δ-4 in the mouse and report that selectively ablating each α2δ subunit leads to different, and in some cases opposite, effects on behavioral indices of pain and itch. Specifically, deleting α2δ-2 resulted in mechanical and heat hypersensitivity and an increase in spinal cord microglial immunoreactivity, but reduced scratching (presumptive itch) in response to a pruritogen. In contrast, ablation of α2δ-3 led to thermal hyposensitivity, but no change in mechanical responsiveness or indices of itch. Mice deficient for α2δ-4 exhibited hyposensitivity across pain modalities and only minor itch deficits. Interestingly, these differential effects were limited to baseline nociceptive responses; we therefore conclude that the α2δ-2, α2δ-3 and α2δ-4 subunits of VGCCs differentially contribute to pain and itch processing. The mechanisms underlying these differences, however, remain to be determined.