AXpert: human expert facilitated privacy-preserving large language models for abdominal X-ray report labeling


Abstract

IMPORTANCE: The lack of a publicly accessible abdominal X-ray (AXR) dataset has hindered necrotizing enterocolitis (NEC) research. While significant strides have been made in applying natural language processing (NLP) to radiology reports, most efforts have focused on chest radiology. Developing an accurate NLP model to identify features of NEC on abdominal radiographs can support efforts to improve diagnostic accuracy for this and other rare pediatric conditions.

OBJECTIVES: This study aims to develop privacy-preserving large language models (LLMs), along with a distilled version, to efficiently annotate pediatric AXR reports.

MATERIALS AND METHODS: Using pediatric AXR reports collected from C.S. Mott Children's Hospital, we introduce AXpert in 2 formats: one based on the instruction-fine-tuned Gemma-7B model, and a distilled version employing a BERT-based model derived from the fine-tuned model to improve inference and fine-tuning efficiency. AXpert detects the presence of NEC and classifies its subtypes: pneumatosis, portal venous gas, and free air.

RESULTS: Extensive testing shows that LLMs, including AXpert, outperform baseline BERT models on all metrics. Specifically, Gemma-7B (F1 score: 0.9 ± 0.015) improves upon BlueBERT by 132% in F1 score for detecting NEC-positive samples. The distilled BERT model matches the performance of the LLM labelers and surpasses expert-trained baseline BERT models.

DISCUSSION: Our findings highlight the potential of using LLMs for clinical NLP tasks. With minimal expert knowledge injection, LLMs can achieve human-like performance, greatly reducing manual labor. Privacy concerns are alleviated because all models are trained and deployed locally.

CONCLUSION: AXpert demonstrates potential to reduce human labeling effort while maintaining high accuracy in automating NEC diagnosis with AXR, offering precise image labeling capabilities.
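The abstract does not specify the distillation objective used to derive the BERT-based student from the fine-tuned LLM labeler. A common formulation is to train the student on the teacher's temperature-softened label distribution via a KL-divergence loss; the sketch below is a generic, NumPy-only illustration of that loss (the function names, the temperature value, and the three-class layout for pneumatosis/portal venous gas/free air are illustrative assumptions, not details from the paper).

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Numerically stable softmax with optional temperature scaling."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher (LLM labeler)
    and student (BERT) distributions, scaled by T^2 as in standard
    knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # soft labels from the LLM
    q = softmax(student_logits, temperature)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * temperature ** 2

# Hypothetical 3-way logits (e.g. pneumatosis / portal venous gas / free air)
teacher = [3.1, 0.4, -1.2]
student = [2.5, 0.9, -0.8]
loss = distillation_loss(student, teacher)
```

The loss is zero when the student exactly reproduces the teacher's logits and grows as the distributions diverge, so minimizing it pushes the smaller BERT model toward the LLM's labeling behavior.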
