IRMA-International.org: Information Resources Management Association

Extract Clinical Lab Tests From Electronic Hospital Records Through Featured Transformer Model

Author(s): Lucy M. Lu (Arkansas Bioscience Institute, USA) and Richard S. Segall (Arkansas State University, USA)
Copyright: 2024
Volume: 10
Issue: 1
Pages: 18
Source title: International Journal of Practical Healthcare Innovation and Management Techniques (IJPHIMT)
Editor(s)-in-Chief: Francesco Longo (University of Calabria, Italy) and Letizia Nicoletti (CAL-TEK, Italy)
DOI: 10.4018/IJPHIMT.336529


Abstract

Natural language, as a rich source of information, has been used as the foundation of product reviews, demographic trend analysis, and domain-specific knowledge bases. The challenge in extracting entities from text is that free text is so sparse that missing features are common, leaving the training process incomplete. Based on the attention mechanism in a deep learning architecture, the authors propose a featured transformer model (FTM) that adds category information to the inputs to overcome the missing-feature issue. As the attention mechanism performs Markov-like updates in the deep learning architecture, the importance of a category reflects how frequently it connects to other entities and categories, and is compatible with the importance of the entity in decision-making. The authors evaluate the performance of FTM and compare it with several other machine learning models; FTM overcomes the missing-feature issue and performs better than the other models.
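The core idea the abstract describes, augmenting each token's input representation with a category embedding before self-attention so that category information compensates for sparse or missing lexical features, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the vocabulary size, embedding dimension, category assignments, and single attention layer below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ v

# Hypothetical setup: 5 tokens in the vocabulary, 3 entity categories
d_model = 8
token_emb = rng.normal(size=(5, d_model))
cat_emb = rng.normal(size=(3, d_model))

# A short input sequence: token ids with one category id per token
tokens = np.array([0, 3, 1, 4])
cats = np.array([2, 0, 0, 1])

# "Featured" input: each token embedding is summed with its category
# embedding, so attention can still relate tokens through shared
# categories even when lexical features are sparse or missing.
x = token_emb[tokens] + cat_emb[cats]

out = attention(x, x, x)
print(out.shape)
```

Summing the category embedding into the token embedding (rather than concatenating) keeps the model dimension fixed, mirroring how positional encodings are typically injected in transformer inputs; the choice of sum vs. concatenation here is an assumption for illustration.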
