Extract Clinical Lab Tests From Electronic Hospital Records Through Featured Transformer Model
Abstract
Natural language, as a rich source of information, has been used as the foundation of product reviews, demographic trend analysis, and domain-specific knowledge bases. When extracting entities from text, the challenge is that free text is so sparse that missing features are common, which leaves the training process incomplete. Building on the attention mechanism in deep learning architectures, the authors propose a featured transformer model (FTM) that adds category information to the inputs to overcome the missing-feature issue. As the attention mechanism performs Markov-like updates in the deep learning architecture, the importance of a category reflects how frequently it connects to other entities and categories, and is compatible with the importance of the entity in decision-making. The authors evaluate FTM and compare its performance with several other machine learning models; FTM overcomes the missing-feature issue and outperforms them.
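The core idea the abstract describes, augmenting transformer inputs with category information so the model can fall back on category-level signal when token-level features are missing, can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration, not the paper's actual FTM architecture: the layer sizes, the choice to sum token and category embeddings, and the linear tagging head are all hypothetical.

```python
# Minimal sketch of category-augmented transformer inputs for entity tagging.
# All architectural choices here are illustrative assumptions, not the FTM
# design from the paper.
import torch
import torch.nn as nn

class FeaturedTransformerSketch(nn.Module):
    def __init__(self, vocab_size, num_categories, num_tags,
                 d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        # Category embedding carries the extra input feature; index 0 could be
        # reserved for "category unknown" so missing features stay representable.
        self.cat_emb = nn.Embedding(num_categories, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.tagger = nn.Linear(d_model, num_tags)  # per-token entity tag logits

    def forward(self, token_ids, category_ids):
        # Sum token and category embeddings so attention sees both signals.
        x = self.token_emb(token_ids) + self.cat_emb(category_ids)
        h = self.encoder(x)
        return self.tagger(h)

# Usage: a batch of 2 sequences, 6 tokens each, with per-token category ids.
model = FeaturedTransformerSketch(vocab_size=1000, num_categories=10, num_tags=5)
tokens = torch.randint(0, 1000, (2, 6))
cats = torch.randint(0, 10, (2, 6))
logits = model(tokens, cats)  # shape: (2, 6, 5)
```

In this sketch the category embedding gives the attention layers a dense signal even when a token's own features are sparse, which is one plausible reading of how added category information would mitigate missing features.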