A Lexical Knowledge Representation Model for Natural Language Understanding
Author(s): Ping Chen (University of Houston-Downtown, USA), Wei Ding (University of Massachusetts-Boston, USA) and Chengmin Ding (IBM Business Consulting, USA)
Copyright: 2012
Pages: 18
Source title: Software and Intelligent Sciences: New Transdisciplinary Findings
Source Author(s)/Editor(s): Yingxu Wang (University of Calgary, Canada)
DOI: 10.4018/978-1-4666-0261-8.ch012
Abstract
Knowledge representation is essential for semantics modeling and intelligent information processing. For decades, researchers have proposed many knowledge representation techniques. However, capturing deep semantic information effectively while supporting the construction of a large-scale knowledge base efficiently remains a daunting problem. This paper describes a new knowledge representation model, SenseNet, which provides semantic support for commonsense reasoning and natural language processing. SenseNet is formalized with a Hidden Markov Model. An inference algorithm is proposed to simulate a human-like natural language understanding procedure, and a new measurement, confidence, is introduced to facilitate natural language understanding. The authors present a detailed case study of applying SenseNet to retrieving compensation information from company proxy filings.
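To illustrate the kind of formalization the abstract mentions, the sketch below shows standard Viterbi inference over a toy Hidden Markov Model. The states, vocabulary, and probabilities are hypothetical and are not taken from the chapter; SenseNet's own inference algorithm and confidence measure are defined in the full text.

```python
# Illustrative Viterbi decoding over a toy HMM.
# All names and probabilities here are invented for demonstration only;
# they do not reproduce the SenseNet model described in the chapter.

def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state sequence for the observations."""
    # best[t][s] = highest probability of any path ending in state s at step t
    best = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][p] * trans_p[p][s] * emit_p[s][observations[t]], p)
                for p in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # Trace the best path backwards from the most likely final state.
    last = max(best[-1], key=best[-1].get)
    path = [last]
    for t in range(len(observations) - 1, 0, -1):
        last = back[t][last]
        path.insert(0, last)
    return path

# Toy word-sense example: disambiguating "bank" from a context word.
states = ["finance", "river"]
start_p = {"finance": 0.5, "river": 0.5}
trans_p = {"finance": {"finance": 0.9, "river": 0.1},
           "river": {"finance": 0.1, "river": 0.9}}
emit_p = {"finance": {"deposit": 0.6, "bank": 0.3, "water": 0.1},
          "river": {"deposit": 0.1, "bank": 0.3, "water": 0.6}}

print(viterbi(["deposit", "bank"], states, start_p, trans_p, emit_p))
# → ['finance', 'finance']
```

In this toy run the context word "deposit" pulls the decoder toward the financial sense, so "bank" is tagged accordingly; an HMM-based word-sense model exploits exactly this kind of sequential context.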