Abstractive Turkish Text Summarization and Cross-Lingual Summarization Using Transformer
Abstract
Abstractive summarization aims to comprehend a text semantically and reconstruct it briefly and concisely, so the summary may contain words that do not appear in the original text. This chapter studies the abstractive Turkish text summarization problem using a transformer-based attention mechanism. It also examines in detail the differences between the transformer architecture and other architectures, as well as the attention block at the heart of the transformer. Three summarization datasets were generated from text available on various news websites to train abstractive summarization models. The trained models achieve ROUGE scores higher than or comparable to those of existing studies, and the generated summaries have better structural properties. An English-to-Turkish translation model was also created and used in a cross-lingual summarization model whose ROUGE score is comparable to those of existing studies. The summarization pipeline proposed in this study is the first example of cross-lingual English-to-Turkish text summarization.
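The attention block the abstract calls the heart of the transformer can be illustrated with a minimal sketch of scaled dot-product attention, the standard formulation softmax(QK^T / sqrt(d_k))V. This is a generic NumPy illustration, not the chapter's actual model code; the shapes and names are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for single-head attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key axis
    return weights @ V, weights                   # weighted sum of values, plus weights

# Toy example: 2 query positions attending over 3 key/value positions, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with the attention weights in each row summing to one; the transformer stacks many such blocks (multi-headed, with projections) in its encoder and decoder.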