
Prediction of Rice Yield via Stacked LSTM

Author(s): Xiangyan Meng (College of Science, Northeast Agricultural University, Harbin, China), Muyan Liu (College of Engineering, Northeast Agricultural University, Harbin, China), and Qiufeng Wu (College of Science, Northeast Agricultural University, Harbin, China)
Copyright: 2020
Volume: 11
Issue: 1
Pages: 10
Source title: International Journal of Agricultural and Environmental Information Systems (IJAEIS)
Editor(s)-in-Chief: Frederic Andres (National Institute of Informatics, Japan), Chutiporn Anutariya (Asian Institute of Technology, Thailand), Teeradaj Racharak (Japan Advanced Institute of Science and Technology, Japan), and Watanee Jearanaiwongkul (National Institute of Informatics, Japan)
DOI: 10.4018/IJAEIS.2020010105


Abstract

To help secure rice yields, accurate yield prediction is needed. Because annual rice yield can be treated as a time series, many time-series forecasting methods are candidates for this task, and the Long Short-Term Memory recurrent neural network (LSTM) is among the most popular of them. Building on the characteristics of LSTM and the success of deep learning, this article proposes an improved architecture, called Stacked LSTM, that increases the depth of the network by stacking multiple LSTM layers. Stacked LSTM architectures with different numbers of LSTM layers are compared with other methods, including ARIMA, GRU, and ANN, on rice yield data for Heilongjiang Province, China, from 1980 to 2017. The results show the superior performance of the Stacked LSTM and the effectiveness of increasing the depth of LSTM.
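The stacking idea described in the abstract can be sketched in a few lines of code. The sketch below is illustrative only and not the authors' implementation: it assumes a Keras/TensorFlow environment, and the window length, number of layers, unit counts, and the synthetic yield series are placeholder assumptions rather than values taken from the article.

# Illustrative sketch of a stacked (multi-layer) LSTM for one-step-ahead
# yield forecasting. Hyperparameters and data are hypothetical.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def make_windows(series, window=5):
    # Turn a 1-D annual yield series into (samples, window, 1) inputs
    # and the next year's yield as the target.
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    X = np.array(X, dtype="float32").reshape(-1, window, 1)
    return X, np.array(y, dtype="float32")

def build_stacked_lstm(window=5, n_layers=3, units=32):
    # Stack several LSTM layers; every layer except the last returns
    # its full output sequence so the next LSTM receives a sequence.
    model = keras.Sequential()
    model.add(layers.Input(shape=(window, 1)))
    for i in range(n_layers):
        model.add(layers.LSTM(units, return_sequences=(i < n_layers - 1)))
    model.add(layers.Dense(1))
    model.compile(optimizer="adam", loss="mse")
    return model

# Synthetic stand-in for an annual yield series (e.g., 38 years, 1980-2017).
series = np.linspace(4.0, 7.5, 38) + np.random.normal(0.0, 0.2, 38)
X, y = make_windows(series, window=5)
model = build_stacked_lstm(window=5, n_layers=3, units=32)
model.fit(X, y, epochs=200, batch_size=8, verbose=0)
print("next-year forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))

The only stacking-specific detail is the return_sequences flag: all but the final LSTM layer must emit a sequence rather than a single hidden vector, which is what allows LSTM layers to be composed in depth.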

Related Content

Vincent Soulignac, François Pinet, Mathilde Bodelet, Hélène Gross. © 2023. 28 pages.
Haiying Liu, Yongcai Lai, Zhenhua Xu, Zhonliang Yang, Yanmin Yu, Ping Yan. © 2023. 12 pages.
Ren Wang. © 2023. 14 pages.
Daidyi Wang, Fengsong Zhang. © 2022. 15 pages.
Takahiro Kawamura, Tetsuo Katsuragi, Akio Kobayashi, Motoko Inatomi, Masataka Oshiro, Hisashi Eguchi. © 2022. 19 pages.
Cédric Baudrit, Patrice Buche, Nadine Leconte, Christophe Fernandez, Maëllis Belna, Geneviève Gésan-Guiziou. © 2022. 22 pages.
Jingfa Wang, Huishi Du. © 2022. 11 pages.