Effect of gradient descent optimizers and dropout technique on deep learning LSTM performance in rainfall-runoff modeling

Duong Tran Anh, Dat Vi Thanh, Hoang Minh Le, Bang Tran Sy, Ahad Hasan Tanim, Quoc Bao Pham, Thanh Duc Dang, Son T. Mai, Nguyen Mai Dang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Machine learning and deep learning (ML-DL) models are widely used for rainfall-runoff prediction and have the potential to substitute process-oriented, physics-based numerical models. However, developing an ML model also carries performance uncertainty arising from inaccurate choices of hyperparameters and neural network architectures. This study therefore searches for the best optimization algorithms for ML-DL models, namely the RMSprop, Adagrad, Adadelta, and Adam optimizers, as well as dropout techniques to integrate into the Long Short-Term Memory (LSTM) model to improve the forecasting accuracy of rainfall-runoff modeling. Deep learning LSTMs were developed using 480 model architectures at two hydro-meteorological stations of the Mekong Delta, Vietnam, namely Chau Doc and Can Tho. Model performance was tested with the best-suited LSTM optimizers in combination with four dropout percentages: 0%, 10%, 20%, and 30%. The Adagrad optimizer showed the best performance in model testing. Deep learning LSTM models with 10% dropout produced the best prediction results while significantly reducing the overfitting tendency of the forecasted time series. The findings of this study are valuable for setting up ML-based hydrological models by identifying a suitable gradient descent (GD) optimizer and an optimal dropout ratio to enhance the performance and forecasting accuracy of the ML model.
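The two ingredients the abstract compares, the Adagrad update rule and dropout regularization, can be sketched in plain NumPy. This is an illustrative sketch only, not the study's implementation: the paper trained full LSTM networks, whereas the demo below applies Adagrad to a toy quadratic objective and shows inverted dropout at the study's best-performing rate of 10%.

```python
import numpy as np

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    """One Adagrad update: the per-parameter learning rate shrinks as the
    running sum of squared gradients (cache) grows."""
    cache = cache + grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

def dropout(x, rate=0.10, rng=None, training=True):
    """Inverted dropout: zero a fraction `rate` of activations and rescale
    the survivors by 1/(1-rate) so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# Toy demo (not rainfall-runoff data): minimize f(w) = (w - 3)^2 with Adagrad.
w, cache = 0.0, 0.0
for _ in range(2000):
    grad = 2.0 * (w - 3.0)
    w, cache = adagrad_step(w, grad, cache, lr=0.5)
```

In a real LSTM, the same Adagrad rule is applied elementwise to every weight matrix, and dropout is applied to the layer activations during training only.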
Original language: English
Pages (from-to): 639-657
Number of pages: 19
Journal: Water Resources Management
Issue number: 2
Early online date: 02 Dec 2022
Publication status: Published - Jan 2023

Bibliographical note

Funding Information:
The first author acknowledges the financial support from the Fulbright Visiting Scholar program at the University of South Florida, USA. We also thank the Southern Regional Hydro-meteorological Center and the National Meteorological Center for providing the daily rainfall and runoff data used in this study.

Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Nature B.V.


Keywords

  • Dropout technique
  • LSTM
  • Mekong delta
  • Optimizers
  • Rainfall-runoff

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Water Science and Technology


