Drought is a hazardous natural phenomenon that can negatively affect the environment, water resources, agriculture, and the economy. Precise drought forecasting and trend assessment are essential for water management to reduce the detrimental effects of drought. However, some existing drought modeling techniques have limitations that hinder precise forecasting, necessitating the exploration of more suitable approaches. This study examines two forecasting models, Long Short-Term Memory (LSTM) and a hybrid model integrating a regularized extreme learning machine with the Snake Optimizer (RELM-SO), to forecast hydrological droughts one to six months in advance. Using the Multivariate Standardized Streamflow Index (MSSI) computed from 58 years of streamflow data for two of the drier Malaysian stations, the models were applied to forecast droughts and were validated against classical models such as gradient boosting regression and K-nearest neighbors (KNN). The RELM-SO model outperformed the other models for one-month-ahead forecasting at station S1, with a lower root mean square error (RMSE = 0.1453) and mean absolute error (MAE = 0.1164), and a higher Nash-Sutcliffe efficiency index (NSE = 0.9012) and Willmott index (WI = 0.9966). Similarly, at station S2, the hybrid model achieved lower errors (RMSE = 0.1211 and MAE = 0.0909) and higher agreement (NSE = 0.8941 and WI = 0.9960), indicating improved accuracy compared with the competing models. Because of significant autocorrelation in the drought data, traditional statistical metrics may be inadequate for selecting the optimal model. Therefore, this study introduced a novel parameter to evaluate each model's effectiveness in accurately capturing the turning points in the data. By this measure, the hybrid model improved forecast accuracy by 19.32% to 21.52% compared with LSTM. In addition, the reliability analysis showed that the hybrid model was the most accurate for long-term forecasts.
Additionally, Innovative Trend Analysis (ITA) was used to analyze hydrological drought trends. The analysis revealed that October, November, and December experienced more drought occurrences than the other months. This research advances accurate drought forecasting and trend assessment, providing valuable insights for water management and decision-making in drought-prone regions.
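The four evaluation metrics used to compare the models (RMSE, MAE, NSE, and WI) follow standard definitions. A minimal sketch in Python; the toy observed/forecast arrays below are illustrative only, not the study's data:

```python
import numpy as np

def rmse(obs, pred):
    """Root mean square error."""
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def mae(obs, pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(obs - pred)))

def nse(obs, pred):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect forecast."""
    return float(1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2))

def wi(obs, pred):
    """Willmott index of agreement: 1 indicates a perfect forecast."""
    denom = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1 - np.sum((obs - pred) ** 2) / denom)

# Illustrative drought-index series: observed values and a noisy forecast
obs = np.array([-1.2, -0.5, 0.3, 0.8, -0.1, -1.6])
pred = obs + np.array([0.05, -0.1, 0.02, 0.08, -0.04, 0.06])
```

A perfect forecast gives RMSE = MAE = 0 and NSE = WI = 1, which is why higher NSE/WI and lower RMSE/MAE together indicate the better model.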
Liquefaction is a devastating consequence of earthquakes that occurs in loose, saturated soil deposits, resulting in catastrophic ground failure. Accurate prediction of this geotechnical parameter is crucial for mitigating hazards, assessing risks, and advancing geotechnical engineering. This study introduces a novel predictive model that combines the Extreme Learning Machine (ELM) with the Dingo Optimization Algorithm (DOA) to estimate strain energy-based liquefaction resistance. The hybrid model (ELM-DOA) is compared with the classical ELM, the Adaptive Neuro-Fuzzy Inference System with Fuzzy C-Means (ANFIS-FCM), and with Sub-clustering (ANFIS-Sub). In addition, two data pre-processing scenarios are employed, namely traditional linear and non-linear normalization. The results demonstrate that non-linear normalization enhances the prediction performance of all models by approximately 25% compared to linear normalization. Furthermore, the ELM-DOA model achieves the most accurate predictions, exhibiting the lowest root mean square error (484.286 J/m³), mean absolute percentage error (24.900%), mean absolute error (404.416 J/m³), and the highest coefficient of determination (0.935). Additionally, a Graphical User Interface (GUI) has been developed, specifically tailored to the ELM-DOA model, to assist engineers and researchers in making full use of this predictive model. The GUI provides a user-friendly platform for entering data and accessing the model's predictions, enhancing its practical applicability. Overall, the results strongly support the proposed hybrid model with its GUI serving as an effective tool for assessing soil liquefaction resistance in geotechnical engineering, aiding in predicting and mitigating liquefaction hazards.
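The two pre-processing scenarios can be sketched as follows. Linear normalization is ordinary min-max scaling; the exact non-linear transform used in the study is not specified here, so the log-based variant below is an illustrative assumption:

```python
import numpy as np

def linear_norm(x):
    """Traditional linear (min-max) scaling to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

def nonlinear_norm(x):
    """Non-linear scaling: log transform followed by min-max.
    (log1p is an illustrative choice, not necessarily the study's scheme.)"""
    z = np.log1p(x - x.min())
    return (z - z.min()) / (z.max() - z.min())

# Skewed strain-energy-like values (J/m^3), purely illustrative
x = np.array([120.0, 250.0, 480.0, 900.0, 2500.0, 7800.0])
lin = linear_norm(x)
nl = nonlinear_norm(x)
```

On heavily right-skewed inputs such as strain energy, a log-type transform compresses the upper tail and spreads out the small values, which is one plausible reason a non-linear scheme can help the learners.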
The Colorado River has experienced a significant streamflow reduction in recent decades due to climate change, resulting in pronounced hydrological droughts that pose challenges to the environment and human activities. However, current models struggle to accurately capture complex drought patterns, and their accuracy decreases as the lead time increases. Thus, determining the reliability of drought forecasts for specific months ahead is a challenging task. This study introduces a robust approach that uses the Beluga Whale Optimization (BWO) algorithm to train and optimize the parameters of Regularized Extreme Learning Machine (RELM) and Random Forest (RF) models. The applied models are validated against a KNN benchmark model for forecasting drought one to six months ahead across four hydrological stations distributed along the Colorado River. The results demonstrate that RELM-BWO outperforms the RF-BWO and KNN models, achieving the lowest root mean square error (0.2795), uncertainty (U95 = 0.1077), and mean absolute error (0.2104), and the highest correlation coefficient (0.9135). The study also uses Global Multi-Criteria Decision Analysis (GMCDA) as an evaluation metric to assess the reliability of the forecasts. The GMCDA results indicate that RELM-BWO provides reliable forecasts up to four months ahead. Overall, the research methodology is valuable for drought assessment and forecasting, enabling advanced early warning systems and effective drought countermeasures.
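The RELM building block common to the drought-forecasting studies above admits a compact sketch: a random hidden layer followed by ridge-regularized output weights solved in closed form. The metaheuristic stage (BWO tuning of hyperparameters such as the hidden-layer size and regularization strength) is omitted here, and all names and defaults are illustrative:

```python
import numpy as np

def relm_fit(X, y, n_hidden=32, C=1e6, seed=0):
    """Regularized ELM: random input weights, closed-form ridge solution
    beta = (H^T H + I/C)^(-1) H^T y.  C is the regularization parameter."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
    return W, b, beta

def relm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Illustrative fit on a smooth 1-D signal (not the study's streamflow data)
X = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
model = relm_fit(X, y)
pred = relm_predict(model, X)
```

Because only the output weights are learned, training reduces to one linear solve, which is what makes ELM variants cheap enough to wrap inside a population-based optimizer such as BWO or the Snake Optimizer.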
Vacuum membrane distillation (VMD) has attracted increasing interest for various applications beyond seawater desalination. Experimental testing of membrane technologies such as VMD at pilot or large scale can be laborious and costly. Machine learning techniques can be a valuable tool for predicting membrane performance at such scales. In this work, a novel hybrid model was developed by incorporating the spotted hyena optimizer (SHO) into support vector regression (SVR) to predict the flux in VMD. The SVR-SHO hybrid model was validated with experimental data and benchmarked against other machine learning tools such as artificial neural networks (ANNs), classical SVR, and multiple linear regression (MLR). The results show that SVR-SHO predicted the flux with high accuracy, with a correlation coefficient (R) of 0.94, whereas the other models showed lower prediction accuracy, with R-values ranging from 0.801 to 0.902. Global sensitivity analysis was applied to interpret the results, revealing that feed temperature was the most influential operating parameter on flux, with a relative importance score of 52.71 compared to 17.69, 17.16, and 14.44 for feed flowrate, vacuum pressure intensity, and feed concentration, respectively.
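Relative importance scores of the kind reported above can be produced by several global sensitivity methods; the study's exact method is not specified here, so the sketch below uses permutation importance (increase in prediction error when one input is shuffled), normalized to percentages, on a synthetic model:

```python
import numpy as np

def relative_importance(predict, X, y, n_repeats=10, seed=0):
    """Permutation-based relative importance (%) for each input column.
    `predict` is any fitted model's prediction function."""
    rng = np.random.default_rng(seed)
    base = np.mean((predict(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        deltas = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # break the link between feature j and y
            deltas.append(np.mean((predict(Xp) - y) ** 2) - base)
        scores.append(max(float(np.mean(deltas)), 0.0))
    scores = np.array(scores)
    return 100.0 * scores / scores.sum()

# Synthetic stand-in for a fitted model: feature 0 dominates the response
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 2))
predict = lambda Z: 3.0 * Z[:, 0] + 0.5 * Z[:, 1]
y = predict(X)
imp = relative_importance(predict, X, y)
```

With this construction, the dominant input receives the larger share of the 100% budget, mirroring how feed temperature (52.71) stands out among the four VMD operating parameters.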