Abstract:
Rockburst is a common geological disaster in underground engineering, posing significant challenges. Data-driven methods provide powerful tools for the assessment and early warning of rockburst. However, existing LSTM models and their derivative architectures face a trade-off between accuracy and stability. We propose a Hybrid Convolutional Long Short-Term Memory Network (HCLSTM) to address this. Unlike previous models, HCLSTM innovatively integrates one-dimensional and two-dimensional convolutions, leveraging their respective strengths in short-term feature extraction and high-level abstract feature extraction, combined with the long-term memory capability of LSTM. This architecture achieves both high accuracy and high robustness. To evaluate HCLSTM fairly, a sliding window method was used to process the data, and HCLSTM was compared with single LSTM models (Vanilla LSTM, Stacked LSTM, and Bidirectional LSTM) and convolutional LSTM hybrid models (CNN1d-LSTM and CNN2d-LSTM). The architecture of every model was optimized with Optuna. The results showed that macro metrics (accuracy and Kappa coefficient) identified HCLSTM as the best-performing predictive model, while micro metrics (Precision, Recall, and F1) demonstrated its superior performance in predicting minority classes. Additionally, in three scenarios exploring the impact of data distribution, data partitioning methods, and sequence length on the model, HCLSTM exhibited the highest stability. Overall, the experimental results indicated that HCLSTM achieved a balance between accuracy and stability when handling long sequences with complex spatiotemporal characteristics, outperforming CNN1d-LSTM and CNN2d-LSTM. This provides new insights for the extended design of LSTM models and offers a powerful tool for the analysis and evaluation of rockburst in engineering applications.
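The abstract describes a dual-branch design: a 1D convolution for short-term temporal features, a 2D convolution over the (time, feature) plane for high-level abstract features, and an LSTM for long-term memory. The sketch below shows one plausible way to wire such a hybrid in PyTorch; the class name, layer widths, kernel sizes, and the per-time-step concatenation used to fuse the two branches are all illustrative assumptions, not the published HCLSTM architecture.

```python
import torch
import torch.nn as nn

class HCLSTM(nn.Module):
    """Illustrative sketch of a hybrid Conv1d/Conv2d + LSTM classifier.

    Layer sizes and the fusion scheme are assumptions for demonstration,
    not the authors' published configuration.
    """

    def __init__(self, n_features: int, n_classes: int, hidden: int = 64):
        super().__init__()
        # 1D branch: convolve along the time axis for short-term features.
        self.conv1d = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # 2D branch: treat the (time, feature) window as a single-channel
        # image to extract higher-level abstract features.
        self.conv2d = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM consumes the concatenated branch outputs at each time step.
        self.lstm = nn.LSTM(32 + 8 * n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):  # x: (batch, seq_len, n_features)
        b, t, f = x.shape
        c1 = self.conv1d(x.transpose(1, 2)).transpose(1, 2)   # (b, t, 32)
        c2 = self.conv2d(x.unsqueeze(1))                      # (b, 8, t, f)
        c2 = c2.permute(0, 2, 1, 3).reshape(b, t, -1)         # (b, t, 8*f)
        out, _ = self.lstm(torch.cat([c1, c2], dim=-1))
        return self.head(out[:, -1])  # classify from the last time step

# Example: 10 sliding windows of 30 time steps with 7 monitoring features,
# mapped to 4 hypothetical rockburst intensity grades.
model = HCLSTM(n_features=7, n_classes=4)
logits = model(torch.randn(10, 30, 7))  # -> shape (10, 4)
```

Each window produced by the sliding-window preprocessing would be fed through both convolutional branches in parallel before the LSTM, which is one way to combine short-term and abstract features without discarding the sequence dimension.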
Source: ENGINEERING FAILURE ANALYSIS
ISSN: 1350-6307
Year: 2025
Volume: 169
Impact Factor: 4.000 (JCR 2022)