Article title
Identifiers
Title variants
Languages of publication
Abstracts
Despite the great promise of machine learning in structural seismic analysis, the deployment of advanced neural networks in practical applications has been limited by the high cost of data acquisition. This paper introduces a new framework that integrates the powerful learning ability of physics-informed neural networks (PINNs) with the effectiveness of pseudo-labelling for data augmentation to improve the accuracy of seismic response predictions of structures. The PINN architecture consists of two blocks of gated recurrent unit-fully connected neural networks (GRU-FCNNs) and one ordinary differential equation (ODE) block that leverages knowledge of structural dynamics. The first GRU-FCNN block serves as a generator of pseudo-labels from unlabelled input datasets. The second GRU-FCNN block, in combination with the ODE block, acts as a selector of reliable pseudo-labels. The performance of PINNs trained on limited labelled data can be significantly improved by successively selecting reliable pseudo-labels from the generator and selector to supplement the training datasets. The effectiveness of pseudo-labelling in PINNs is validated against PINNs without pseudo-labelling through case studies with simulation datasets and real experimental datasets. The results show that the proposed framework is effective and robust in improving the prediction accuracy of structural seismic responses with limited labelled data.
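The generator-selector loop described in the abstract can be sketched for a single-degree-of-freedom system: a generator proposes pseudo-label response histories, and a selector keeps only those consistent with the equation of motion. This is a minimal illustration, not the paper's implementation; the function names, the SDOF parameters (m, c, k), and the residual threshold are illustrative assumptions.

```python
import numpy as np

def ode_residual(u, ag, dt, m=1.0, c=0.05, k=1.0):
    """Residual of the SDOF equation of motion m*u'' + c*u' + k*u + m*ag = 0,
    evaluated with finite differences on a predicted displacement history u.
    A physically consistent pseudo-label should make this residual small.
    (Illustrative parameters; not the paper's model.)"""
    v = np.gradient(u, dt)   # approximate velocity u'
    a = np.gradient(v, dt)   # approximate acceleration u''
    return m * a + c * v + k * u + m * ag

def select_pseudo_labels(pseudo_u, unlabelled_ag, dt, threshold):
    """Keep only pseudo-labelled samples whose mean squared ODE residual
    falls below `threshold` (a hypothetical selection criterion standing in
    for the paper's GRU-FCNN + ODE selector block)."""
    selected = []
    for u, ag in zip(pseudo_u, unlabelled_ag):
        r = ode_residual(u, ag, dt)
        if np.mean(r ** 2) < threshold:
            selected.append((ag, u))  # (input excitation, accepted pseudo-label)
    return selected
```

In the framework described above, the accepted pairs would then be appended to the labelled training set and the networks retrained, with the generate-select-augment cycle repeated until predictions stabilize.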
Journal
Year
Volume
Pages
art. no. e7, 2024
Physical description
62 references, figures, charts
Authors
author
- Department of Infrastructure Engineering, The University of Melbourne, Melbourne, Australia
author
- School of Engineering, Swinburne University of Technology, Melbourne, Australia
author
- Department of Infrastructure Engineering, The University of Melbourne, Melbourne, Australia
author
- Department of Infrastructure Engineering, The University of Melbourne, Melbourne, Australia
Bibliography
- 1. Burak B, Comlekoglu HG. Effect of shear wall area to floor area ratio on the seismic behavior of RC buildings. J Struct Eng. 2013;139(11):1928–37.
- 2. Hidalgo PA, Jordan RM, Martinez MP. An analytical model to predict the inelastic seismic behavior of shear-wall, reinforced concrete structures. Eng Struct. 2002;24(1):85–98.
- 3. Massone LM, López CN, Kolozvari K. Formulation of an efficient shear-flexure interaction model for planar reinforced concrete walls. Eng Struct. 2021;243:112680.
- 4. Jiang H, Kurama YC. Analytical modeling of medium-rise reinforced concrete shear walls. ACI Struct J. 2010;107:400–10.
- 5. Arteta CA, Araújo GA, Torregroza AM, Martínez AF, Lu Y. Hybrid approach for simulating shear–flexure interaction in RC walls with nonlinear truss and fiber models. Bull Earthq Eng. 2019;17(12):6437–62.
- 6. Feng DC, Xu J. An efficient fiber beam-column element considering flexure–shear interaction and anchorage bond-slip effect for cyclic analysis of RC structures. Bull Earthq Eng. 2018;16(11):5425–52.
- 7. Guo W, Hu Y, Li Y, Zhai Z, Shao P. Seismic performance evaluation of typical dampers designed by Chinese Code subjected to the main shock-aftershocks. Soil Dyn Earthq Eng. 2019;126:105829.
- 8. Soleimani-Babakamali MH, Esteghamati MZ. Estimating seismic demand models of a building inventory from nonlinear static analysis using deep learning methods. Eng Struct. 2022;266:114576.
- 9. Hwang SH, Mangalathu S, Shin J, Jeon JS. Machine learning-based approaches for seismic demand and collapse of ductile reinforced concrete building frames. J Build Eng. 2021;34:101905.
- 10. Sun H, Burton HV, Huang H. Machine learning applications for building structural design and performance assessment: state-of-the-art review. J Build Eng. 2021;33:101816.
- 11. Scarselli F, Tsoi AC. Universal approximation using feedforward neural networks: a survey of some existing methods, and some new results. Neural Netw. 1998;11(1):15–37.
- 12. Olu-Ajayi R, Alaka H, Sulaimon I, Sunmola F, Ajayi S. Building energy consumption prediction for residential buildings using deep learning and other machine learning techniques. J Build Eng. 2022;45:103406.
- 13. Šipoš TK, Sigmund V, Hadzima-Nyarko M. Earthquake performance of infilled frames using neural networks and experimental database. Eng Struct. 2013;51:113–27.
- 14. Demertzis K, Kostinakis K, Morfidis K, Iliadis L. An interpretable machine learning method for the prediction of R/C buildings' seismic response. J Build Eng. 2023;63:105493.
- 15. Zhang D, Chen Y, Zhang C, Xue G, Zhang J, Zhang M, Li N. Prediction of seismic acceleration response of precast segmental self-centering concrete filled steel tube single-span bridges based on machine learning method. Eng Struct. 2023;279:115574.
- 16. Siam A, Ezzeldin M, El-Dakhakhni W. Machine learning algorithms for structural performance classifications and predictions: application to reinforced masonry shear walls. Structures. 2019;22:252–65.
- 17. Huang H, Burton HV. Classification of in-plane failure modes for reinforced concrete frames with infills using machine learning. J Build Eng. 2019;25:100767.
- 18. Zhang R, Liu Y, Sun H. Physics-informed multi-LSTM networks for metamodeling of nonlinear structures. Comput Methods Appl Mech Eng. 2020;369:113226.
- 19. Cuomo S, Cola V, Giampaolo F et al. Scientific machine learning through physics-informed neural networks: where we are and What’s next. 2022. arXiv:2201.05624v4.
- 20. Tsai LW, Alipour A. Physics-informed long short-term memory networks for response prediction of a wind-excited flexible structure. Eng Struct. 2023;275:114968.
- 21. Chaudhari P, Oberman A, Osher S, Soatto S, Carlier G. Deep relaxation: partial differential equations for optimizing deep neural networks. Res Math Sci. 2018;5(3):1–30.
- 22. Yao H, Gao Y, Liu Y. FEA-Net: a physics-guided data-driven model for efficient mechanical response prediction. Comput Methods Appl Mech Eng. 2020;363:112892.
- 23. Zhang R, Liu Y, Sun H. Physics-guided convolutional neural network (PhyCNN) for data-driven seismic response modeling. Eng Struct. 2020;215:110704.
- 24. Eshkevari SS, Takáč M, Pakzad SN, Jahani M. DynNet: Physics-based neural architecture design for nonlinear structural response modeling and prediction. Eng Struct. 2021;229:111582.
- 25. Zhang Z, Sun C. Structural damage identification via physics-guided machine learning: a methodology integrating pattern recognition with finite element model updating. Struct Health Monit. 2021;20(4):1675–88.
- 26. Hu Y, Guo W, Long Y, Li S. Physics-informed deep neural networks for simulating S-shaped steel dampers. Comput Struct. 2022;267:106798.
- 27. Yucesan YA, Viana F, Manin L, et al. Adjusting a torsional vibration damper model with physics-informed neural networks. Mech Syst Signal Process. 2021;154:107552. https://doi.org/10.1016/j.ymssp.2020.107552.
- 28. Li H, Schwab J, Antholzer S, Haltmeier M. NETT: solving inverse problems with deep neural networks. Inverse Prob. 2020;36(6):065005.
- 29. Gao H, Zahr MJ, Wang JX. Physics-informed graph neural Galerkin networks: a unified framework for solving PDE-governed forward and inverse problems. Comput Methods Appl Mech Eng. 2022;390:114502.
- 30. Lee DH. Pseudo-label: the simple and efficient semi-supervised learning method for deep neural networks. In Workshop on challenges in representation learning, ICML, 2013;3(2):896.
- 31. Feng L, Qiu M, Li Y, Zheng HT, Shen Y. Learning to augment for data-scarce domain BERT knowledge distillation. In Proceedings of the AAAI Conference on Artificial Intelligence, 2021;35(8):7422–7430.
- 32. Wu H, Prasad S. Semi-supervised deep learning using pseudo labels for hyperspectral image classification. IEEE Trans Image Process. 2017;27(3):1259–70.
- 33. Zou Y, Zhang Z, Zhang H, Li CL, Bian X, Huang JB, Pfister T. PseudoSeg: designing pseudo labels for semantic segmentation. 2020. arXiv preprint arXiv:2010.09713.
- 34. Li H, Wu Z, Shrivastava A, Davis LS. Rethinking pseudo labels for semi-supervised object detection. In Proceedings of the AAAI Conference on Artificial Intelligence. 2022;36(2):1314–1322.
- 35. Medsker LR, Jain LC. Recurrent neural networks. Design Appl. 2001;5:64–7.
- 36. Ahmad AM, Ismail S, Samaon DF. Recurrent neural network with back propagation through time for speech recognition. In IEEE International Symposium on Communications and Information Technology, 2004. ISCIT 2004. 2004;1:98–102. IEEE.
- 37. Morchid M. Parsimonious memory unit for recurrent neural networks with application to natural language processing. Neurocomputing. 2018;314:48–64.
- 38. Lipton ZC, Berkowitz J, Elkan C. A critical review of recurrent neural networks for sequence learning. 2015. arXiv preprint arXiv:1506.00019.
- 39. Chen G. A gentle tutorial of recurrent neural network with error back propagation. 2016. arXiv preprint arXiv:1610.02583.
- 40. Bengio Y, Simard P, Frasconi P. Learning long-term dependencies with gradient descent is difficult. IEEE Trans Neural Netw. 1994;5(2):157–66.
- 41. Le P, Zuidema W. Quantifying the vanishing gradient and long distance dependency problem in recursive neural networks and recursive LSTMs. 2016. arXiv preprint arXiv:1603.00423.
- 42. Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput. 1997;9(8):1735–80.
- 43. Cho K, Van Merriënboer B, Gulcehre C, Bahdanau D, Bougares F, Schwenk H, Bengio Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. 2014. arXiv preprint arXiv:1406.1078.
- 44. Yamak PT, Yujian L, Gadosey PK. A comparison between arima, lstm, and gru for time series forecasting. In Proceedings of the 2019 2nd International Conference on Algorithms, Computing and Artificial Intelligence. 2019;49–55.
- 45. Wang J, Li X, Li J, Sun Q, Wang H. NGCU: a new RNN model for time-series data prediction. Big Data Res. 2022;27:100296.
- 46. Zhang K, Chen N, Liu J, Beer M. A GRU-based ensemble learning method for time-variant uncertain structural response analysis. Comput Methods Appl Mech Eng. 2022;391:114516.
- 47. Chen J, Jing H, Chang Y, Liu Q. Gated recurrent unit based recurrent neural network for remaining useful life prediction of nonlinear deterioration process. Reliab Eng Syst Saf. 2019;185:372–82.
- 48. Truong TT, Lee J, Nguyen-Thoi T. An effective framework for real-time structural damage detection using one-dimensional convolutional gated recurrent unit neural network and high performance computing. Ocean Eng. 2022;253:111202.
- 49. Yang J, Zhang L, Chen C, Li Y, Li R, Wang G, Zeng Z. A hierarchical deep convolutional neural network and gated recurrent unit framework for structural damage detection. Inf Sci. 2020;540:117–30.
- 50. Shinozuka M, Yun CB, Imai H. Identification of linear structural dynamic systems. J Eng Mech Div. 1982;108(6):1371–90.
- 51. Chicco D, Warrens MJ, Jurman G. The coefficient of determination R-squared is more informative than SMAPE, MAE, MAPE, MSE and RMSE in regression analysis evaluation. PeerJ Comput Sci. 2021;7:e623.
- 52. Singhal S, Chourasia A, Chellappa S, Parashar J. Precast reinforced concrete shear walls: state of the art review. Struct Concr. 2019;20(3):886–98.
- 53. Pugh JS, Lowes LN, Lehman DE. Nonlinear line-element modeling of flexural reinforced concrete walls. Eng Struct. 2015;104:174–92.
- 54. McKenna F, Fenves G, Scott M, Jeremic B. Open system for earthquake engineering simulation (OpenSees). Berkeley, CA. 2000.
- 55. Standards Australia. Structural design actions, part 4: Earthquake actions in Australia. Standards Australia, AS 1170.4-2007 (R2018)/Amdt 2–2018, Sydney, NSW. 2018.
- 56. Hu Y, Lam NTK, Menegon S, Wilson J. The selection and scaling of ground motion accelerograms for use in stable continental regions. J Earthquake Eng. 2022;26(12):6284–303.
- 57. Bergstra J, Bengio Y. Random search for hyper-parameter optimization. J Mach Learn Res. 2012;13(1):281–305.
- 58. Chen S, Zhang Y, Yang Q. Multi-task learning in natural language processing: An overview. 2021. arXiv preprint arXiv:2109.09138.
- 59. Srivastava N, Hinton G, Krizhevsky A, et al. Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res. 2014;15(1):1929–58.
- 60. Schmidt-Hieber J. Nonparametric regression using deep neural networks with ReLU activation function. Ann Stat. 2020;48(4):1875–97.
- 61. Glorot X, Bordes A, Bengio Y. Deep sparse rectifier neural networks. In: Proceedings of the fourteenth international conference on artificial intelligence and statistics, Fort Lauderdale, FL, USA, no. (15); 2011. p. 315–23.
- 62. Kingma D, Ba J. Adam: a method for stochastic optimization. 2014. arXiv:1412.6980v8.
Notes
Record prepared with funds from MNiSW, agreement no. POPUL/SP/0154/2024/02, under the programme "Społeczna odpowiedzialność nauki II" (Social Responsibility of Science II), module: Popularyzacja nauki (Science Popularization) (2025)
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-f8ec6a5a-29aa-4814-8f2e-69c5e2fbb038