Article title

Comparative Study of Deep Learning Models for Predicting Stock Prices

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The stock market is volatile, dynamic, and nonlinear. Hence, predicting stock prices has been a challenging task for any researcher in time series forecasting. Accurately predicting stock prices has been a hot topic for both financial and technical researchers. In this paper, we deploy six deep learning models (i.e., MLP, CNN, RNN, LSTM, GRU, and AE) to predict the closing price, one day ahead, of 20 different companies (i.e., 5 groups of 4) in the S&P 500 market over a 7-year range (Jan 2015 - August 2022). The experimental results not only provide interesting insights but also help us to deepen our understanding of how to use deep learning models in financial markets.
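As a rough illustration of the task described in the abstract, the sketch below shows what one-day-ahead closing-price prediction with one of the listed models (an LSTM) could look like. It is not taken from the paper: the Keras/yfinance stack, the AAPL ticker, the 30-day window, the layer sizes, and the 80/20 split are all assumptions made for the example; the study's other models (MLP, CNN, RNN, GRU, AE) would differ only in the network architecture.

```python
# Minimal one-day-ahead closing-price sketch with an LSTM (illustrative only;
# not the paper's exact data pipeline or hyperparameters).
import numpy as np
import yfinance as yf                 # assumed data source for daily prices
from tensorflow import keras

WINDOW = 30  # days of history per sample (assumed value)

# Daily closing prices for one S&P 500 constituent over the paper's date range.
frame = yf.download("AAPL", start="2015-01-01", end="2022-08-31")
closes = frame["Close"].to_numpy().ravel().astype("float32")

# Min-max scale using statistics from the training portion only (no look-ahead).
split = int(len(closes) * 0.8)
lo, hi = closes[:split].min(), closes[:split].max()
scaled = (closes - lo) / (hi - lo)

# Turn the series into (WINDOW, 1) inputs that predict the next day's close.
X = np.stack([scaled[i:i + WINDOW] for i in range(len(scaled) - WINDOW)])[..., None]
y = scaled[WINDOW:]
X_train, y_train = X[:split - WINDOW], y[:split - WINDOW]
X_test, y_test = X[split - WINDOW:], y[split - WINDOW:]

# A small LSTM regressor; swapping these hidden layers gives the other models.
model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(64),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, batch_size=32, validation_split=0.1, verbose=0)

# Report test error back in price units.
pred = model.predict(X_test, verbose=0).squeeze()
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)) * (hi - lo))
print(f"Test RMSE: {rmse:.2f} USD")
```

The scaling parameters are fitted on the training portion only, so the held-out test days do not leak into the normalization.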
Year
Volume
Pages
103–108
Physical description
Bibliography: 21 items, figures, tables, charts
Authors
  • Faculty of Information Technology, University of Transport in Ho Chi Minh City (UTH), Ho Chi Minh City, Vietnam
  • Faculty of Information Technology, Hung Yen University of Technology and Education (UTEHY), Hung Yen, Vietnam
  • Faculty of Information Technology, Hung Yen University of Technology and Education (UTEHY), Hung Yen, Vietnam
  • Faculty of Information Technology, Hung Yen University of Technology and Education (UTEHY), Hung Yen, Vietnam
  • Faculty of Information Technology, University of Transport in Ho Chi Minh City (UTH), Ho Chi Minh City, Vietnam
Bibliography
  • [1] J. Hur, M. Raj, and Y. E. Riyanto, “Finance and trade: A crosscountry empirical analysis on the impact of financial development and asset tangibility on international trade,” World Development, vol. 34, no. 10, pp. 1728–1741, 2006. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0305750X06001148
  • [2] L. Li, Y. Wu, Y. Ou, Q. Li, Y. Zhou, and D. Chen, “Research on machine learning algorithms and feature extraction for time series,” in 2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC), 2017, pp. 1–5.
  • [3] “Forecasting directional movements of stock prices for intraday trading using LSTM and random forests,” Finance Research Letters, vol. 46, p. 102280, 2022.
  • [4] R. Xiong, E. P. Nichols, and Y. Shen, “Deep learning stock volatility with Google domestic trends,” 2015. [Online]. Available: https://arxiv.org/abs/1512.04916
  • [5] R. Engle, “Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation,” Econometrica, vol. 50, no. 4, pp. 987–1007, 1982. [Online]. Available: https://EconPapers.repec.org/RePEc:ecm:emetrp:v:50:y:1982:i:4:p:987-1007
  • [6] S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Comput., vol. 9, no. 8, pp. 1735–1780, Nov. 1997. [Online]. Available: https://doi.org/10.1162/neco.1997.9.8.1735
  • [7] T. Fischer and C. Krauss, “Deep learning with long short-term memory networks for financial market predictions,” European Journal of Operational Research, vol. 270, no. 2, pp. 654–669, 2018. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0377221717310652
  • [8] P. Yu and X. Yan, “Stock price prediction based on deep neural networks,” Neural Comput. Appl., vol. 32, no. 6, pp. 1609–1628, 2020. [Online]. Available: https://doi.org/10.1007/s00521-019-04212-x
  • [9] D. Karmiani, R. Kazi, A. Nambisan, A. Shah, and V. Kamble, “Comparison of predictive algorithms: Backpropagation, SVM, LSTM and Kalman filter for stock market,” in 2019 Amity International Conference on Artificial Intelligence (AICAI), 2019, pp. 228–234.
  • [10] P. Gao, R. Zhang, and X. Yang, “The application of stock index price prediction with neural network,” Mathematical and Computational Applications, vol. 25, no. 3, 2020. [Online]. Available: https://www.mdpi.com/2297-8747/25/3/53
  • [11] A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” in Advances in Neural Information Processing Systems, F. Pereira, C. Burges, L. Bottou, and K. Weinberger, Eds. Curran Associates, Inc., 2012.
  • [12] J. Schmidhuber, “Deep learning in neural networks: An overview,” CoRR, vol. abs/1404.7828, 2014. [Online]. Available: http://arxiv.org/abs/1404.7828
  • [13] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, Learning Representations by Back-Propagating Errors. Cambridge, MA, USA: MIT Press, 1988, pp. 696–699.
  • [14] I. Sutskever, O. Vinyals, and Q. V. Le, “Sequence to sequence learning with neural networks,” CoRR, vol. abs/1409.3215, 2014. [Online]. Available: http://arxiv.org/abs/1409.3215
  • [15] A. Graves and J. Schmidhuber, “Framewise phoneme classification with bidirectional LSTM and other neural network architectures,” Neural Networks, vol. 18, no. 5, pp. 602–610, 2005, IJCNN 2005. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0893608005001206
  • [16] P. Malhotra, L. Vig, G. M. Shroff, and P. Agarwal, “Long short term memory networks for anomaly detection in time series,” in ESANN, 2015.
  • [17] H. Hewamalage, C. Bergmeir, and K. Bandara, “Recurrent neural networks for time series forecasting: Current status and future directions,” International Journal of Forecasting, vol. 37, no. 1, pp. 388–427, 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0169207020300996
  • [18] A. Graves, “Supervised sequence labelling with recurrent neural networks,” Ph.D. dissertation, Technical University Munich, 2008. [Online]. Available: https://d-nb.info/99115827X
  • [19] K. Cho, B. van Merrienboer, D. Bahdanau, and Y. Bengio, “On the properties of neural machine translation: Encoder-decoder approaches,” CoRR, vol. abs/1409.1259, 2014. [Online]. Available: http://arxiv.org/abs/1409.1259
  • [20] J. Chung, Ç. Gülçehre, K. Cho, and Y. Bengio, “Empirical evaluation of gated recurrent neural networks on sequence modeling,” CoRR, vol. abs/1412.3555, 2014. [Online]. Available: http://arxiv.org/abs/1412.3555
  • [21] M. A. Kramer, “Nonlinear principal component analysis using autoassociative neural networks,” AIChE Journal, vol. 37, no. 2, pp. 233–243, 1991.
Document type
YADDA identifier
bwmeta1.element.baztech-2d5cd097-4514-4ac2-8a15-dcde5652b5fc