Article title

Deep Bi-Directional LSTM Networks for Device Workload Forecasting

Identifiers
Title variants
Conference
Federated Conference on Computer Science and Information Systems (15 ; 06-09.09.2020 ; Sofia, Bulgaria)
Languages of publication
EN
Abstracts
EN
Deep convolutional neural networks revolutionized the area of automated object detection from images. Can the same be achieved in the domain of time series forecasting? Can one build a universal deep network that, once trained on the past, would be able to deliver accurate predictions reaching deep into the future for any, even the most diverse, time series? This work is a first step in an attempt to address such a challenge in the context of the FEDCSIS'2020 Competition dedicated to network device workload prediction based on their historical time series data. We have developed and pre-trained a universal 3-layer bi-directional Long Short-Term Memory (LSTM) regression network that reported the most accurate hourly predictions of the weekly workload time series from thousands of different network devices with diverse shape and seasonality profiles. We also show how intuitive human-led post-processing of the raw LSTM predictions can easily destroy the generalization abilities of such a prediction model.
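For illustration only, below is a minimal sketch of the kind of 3-layer bi-directional LSTM regressor described in the abstract, written with tensorflow.keras. It is not the authors' code: the layer width (64 units), the 4-week hourly input window, and the 168-step (one-week) output horizon are assumptions made for the example, not parameters reported in the paper.

# Minimal sketch (assumed hyperparameters, not the authors' released code):
# a 3-layer bi-directional LSTM regression network for multi-step workload forecasting.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

LOOKBACK = 4 * 7 * 24   # assumed input window: 4 weeks of hourly readings
HORIZON = 7 * 24        # assumed target: one week of hourly workload values

def build_bilstm_regressor(n_features: int = 1) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(LOOKBACK, n_features)),
        # Three stacked bi-directional LSTM layers; the first two return full
        # sequences so the next layer sees every time step.
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
        layers.Bidirectional(layers.LSTM(64)),
        # Linear head emitting the whole multi-step forecast at once.
        layers.Dense(HORIZON),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    # Toy data standing in for the per-device workload series of the challenge.
    X = np.random.rand(32, LOOKBACK, 1).astype("float32")
    y = np.random.rand(32, HORIZON).astype("float32")
    model = build_bilstm_regressor()
    model.fit(X, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(X[:1]).shape)  # -> (1, 168)

Producing the full 168-hour forecast from a single dense head is only one plausible multi-step strategy; an encoder-decoder arrangement, as in references 11 and 12 below, would be an equally valid reading of the abstract.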
Year
Volume
Pages
115–118
Physical description
Bibliography: 13 items, formulas, charts
Authors
author
  • EBTIC, Khalifa University, UAE
author
  • EBTIC, Khalifa University, UAE
  • Zalora, Singapore
Bibliography
  • 1. FedCSIS 2020 Challenge: Network Device Workload Prediction, https://knowledgepit.ml/fedcsis20-challenge/.
  • 2. Z. Chen, J. Hu, G. Min, A. Zomaya, and T. El-Ghazawi, "Towards Accurate Prediction for High-Dimensional and Highly-Variable Cloud Workloads with Deep Learning," IEEE Transactions on Parallel and Distributed Systems, vol. 31, no. 4, pp. 923–934, April 2020.
  • 3. H. Nguyen, S. Woo, J. Im, T. Jun, and D. Kim, "A Workload Prediction Approach Using Models Stacking Based on Recurrent Neural Network and Autoencoder," IEEE International Conference on High Performance Computing and Communications / IEEE International Conference on Smart City / IEEE International Conference on Data Science and Systems, Dec. 2016.
  • 4. K. Qazi and I. Aizenberg, "Towards quantum computing algorithms for datacenter workload predictions," IEEE International Conference on Cloud Computing, 2018.
  • 5. P. Saripalli, G. Kiran, R. Shankar, H. Narware, and N. Bindal, "Load prediction and hot spot detection models for autonomic cloud computing," IEEE International Conference on Utility and Cloud Computing, pp. 397–402, 2011.
  • 6. R. Calheiros, E. Masoumi, R. Ranjan, and R. Buyya, "Workload prediction using ARIMA model and its impact on cloud applications' QoS," IEEE Transactions on Cloud Computing, vol. 3, no. 4, pp. 449–458, 2014.
  • 7. P. Dinda and D. O'Hallaron, "Host load prediction using linear models," Cluster Computing, vol. 3, no. 4, pp. 265–280, 2000.
  • 8. S. Di, D. Kondo, and W. Cirne, "Host load prediction in a Google compute cloud with a Bayesian model," Proceedings of the IEEE International Conference on High Performance Computing, Networking, Storage and Analysis, 2012.
  • 9. F. Benhammadi, Z. Gessoum, and A. Mokhtari, "CPU load prediction using neuro-fuzzy Bayesian inferences," Neurocomputing, vol. 74, pp. 1606–1616, 2011.
  • 10. J. Kumar and A. Singh, "Workload prediction in cloud using artificial neural net. and adaptive diff. evolution," Future Generation Computer Systems, vol. 81, pp. 41–52, 2018.
  • 11. Y. Zhu, W. Zhang, Y. Chen, and H. Gao, "A novel approach to workload prediction using attention-based LSTM encoder-decoder network in cloud environment," EURASIP Journal on Wireless Communications and Networking, article no. 274, 2019.
  • 12. C. Peng, Y. Li, Y. Yu, Y. Zhou, and S. Du, "Multi-step-ahead host load prediction with GRU based encoder-decoder in cloud computing," IEEE International Conference on Knowledge and Smart Technology, pp. 186–191.
  • 13. S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
Notes
1. Track 1: Artificial Intelligence
2. Technical Session: 15th International Symposium Advances in Artificial Intelligence and Applications
3. Record prepared with funds of the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-0a4a3abd-4ed0-4414-97f7-c2d81ef722a2