Article title

Gradient Boosting Application in Forecasting of Performance Indicators Values for Measuring the Efficiency of Promotions in FMCG Retail

Conference
Federated Conference on Computer Science and Information Systems (15th; 6-9 September 2020; Sofia, Bulgaria)
Publication language
EN
Abstract (EN)
This paper addresses the problem of forecasting promotion efficiency. The authors propose a new approach that applies the gradient boosting method to this task. Six performance indicators are introduced to capture the promotion effect, and for each of them a model was trained within predefined groups of products. The paper describes how these models are used for forecasting and optimising promotion efficiency, as well as the data preparation and hyperparameter tuning processes. The experiments were performed on three groups of products from a large grocery company.
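The approach summarised above can be illustrated with a short sketch: one gradient-boosted regressor is fitted per performance indicator within a product group, its hyperparameters are tuned on historical promotions, and the fitted models are then used to forecast the indicators for candidate promotion plans. This is a minimal, hypothetical sketch assuming the Python xgboost and scikit-learn packages (the paper's reference list points to the R xgboost package); the indicator names, feature columns and tuning grid are placeholders, not taken from the paper.

    # Hypothetical sketch: one tuned XGBoost regressor per performance indicator
    # within a product group. Indicator names, features and the grid are placeholders.
    import pandas as pd
    import xgboost as xgb
    from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

    INDICATORS = ["sales_uplift", "margin_uplift"]  # stand-ins for the six KPIs

    def train_models(promo_history: pd.DataFrame, feature_cols: list) -> dict:
        """Fit one tuned gradient-boosting model per performance indicator."""
        param_grid = {  # small illustrative grid
            "max_depth": [3, 5, 7],
            "learning_rate": [0.05, 0.1],
            "n_estimators": [200, 500],
            "subsample": [0.8, 1.0],
        }
        models = {}
        for indicator in INDICATORS:
            search = GridSearchCV(
                xgb.XGBRegressor(objective="reg:squarederror"),
                param_grid,
                cv=TimeSeriesSplit(n_splits=3),  # respect the temporal order of promotions
                scoring="neg_mean_absolute_error",
            )
            search.fit(promo_history[feature_cols], promo_history[indicator])
            models[indicator] = search.best_estimator_
        return models

    def forecast(models: dict, planned_promos: pd.DataFrame, feature_cols: list) -> pd.DataFrame:
        """Predict every indicator for a set of candidate promotion plans."""
        preds = {name: m.predict(planned_promos[feature_cols]) for name, m in models.items()}
        return planned_promos.assign(**preds)

Ranking candidate promotion plans by the forecast indicators is one way such models could support the optimisation use mentioned in the abstract.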
Pages
59–68
Physical description
Bibliography: 36 items; charts, tables, figures.
Authors
  • Department of Computer Networks and Systems, Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, ul. Akademicka 16, 44-100 Gliwice, Poland
  • Department of Computer Networks and Systems, Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, ul. Akademicka 16, 44-100 Gliwice, Poland
Bibliography
  • 1. M. C. Cohen, N. H. Z. Leung, K. Panchamgam, G. Perakis, and A. Smith, “The impact of linear optimization on promotion planning,” Operations Research, vol. 65, no. 2, pp. 446–468, 2017. http://dx.doi.org/10.1287/opre.2016.1573
  • 2. R. Fildes, P. Goodwin, and D. Önkal, “Use and misuse of information in supply chain forecasting of promotion effects,” International Journal of Forecasting, vol. 35, no. 1, pp. 144–156, jan 2019. http://dx.doi.org/10.1016/j.ijforecast.2017.12.006
  • 3. S. Makridakis, “The art and science of forecasting: An assessment and future directions,” International Journal of Forecasting, vol. 2, no. 1, pp. 15–39, 1986. http://dx.doi.org/10.1016/0169-2070(86)90028-2
  • 4. E. S. Gardner Jr., “Exponential smoothing: The state of the art,” Journal of Forecasting, vol. 4, no. 1, pp. 1–28, 1985.
  • 5. T.-M. Choi, Y. Yu, and K.-F. Au, “A hybrid SARIMA wavelet transform method for sales forecasting,” Decision Support Systems, vol. 51, no. 1, pp. 130–140, apr 2011. http://dx.doi.org/10.1016/j.dss.2010.12.002
  • 6. N. S. Arunraj and D. Ahrens, “A hybrid seasonal autoregressive integrated moving average and quantile regression for daily food sales forecasting,” International Journal of Production Economics, vol. 170, pp. 321–335, dec 2015. http://dx.doi.org/10.1016/j.ijpe.2015.09.039
  • 7. C. W. Chu and G. P. Zhang, “A comparative study of linear and nonlinear models for aggregate retail sales forecasting,” International Journal of Production Economics, vol. 86, no. 3, pp. 217–231, dec 2003. http://dx.doi.org/10.1016/S0925-5273(03)00068-9
  • 8. C. Y. Chen, W. I. Lee, H. M. Kuo, C. W. Chen, and K. H. Chen, “The study of a forecasting sales model for fresh food,” Expert Systems with Applications, vol. 37, no. 12, pp. 7696–7702, dec 2010. http://dx.doi.org/10.1016/j.eswa.2010.04.072
  • 9. K.-F. Au, T.-M. Choi, and Y. Yu, “Fashion retail forecasting by evolutionary neural networks,” International Journal of Production Economics, vol. 114, no. 2, pp. 615 – 630, 2008. http://dx.doi.org/10.1016/j.ijpe.2007.06.013
  • 10. Z.-L. Sun, T.-M. Choi, K.-F. Au, and Y. Yu, “Sales forecasting using extreme learning machine with applications in fashion retailing,” Decision Support Systems, vol. 46, no. 1, pp. 411–419, 2008. http://dx.doi.org/10.1016/j.dss.2008.07.009
  • 11. M. Xia, Y. Zhang, L. Weng, and X. Ye, “Fashion retailing forecasting based on extreme learning machine with adaptive metrics of inputs,” Knowledge-Based Systems, vol. 36, pp. 253–259, dec 2012. http://dx.doi.org/10.1016/j.knosys.2012.07.002
  • 12. Y. Yu, T.-M. Choi, and C.-L. Hui, “An intelligent fast sales forecasting model for fashion products,” Expert Systems with Applications, vol. 38, no. 6, pp. 7373–7379, jun 2011. http://dx.doi.org/10.1016/j.eswa.2010.12.089
  • 13. K. Kaczmarek and O. Hryniewicz, “Linguistic knowledge about temporal data in Bayesian linear regression model to support forecasting of time series,” in 2013 Federated Conference on Computer Science and Information Systems, FedCSIS 2013, 2013. ISBN 9781467344715 pp. 651–654.
  • 14. P. Wachter, T. Widmer, and A. Klein, “Predicting automotive sales using pre-purchase online search data,” in Proceedings of the 2019 Federated Conference on Computer Science and Information Systems, 2019. http://dx.doi.org/10.15439/2019F239 pp. 569–577.
  • 15. P. Doganis, A. Alexandridis, P. Patrinos, and H. Sarimveis, “Time series sales forecasting for short shelf-life food products based on artificial neural networks and evolutionary computing,” Journal of Food Engineering, vol. 75, no. 2, pp. 196–204, jul 2006. http://dx.doi.org/10.1016/j.jfoodeng.2005.03.056
  • 16. E. Tarallo, G. K. Akabane, C. I. Shimabukuro, J. Mello, and D. Amancio, “Machine learning in predicting demand for fast-moving consumer goods: An exploratory research,” IFAC-PapersOnLine, vol. 52, no. 13, pp. 737–742, 2019. http://dx.doi.org/10.1016/j.ifacol.2019.11.203
  • 17. T. Huang, R. Fildes, and D. Soopramanien, “The value of competitive information in forecasting FMCG retail product sales and the variable selection problem,” European Journal of Operational Research, vol. 237, no. 2, pp. 738–748, sep 2014. http://dx.doi.org/10.1016/j.ejor.2014.02.022
  • 18. V. Adithya Ganesan, S. Divi, N. B. Moudhgalya, U. Sriharsha, and V. Vijayaraghavan, “Forecasting food sales in a multiplex using dynamic artificial neural networks,” in Advances in Intelligent Systems and Computing, vol. 944. Springer Verlag, 2020. http://dx.doi.org/10.1007/978-3-030-17798-0_8. ISBN 9783030177973. ISSN 21945365 pp. 69–80.
  • 19. I. Islek and S. Gunduz Oguducu, “A decision support system for demand forecasting based on classifier ensemble,” in Communication Papers of the 2017 Federated Conference on Computer Science and Information Systems, 2017. http://dx.doi.org/10.15439/2017F224 pp. 35–41.
  • 20. S. Thomassey and A. Fiordaliso, “A hybrid sales forecasting system based on clustering and decision trees,” Decision Support Systems, vol. 42, no. 1, pp. 408–421, oct 2006. http://dx.doi.org/10.1016/j.dss.2005.01.008
  • 21. A. Krishna, V. Akhilesh, A. Aich, and C. Hegde, “Sales-forecasting of retail stores using machine learning techniques,” in 2018 International Conference on Computational Systems and Information Technology for Sustainable Solutions (CSITSS). IEEE, 2018. http://dx.doi.org/10.1109/CSITSS.2018.8768765. ISBN 9781538660782 pp. 160–166.
  • 22. R. C. Blattberg and A. Levin, “Modelling the effectiveness and profitability of trade promotions,” Marketing Science, vol. 6, no. 2, pp. 124–146, 1987. http://dx.doi.org/10.1287/mksc.6.2.124
  • 23. J. Zhang and M. Wedel, “The effectiveness of customized promotions in online and offline stores,” Journal of Marketing Research, vol. 46, no. 2, pp. 190–206, 2009. http://dx.doi.org/10.1509/jmkr.46.2.190
  • 24. K. H. Van Donselaar, J. Peters, A. De Jong, and R. Broekmeulen, “Analysis and forecasting of demand during promotions for perishable items,” International Journal of Production Economics, vol. 172, pp. 65–75, feb 2016. http://dx.doi.org/10.1016/j.ijpe.2015.10.022
  • 25. J. R. Trapero, N. Kourentzes, and R. Fildes, “On the identification of sales forecasting models in the presence of promotions,” Journal of the Operational Research Society, vol. 66, no. 2, pp. 299–307, 2015. http://dx.doi.org/10.1057/jors.2013.174
  • 26. G. Cui, M. L. Wong, and H. K. Lui, “Machine learning for direct marketing response models: Bayesian networks with evolutionary programming,” Management Science, vol. 52, no. 4, pp. 597–612, 2006. http://dx.doi.org/10.1287/mnsc.1060.0514
  • 27. Ö. G. Ali, S. Sayin, T. van Woensel, and J. Fransoo, “SKU demand forecasting in the presence of promotions,” Expert Systems with Applications, vol. 36, no. 10, pp. 12 340–12 348, dec 2009. http://dx.doi.org/10.1016/j.eswa.2009.04.052
  • 28. T. Chen and C. Guestrin, “XGBoost: A scalable tree boosting system,” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, vol. KDD ’16. ACM, 2016. http://dx.doi.org/10.1145/2939672.2939785. ISBN 9781450342322 pp. 785–794. [Online]. Available: http://doi.acm.org/10.1145/2939672.2939785
  • 29. L. Torlay, M. Perrone-Bertolotti, E. Thomas, and M. Baciu, “Machine learning–XGBoost analysis of language networks to classify patients with epilepsy,” Brain Informatics, vol. 4, no. 3, pp. 159–169, sep 2017. http://dx.doi.org/10.1007/s40708-017-0065-7
  • 30. D. Zhang, L. Qian, B. Mao, C. Huang, B. Huang, and Y. Si, “A data-driven design for fault detection of wind turbines using random forests and XGboost,” IEEE Access, vol. 6, pp. 21 020–21 031, mar 2018. http://dx.doi.org/10.1109/ACCESS.2018.2818678
  • 31. J. Nobre and R. F. Neves, “Combining principal component analysis, discrete wavelet transform and XGBoost to trade in the financial markets,” Expert Systems with Applications, vol. 125, pp. 181–194, jul 2019. http://dx.doi.org/10.1016/j.eswa.2019.01.083
  • 32. A. B. Parsa, A. Movahedi, H. Taghipour, S. Derrible, and A. K. Mohammadian, “Toward safer highways, application of XGBoost and SHAP for real-time accident detection and feature analysis,” Accident Analysis and Prevention, vol. 136, p. 105405, mar 2020. http://dx.doi.org/10.1016/j.aap.2019.105405
  • 33. Y. Wang and X. S. Ni, “A XGBoost risk model via feature selection and Bayesian hyper-parameter optimization,” International Journal of Database Management Systems, vol. 11, no. 1, pp. 1–17, jan 2019. [Online]. Available: http://arxiv.org/abs/1901.08433
  • 34. M. Nishio, M. Nishizawa, O. Sugiyama, R. Kojima, M. Yakami, T. Kuroda, and K. Togashi, “Computer-aided diagnosis of lung nodule using gradient tree boosting and Bayesian optimization,” PLoS ONE, vol. 13, no. 4, apr 2018. http://dx.doi.org/10.1371/journal.pone.0195875
  • 35. Y. Xia, C. Liu, Y. Y. Li, and N. Liu, “A boosted decision tree approach using Bayesian hyper-parameter optimization for credit scoring,” Expert Systems with Applications, vol. 78, pp. 225–241, jul 2017. http://dx.doi.org/10.1016/j.eswa.2017.02.017
  • 36. T. Chen, T. He, M. Benesty, V. Khotilovich, Y. Tang, H. Cho, K. Chen, R. Mitchell, I. Cano, T. Zhou, M. Li, J. Xie, M. Lin, Y. Geng, and Y. Li, xgboost: Extreme Gradient Boosting, 2019, R package version 0.90.0.2. [Online]. Available: https://CRAN.R-project.org/package=xgboost
Notes
1. Track 1: Artificial Intelligence
2. Technical Session: 15th International Symposium Advances in Artificial Intelligence and Applications
3. The record was compiled with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme - module: Popularisation of science and promotion of sport (2021).
YADDA identifier
bwmeta1.element.baztech-426167f9-3bd9-4192-88de-f60578488933