This study demonstrates how algorithms can assist human decision-making in the apparel industry. A two-stage method combining product suggestion and intelligent forecasting is proposed. In the first stage, a web crawler browses a B2C apparel website to identify popular products. In the second stage, machine learning methods predict the sales demand for new products. Additionally, we use Google Trends to collect external information indices that adjust the demand forecast. Our numerical study shows that the intelligent forecasting approach effectively reduces the Mean Square Error (MSE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE) by at least 45.79%, 26.35%, and 26.34%, respectively.
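The three error metrics named above have standard definitions, which can be sketched as follows. The function names and the sample data are illustrative, not taken from the study:

```python
import math

def forecast_errors(actual, predicted):
    """Compute MSE, RMSE, and MAPE (in percent) for a forecast.

    Standard definitions; not the study's code, just the metrics it reports.
    """
    n = len(actual)
    mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n
    rmse = math.sqrt(mse)
    # MAPE assumes no actual value is zero.
    mape = 100.0 * sum(abs(a - p) / a for a, p in zip(actual, predicted)) / n
    return mse, rmse, mape

# Hypothetical demand figures, purely for illustration.
actual = [100.0, 120.0, 80.0]
predicted = [110.0, 115.0, 90.0]
mse, rmse, mape = forecast_errors(actual, predicted)
print(mse, round(rmse, 3), round(mape, 3))  # 75.0 8.66 8.889
```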
Crawling systems available on the market are usually closed solutions dedicated to a particular kind of task. There is, however, a significant group of users who require an all-in-one studio, not only for executing and running Internet robots, but also for graphically (re)defining and (re)composing crawlers according to dynamically changing requirements and use cases. The Cassiopeia framework addresses this need. The crucial aspect of its efficiency and scalability is the concurrency model applied. One promising model is the staged event-driven architecture (SEDA), which provides useful benefits such as splitting an application into separate stages connected by event queues, a property well suited to Cassiopeia's assumptions about crawler (re)composition. The goal of this paper is to present the idea and a proof-of-concept (PoC) implementation of the Cassiopeia framework, with special attention paid to its crucial architectural element: the design, implementation, and application of the staged event-driven architecture.
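The core SEDA idea mentioned above, separate stages connected by event queues, can be illustrated with a minimal sketch. This is a generic toy pipeline (fetch, parse, store), not Cassiopeia's actual implementation; all names are hypothetical:

```python
import queue
import threading

class Stage:
    """A SEDA-style stage: an event queue plus a handler thread.

    Each stage consumes events from its own queue and forwards results
    to the next stage's queue, so stages can be (re)composed freely.
    """
    def __init__(self, name, handler, out=None):
        self.name = name
        self.handler = handler
        self.inbox = queue.Queue()   # this stage's event queue
        self.out = out               # downstream stage, or None for a sink

    def start(self):
        t = threading.Thread(target=self._run, daemon=True)
        t.start()
        return t

    def _run(self):
        while True:
            event = self.inbox.get()
            if event is None:        # poison pill: propagate shutdown downstream
                if self.out:
                    self.out.inbox.put(None)
                break
            result = self.handler(event)
            if self.out:
                self.out.inbox.put(result)

results = []
# Compose a three-stage crawler pipeline; swapping a handler recomposes it.
store = Stage("store", results.append)
parse = Stage("parse", str.upper, out=store)
fetch = Stage("fetch", lambda url: f"page from {url}", out=parse)

threads = [s.start() for s in (store, parse, fetch)]
for u in ("example.org", "example.net"):
    fetch.inbox.put(u)
fetch.inbox.put(None)                # signal end of input
for t in threads:
    t.join()
print(results)  # ['PAGE FROM EXAMPLE.ORG', 'PAGE FROM EXAMPLE.NET']
```

Because each stage owns its queue, stages can be added, removed, or rewired without touching the others, which is the property the abstract highlights for crawler (re)composition.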