Article title

Greedy incremental support vector regression

Identifiers
Title variants
Conference
Federated Conference on Computer Science and Information Systems (14th; September 1-4, 2019; Leipzig, Germany)
Publication languages
EN
Abstracts
EN
Support Vector Regression (SVR) is a powerful supervised machine learning model, especially well suited to normalized or binarized data. However, its quadratic complexity in the number of training examples rules it out for training on large datasets, especially high-dimensional ones that require frequent retraining. We propose a simple two-stage greedy selection of training data for SVR that maximizes its validation set accuracy with the minimum number of training examples, and we illustrate the performance of this strategy in the context of the Clash Royale Challenge 2019, concerned with efficient prediction of card decks' win rates. Hundreds of thousands of labelled data examples were reduced to just hundreds, on which an optimized SVR was trained to maximize the validation R2 score. The proposed model took first place in the Clash Royale 2019 challenge, outperforming over a hundred competing teams from around the world.
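Below is a minimal sketch of the greedy incremental idea described in the abstract, assuming numpy arrays and scikit-learn's SVR and r2_score. The function name, the budget of 600 examples, the candidate batch size, and the SVR hyperparameters are all illustrative assumptions, not the authors' actual two-stage procedure or settings.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.metrics import r2_score

    def greedy_svr_selection(X_train, y_train, X_val, y_val,
                             budget=600, n_candidates=10, seed=0):
        # Grow a small training subset greedily: at each step, fit SVR on the
        # current subset plus one random candidate example, and keep the
        # candidate that improves the validation R2 score the most.
        rng = np.random.default_rng(seed)
        selected = list(rng.choice(len(X_train), size=n_candidates, replace=False))
        best_r2 = -np.inf
        while len(selected) < budget:
            pool = np.setdiff1d(np.arange(len(X_train)), selected)
            candidates = rng.choice(pool, size=min(n_candidates, len(pool)),
                                    replace=False)
            best_i = None
            for i in candidates:
                model = SVR(C=1.0, epsilon=0.01)  # illustrative hyperparameters
                idx = selected + [i]
                model.fit(X_train[idx], y_train[idx])
                r2 = r2_score(y_val, model.predict(X_val))
                if r2 > best_r2:
                    best_i, best_r2 = i, r2
            if best_i is None:  # no candidate improved validation R2; stop early
                break
            selected.append(best_i)
        return selected, best_r2

Because each refit uses only the few hundred selected examples rather than the full dataset, the quadratic training cost stays small even when the candidate pool holds hundreds of thousands of examples.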
Year
Volume
Pages
7–9
Physical description
Bibliography: 13 items, formulas, tables
Authors
author
  • EBTIC, Khalifa University, UAE
author
  • EBTIC, Khalifa University, UAE
  • Zalora, Singapore
Notes
1. Track 1: Artificial Intelligence and Applications
2. Technical Session: 14th International Symposium Advances in Artificial Intelligence and Applications
3. Record compiled with funding from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Social Responsibility of Science" programme, module: Popularization of Science and Promotion of Sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-38f3cd3e-f314-4007-a8ac-a34a288c61fb