Article title

Applications of Kolmogorov Complexity and Universal Codes to Nonparametric Estimation of Characteristics of Time Series

Authors
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
We consider finite-alphabet and real-valued time series and the following four problems: i) estimation of the (limiting) probability P(x_0 ... x_s) for every s and each sequence x_0 ... x_s of letters from the process alphabet (or estimation of the density p(x_0, ..., x_s) for real-valued time series); ii) so-called on-line prediction, where the conditional probability P(x_{t+1} | x_1 x_2 ... x_t) (or the conditional density p(x_{t+1} | x_1 x_2 ... x_t)) is to be estimated, given x_1 x_2 ... x_t; iii) regression; and iv) classification (or so-called problems with side information). We show that Kolmogorov complexity (KC) and universal codes (universal data compressors), whose codeword length can be regarded as an estimate of KC, can serve as a basis for constructing asymptotically optimal methods for the above problems. (By definition, a universal code can "compress" any sequence generated by a stationary and ergodic source asymptotically down to the Shannon entropy of the source.)
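The estimator described in the abstract can be sketched in a few lines: the codeword length of a real compressor stands in for Kolmogorov complexity, and each candidate next letter is scored by how many extra bits the compressor spends on that continuation. The sketch below is only an illustration of the idea, not the paper's construction; it uses Python's zlib as a stand-in compressor (zlib is not a universal code in the theoretical sense, and its byte-granular codeword lengths make the estimate crude for short sequences).

```python
import zlib

def code_length(seq: bytes) -> int:
    # Codeword length in bits under zlib at maximum compression level;
    # a practical stand-in for the length of a universal code.
    return 8 * len(zlib.compress(seq, 9))

def predict_next(history: bytes, alphabet: bytes) -> dict:
    """Compression-based estimate of P(x_{t+1} = a | x_1 ... x_t).

    Each candidate letter a is weighted by 2^{-(l(history + a) - l(history))},
    i.e. by the extra codeword length it costs, and the weights are
    normalised over the alphabet to form a probability distribution.
    """
    base = code_length(history)
    scores = {chr(a): 2.0 ** (base - code_length(history + bytes([a])))
              for a in alphabet}
    total = sum(scores.values())
    return {a: s / total for a, s in scores.items()}

# Toy usage: predict the next letter of a long alternating binary sequence.
probs = predict_next(b"ab" * 50, b"ab")
```

With a genuinely universal code in place of zlib, these normalised weights converge to the true conditional probabilities for stationary ergodic sources, which is the basis of the prediction method discussed in the paper.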
Publisher
Year
Pages
177--196
Physical description
Bibliography: 46 items
Contributors
author
  • Siberian State University of Telecommunications and Informatics of Computational Technologies of Siberian Branch of Russian Academy of Science, Kirov Street,86, 630102, Novosibirsk, Russia, boris@ryabko.net
Bibliography
  • [1] Algoet, P.: Universal Schemes for Learning the Best Nonlinear Predictor Given the Infinite Past and Side Information. IEEE Trans. Inform. Theory, 45, 1999, 1165-1185.
  • [2] Barron, A.R.: The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. The Annals of Probability, 13(4), 1985, 1292-1303.
  • [3] Begleiter, R., El-Yaniv, R.: Superior Guarantees for Sequential Prediction and Lossless Compression via Alphabet Decomposition. Journal of Machine Learning Research, 7, 2006, 379-411.
  • [4] Billingsley, P.: Ergodic theory and information, John Wiley & Sons, 1965.
  • [5] Calude, C. S., Staiger, L., Svozil, K.: Randomness Relative to Cantor Expansions. CDMTCS Research Report Series, 2003, http://www.es.auckland.ac.nz/CDMTCS//researchreports/213cris.pdf
  • [6] Calude, C.S., Staiger, L., Terwijn, S.A.: On Partial Randomness. CDMTCS Research Report Series, 2004, http://www.es.auckland.ac.nz/CDMTCS//researchreports/239cris.pdf
  • [7] Cilibrasi, R., Vitanyi, P.M.B.: Clustering by Compression. IEEE Transactions on Information Theory, 51(4), 2005.
  • [8] Cilibrasi, R., de Wolf, R., Vitanyi, P.M.B.: Algorithmic Clustering of Music. Computer Music Journal, 28(4), 2004, 49-67.
  • [9] Cover, T.M., Thomas, J.A.: Elements of Information Theory. John Wiley & Sons, 1991.
  • [10] Csiszar, I., Korner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems. Akademiai Kiadó, Budapest, 1981.
  • [11] Csiszar, I., Shields, P.: The consistency of the BIC Markov order estimation. Annals of Statistics, 2000, 6, 1601-1619.
  • [12] Darbellay, G.A., Vajda, I.: Estimation of the mutual information with data-dependent partitions. IEEE Trans. Inform. Theory, 48(5), 1999, 1061-1081.
  • [13] Effros, M., Visweswariah, K., Kulkarni, S.R., Verdu, S.: Universal lossless source coding with the Burrows Wheeler transform. IEEE Trans. Inform. Theory, 45, 1999, 1315-1321.
  • [14] Feller, W.: An Introduction to Probability Theory and Its Applications, Vol. 1. John Wiley & Sons, New York, 1970.
  • [15] Gallager, R.G.: Information Theory and Reliable Communication. John Wiley & Sons, New York, 1968.
  • [16] Hutter, M.: Universal Artificial Intelligence. Sequential Decisions based on algorithmic probability, Springer-Verlag, 2005.
  • [17] Kelly, J.L.: A new interpretation of information rate, Bell System Tech. J., 35, 1956, 917-926.
  • [18] Kieffer, J.: Prediction and Information Theory, Preprint, 1998. (available at ftp://oz.ee.umn.edu/users/kieffer/papers/prediction.pdf/)
  • [19] Kieffer, J.C., En-Hui Yang: Grammar-based codes: a new class of universal lossless source codes. IEEE Transactions on Information Theory, 46(3), 2000, 737-754.
  • [20] Knuth, D.E.: The art of computer programming. Vol.2. Addison Wesley, 1981.
  • [21] Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems of Inform. Transmission, 1, 1965, 3-11.
  • [22] Krichevsky, R.: A relation between the plausibility of information about a source and encoding redundancy, Problems Inform. Transmission, 4(3), 1968, 48-57.
  • [23] Krichevsky, R.: Universal Compression and Retrieval. Kluwer Academic Publishers, 1993.
  • [24] Kullback, S.: Information Theory and Statistics, Wiley, New York, 1959.
  • [25] Li, M., Vitanyi, P.: An Introduction to Kolmogorov Complexity and Its Applications, Springer-Verlag, New York, 2nd Edition, 1997.
  • [26] Modha, D.S., Masry, E.: Memory-universal prediction of stationary random processes. IEEE Trans. Inform. Theory, 44(1), 1998, 117-133.
  • [27] Nobel, A.B.: On optimal sequential prediction. IEEE Trans. Inform. Theory, 49(1), 2003, 83-98.
  • [28] Rissanen, J.: Generalized Kraft inequality and arithmetic coding. IBM J. Res. Dev., 20(5), 1976, 198-203.
  • [29] Rissanen, J.: Modeling by shortest data description. Automatica, v.14, 1978, 465-471.
  • [30] Rissanen, J.: Universal coding, information, prediction, and estimation. IEEE Trans. Inform. Theory, 30(4), 1984, 629-636.
  • [31] Ryabko, B.Ya.: Twice-universal coding. Problems of Information Transmission, 20(3), 1984, 173-177.
  • [32] Ryabko, B.Ya.: Prediction of random sequences and universal coding. Problems of Inform. Transmission, 24(2), 1988, 87-96.
  • [33] Ryabko, B.Ya.: A fast adaptive coding algorithm. Problems of Inform. Transmission, 26(4), 1990, 305-317.
  • [34] Ryabko, B. Ya.: The complexity and effectiveness of prediction algorithms. J. Complexity, 10 (3), 1994, 281-295.
  • [35] Ryabko, B., Astola, J.: Universal Codes as a Basis for Time Series Testing. Statistical Methodology, 3, 2006, 375-397.
  • [36] Ryabko, B., Astola, J., Gammerman, A.: Application of Kolmogorov complexity and universal codes to identity testing and nonparametric testing of serial independence for time series. Theoretical Computer Science, 359, 2006, 440-448.
  • [37] Ryabko, B. Ya., Monarev, V.A.: Using information theory approach to randomness testing. Journal of Statistical Planning and Inference, 2005, 133(1), 95-110.
  • [38] Ryabko, B., Topsøe, F.: On Asymptotically Optimal Methods of Prediction and Adaptive Coding for Markov Sources. Journal of Complexity, 18(1), 2002, 224-241.
  • [39] Ryabko, D., Hutter, M.: Sequence prediction for non-stationary processes. In: Proc. IEEE International Symposium on Information Theory (ISIT 2007), 2007, 2346-2350. See also http://arxiv.org/pdf/cs.LG/0606077
  • [40] Staiger, L.: The Kolmogorov complexity of infinite words. CDMTCS Research Report Series, 2006, http://www.es.auckland.ac.nz/CDMTCS//researchreports/2791udwig.pdf
  • [41] Szpankowski, W.: Average case analysis of algorithms on sequences. John Wiley & Sons, New York, 2001.
  • [42] Uspenskii, V.A., Semenov, A.L., Shen, A.K.: Can an individual sequence of zeros and ones be random?, Russian Mathematical Surveys, 45, 1990.
  • [43] Vereshchagin, N., Vitanyi, P.M.B.: Kolmogorov's structure functions with application to the foundations of model selection. In: Proc. 43rd Symposium on Foundations of Computer Science, 2002, 751-760.
  • [44] Vitanyi, P.M.B., Li, M.: Minimum description length induction, Bayesianism, and Kolmogorov complexity. IEEE Trans. Inform. Theory, 46(2), 2000, 446-464.
  • [45] Willems, F.M.J., Shtarkov, Y.M., Tjalkens, Tj.J.: The Context-Tree Weighting Method: Basic Properties. IEEE Trans. Inform. Theory, 41, 1995, 653-664.
  • [46] Willems, F.M.J., Shtarkov, Y.M., Tjalkens, Tj.J.: Reflections on 'The Context-Tree Weighting Method: Basic Properties'. Newsletter of the IEEE Information Theory Society, 47(1), 1997.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BUS5-0015-0046