Results found: 3

Search results
keywords: source coding
Supporting source code annotations with metadata-aware development environment
To augment source code with high-level metadata, with the intent of facilitating program comprehension, a programmer can use annotations. There are several types of annotations: those put directly in the code and external ones. Each type comes with a unique workflow and inherent limitations. In this paper, we present a tool providing a uniform annotation process, which also adds custom metadata-awareness to an industrial IDE. We also report an experiment in which we examined whether the created annotation support helps programmers annotate code with comments faster and more consistently. The experiment showed that with the tool the annotating consistency was significantly higher, but the increase in annotating speed was not statistically significant.
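The two annotation styles contrasted in the abstract can be sketched as follows. This is a minimal illustration with invented names, not the paper's tool: the same piece of metadata is either embedded in the code as a structured comment, or kept in an external store keyed by source location.

```python
# In-code annotation: a structured comment attached directly to the
# function it describes (the "@concern:" tag is a hypothetical convention).
# @concern: input-validation
def parse_port(value: str) -> int:
    port = int(value)
    if not 0 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

# External annotation: the same metadata stored separately from the code,
# keyed by (file, symbol); this mapping must be kept in sync manually.
EXTERNAL_ANNOTATIONS = {
    ("parser.py", "parse_port"): ["concern: input-validation"],
}
```

The trade-off the paper targets is visible even in this toy: the in-code comment travels with the function, while the external entry survives code reformatting but can silently go stale when the symbol is renamed.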
Teoria informacji a statystyka matematyczna (Information theory and mathematical statistics)
This article presents an outline of information theory from a probabilistic and statistical point of view. This strand of information theory has developed intensively in recent decades and has significantly influenced the development of statistical methods. The aim of the article is to give the reader an accessible introduction to these topics, to provide some intuitions, and to convey the specific character of the information-theoretic approach in mathematical statistics.
In the paper we present an outline of information theory from the probabilistic and statistical point of view. This direction of information theory has been intensively developed in recent decades and has significantly influenced progress in statistical methodology. The aim of the article is to introduce the reader to these problems, provide some intuitions, and convey the specifics of the information-theoretic approach to mathematical statistics. The first part of the paper is devoted to a brief and accessible introduction to the main notions of information theory, such as entropy, relative entropy (Kullback-Leibler distance), information projection, and Fisher information, as well as a presentation of their most important properties, including de Bruijn's identity, Fisher information inequalities, and entropy power inequalities. In the short second part we give applications of the notions and results from the first part to limit theorems of probability theory, such as the asymptotic equipartition property, the convergence of empirical measures in the entropy distance, the large deviation principle with emphasis on Sanov's theorem, the convergence of distributions of homogeneous Markov chains in the entropy distance, and the central limit theorem. The main, last part of the article shows some of the most significant applications of information theory to mathematical statistics. We discuss connections of maximum likelihood estimators with information projections, and the notion of a sufficient statistic from the information-theoretic point of view. The problems of source coding, channel capacity, and the amount of information provided by statistical experiments are presented in a statistical framework. Some attention is paid to the expansion of Clarke and Barron and its corollaries, e.g. in density estimation. Next, applications of information theory to hypothesis testing are discussed.
We give the classical Stein's Lemma and its generalization to testing composite hypotheses obtained by Bahadur, and show their connections with the asymptotic efficiency of statistical tests. Finally, we briefly mention the problem of information criteria in model selection, including the most popular two-stage minimal description length criterion of Rissanen. The enclosed literature is limited to papers and books referred to in the paper.
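Two of the central notions named in the abstract, entropy and relative entropy (Kullback-Leibler distance), are easy to compute for finite distributions. A minimal sketch (not from the paper; the distributions are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i == 0 contribute 0 by convention."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i * log2(p_i / q_i), in bits.
    Requires q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform))                 # 2.0 bits: maximal for 4 symbols
print(entropy(skewed))                  # strictly less than 2.0 bits
print(kl_divergence(skewed, uniform))   # positive; zero only when p == q
```

The connection to source coding mentioned in the abstract: H(p) is the per-symbol lower bound on lossless code length, and D(p||q) is the excess cost of coding a source p with a code optimized for q.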
Limitations on memory and resources of communication systems require powerful data compression methods. Decompression of a compressed data stream is very sensitive to errors that arise during transmission over noisy channels, so error correction coding is also required. One solution to this problem is joint source and channel coding. This paper describes methods of joint source-channel coding based on the popular data compression algorithms LZ'77 and LZSS. These methods can introduce some error resiliency into the compressed data stream without degrading the compression ratio. We analyze joint source and channel coding algorithms based on these compression methods and present novel extensions of them. We also present simulation results showing the usefulness and achievable quality of the analyzed algorithms.
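For readers unfamiliar with the LZSS family the abstract builds on, a toy greedy LZSS coder is sketched below. It shows the dictionary structure (literals interleaved with back-references into a sliding window) that makes the decoded stream so sensitive to channel errors: one corrupted reference desynchronizes everything after it. This is a plain illustration of LZSS, not the paper's joint source-channel variant.

```python
def lzss_compress(data: bytes, window: int = 255, min_match: int = 3):
    """Greedy LZSS: emit a literal byte, or an (offset, length)
    back-reference into the last `window` bytes when a match of at
    least `min_match` bytes is found."""
    out = []
    i = 0
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            length = 0
            # Overlapping matches (j + length >= i) are allowed, as in LZ77.
            while (i + length < len(data)
                   and data[j + length] == data[i + length]
                   and length < 255):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len >= min_match:
            out.append(("ref", best_off, best_len))
            i += best_len
        else:
            out.append(("lit", data[i]))
            i += 1
    return out

def lzss_decompress(tokens) -> bytes:
    """Replay tokens; back-references copy byte by byte so that
    overlapping references (offset < length) expand correctly."""
    buf = bytearray()
    for tok in tokens:
        if tok[0] == "lit":
            buf.append(tok[1])
        else:
            _, off, length = tok
            for _ in range(length):
                buf.append(buf[-off])
    return bytes(buf)
```

Because each back-reference is resolved against already-decoded output, flipping a single offset or length field corrupts not just one token but every later byte that references the damaged region; this error propagation is precisely what joint source-channel schemes try to contain.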