Search results
Searched in keywords: Fisher information
Results found: 5
1
Study on application of Fisher information for power system fault detection
EN
The ability to detect power system faults accurately is of vital importance for isolating malfunctioning equipment and resuming normal operation as soon as possible after a fault occurs. A variety of electric parameters have long been used as metrics for identifying faults. The method proposed in this paper departs from the traditional approach by introducing Fisher information (FI) as a measure of the stability of electric signals and as a criterion for making fault decisions. In this way, a non-dimensional positive parameter serves as a single criterion for fault detection in power distribution networks. First, we simplified the formula of FI and then adopted a practical method for calculating its values. We demonstrated the application of FI to measuring the stability of electric signals. Finally, we combined FI with wavelet analysis to propose a novel technique for phase selection in a power distribution network with a grounding short-circuit fault, namely wavelet-based Fisher information (WFI). Simulation studies were then carried out to show the feasibility of the proposed method.
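The abstract does not spell out how FI is discretized for a sampled signal. As an illustration only, a common histogram-based discrete Fisher information measure, I = Σ_k (p_{k+1} − p_k)² / p_k computed over a sliding window, can be sketched in Python; the function names and the step disturbance below are invented for the example and are not taken from the paper.

```python
import numpy as np

def discrete_fisher_information(window, bins=32):
    """Histogram-based discrete Fisher information of one signal window.

    Uses the common estimator I = sum_k (p[k+1] - p[k])**2 / p[k]
    (an assumption for illustration; the paper's exact discretization
    is not given in the abstract).
    """
    counts, _ = np.histogram(window, bins=bins)
    p = counts / counts.sum()            # normalized probability distribution
    p = np.clip(p, 1e-12, None)          # avoid division by zero in empty bins
    return float(np.sum(np.diff(p) ** 2 / p[:-1]))

# Example: a steady sinusoid vs. a window containing a step disturbance
t = np.linspace(0.0, 1.0, 1000)
steady = np.sin(2 * np.pi * 50 * t)
faulted = steady.copy()
faulted[500:] += 2.0                     # crude stand-in for a fault transient
print(discrete_fisher_information(steady), discrete_fisher_information(faulted))
```

In such a sketch, the change of I between consecutive windows could then serve as the fault indicator, in the spirit of the single-criterion detection described above.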
2
The Fisher information and exponential families parametrized by a segment of means
EN
We consider natural and general exponential families (Q_m)_{m ∈ M} on R^d parametrized by the means. We study the submodels (Q_{θm₁ + (1−θ)m₂})_{θ ∈ [0,1]}, parametrized by a segment in the means domain, from the point of view of the Fisher information. Such a parametrization allows for a parsimonious model and is particularly useful in practical situations when hesitating between two parameters m₁ and m₂. The most interesting cases are multivariate Gaussian and Wishart models with matrix parameters.
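For intuition, here is one worked special case, not taken from the paper: a multivariate Gaussian family with known covariance Σ restricted to the segment between two means.

```latex
% Illustrative special case (not quoted from the abstract):
% Gaussian family N(m, \Sigma) with known \Sigma, restricted to the mean segment.
\[
  Q_\theta = N\bigl(\theta m_1 + (1-\theta)\, m_2,\; \Sigma\bigr),
  \qquad \theta \in [0,1],
\]
\[
  \frac{\partial}{\partial \theta} \log f_\theta(x)
    = (m_1 - m_2)^{\top} \Sigma^{-1}
      \bigl(x - \theta m_1 - (1-\theta)\, m_2\bigr),
  \qquad
  I(\theta) = (m_1 - m_2)^{\top} \Sigma^{-1} (m_1 - m_2).
\]
```

In this special case the Fisher information of the segment parameter θ is constant along the segment, which is what makes the one-dimensional submodel so parsimonious to work with.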
3
Teoria informacji a statystyka matematyczna (Information theory and mathematical statistics)
PL
This article presents an outline of information theory from the probabilistic and statistical point of view. This strand of information theory has developed intensively in recent decades and has significantly influenced the development of statistical methods. The aim of the article is to introduce the reader, in an accessible way, to the subject indicated above, to provide some intuitions, and to convey the specifics of the information-theoretic approach in mathematical statistics.
EN
In the paper we present an outline of information theory from the probabilistic and statistical point of view. This direction of information theory has been intensively developed in recent decades and has significantly influenced progress in statistical methodology. The aim of the article is to introduce the reader to these problems, provide some intuitions and acquaint him or her with the specific information-theoretic approach to mathematical statistics. The first part of the paper is devoted to a brief and accessible introduction to the main notions of information theory, such as entropy, relative entropy (Kullback-Leibler distance), information projection and Fisher information, as well as a presentation of their most important properties, including de Bruijn's identity, Fisher information inequalities and entropy power inequalities. In the short second part we give applications of the notions and results from the first part to limit theorems of probability theory, such as the asymptotic equipartition property, the convergence of empirical measures in the entropy distance, the large deviation principle with emphasis on Sanov's theorem, the convergence of distributions of homogeneous Markov chains in the entropy distance, and the central limit theorem. The main, last part of the article shows some of the most significant applications of information theory to mathematical statistics. We discuss connections of maximum likelihood estimators with information projections, and the notion of a sufficient statistic from the information-theoretic point of view. The problems of source coding, channel capacity and the amount of information provided by statistical experiments are presented in a statistical framework. Some attention is paid to the expansion of Clarke and Barron and its corollaries, e.g. in density estimation. Next, applications of information theory to hypothesis testing are discussed. We give the classical Stein lemma and its generalization to testing composite hypotheses obtained by Bahadur, and show their connections with the asymptotic efficiency of statistical tests. Finally, we briefly mention the problem of information criteria in model selection, including the most popular two-stage minimum description length criterion of Rissanen. The bibliography is limited to papers and books referred to in the article.
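As a small numerical illustration of how these notions fit together (this example is not from the article): for a smooth one-parameter family, the relative entropy between nearby members behaves like D(P_θ ‖ P_{θ+δ}) ≈ ½ I(θ) δ², and the Bernoulli family, whose Fisher information is I(p) = 1/(p(1−p)), makes this easy to check in a few lines of Python.

```python
import numpy as np

def kl_bernoulli(p, q):
    """Relative entropy D(Ber(p) || Ber(q)) in nats."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def fisher_bernoulli(p):
    """Fisher information of the Bernoulli family: I(p) = 1 / (p (1 - p))."""
    return 1.0 / (p * (1.0 - p))

p, delta = 0.3, 1e-3
kl = kl_bernoulli(p, p + delta)
quadratic = 0.5 * fisher_bernoulli(p) * delta ** 2
print(kl, quadratic)   # agree to leading order in delta
```

For δ = 10⁻³ the two printed numbers agree to several significant digits, which is the local link between the Kullback-Leibler distance and Fisher information that underlies several of the results surveyed above.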
4
EN
An operational model of portfolio selection is presented. The aim of a risk-neutral investor is to select the best portfolio composed of assets with the highest rates of return. The term "best" means that the probability of attaining the return required by the investor is the largest over all admissible stopping times at which he or she has the right to buy information concerning the random vector of returns from the investments (assets).
PL
An operational model of investment portfolio selection is presented. The aim of a risk-neutral investor is to choose the best portfolio composed of assets with the highest rates of return. The term "best" means that the probability of attaining the return required by the investor is the highest over all admissible stopping times at which the investor has the right to buy information concerning the random vector of returns on the investments (assets).
5
Point regularity of p-stable density in R^d and Fisher information
EN
In the paper we prove that the n-th directional derivative of a p-stable density f(x) in the direction a can be estimated by [formula], where 0 < u < 1, and C depends also on geometrical properties of the Lévy measure. This inequality helps us to calculate the Fisher information of stable measures.
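The estimate itself is left as in the source ([formula] is not reproduced here). Purely as an illustration of the quantity such bounds feed into, the Fisher information of location, I(f) = ∫ (f′(x))² / f(x) dx, can be evaluated numerically for the symmetric α = 1 stable (Cauchy) density, where its exact value is 1/2. A minimal sketch, with helper names invented for the example:

```python
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x):
    """Standard Cauchy density, the symmetric alpha = 1 stable law."""
    return 1.0 / (np.pi * (1.0 + x * x))

def cauchy_pdf_prime(x):
    """Derivative of the standard Cauchy density."""
    return -2.0 * x / (np.pi * (1.0 + x * x) ** 2)

def location_fisher_information(pdf, pdf_prime, lim=200.0):
    """Fisher information of the location parameter: I = int (f')^2 / f dx."""
    integrand = lambda x: pdf_prime(x) ** 2 / pdf(x)
    value, _ = quad(integrand, -lim, lim)
    return value

print(location_fisher_information(cauchy_pdf, cauchy_pdf_prime))  # ~0.5
```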