Search results
Searched in keywords: information theory
Results found: 47
1
Uplift Modeling in Direct Marketing
100%
EN
Marketing campaigns directed to randomly selected customers often generate huge costs and a weak response. Moreover, such campaigns tend to unnecessarily annoy customers and make them less likely to respond to future communications. Precise targeting of marketing actions can potentially result in a greater return on investment. Usually, response models are used to select good targets. They aim at achieving high prediction accuracy for the probability of purchase, based on a sample of customers to whom a pilot campaign has been sent. However, to separate the impact of the action from other stimuli and from spontaneous purchases, we should model not the response probabilities themselves but the change in those probabilities caused by the action. The problem of predicting this change is known as uplift modeling, differential response analysis, or true lift modeling. In this work, tree-based classifiers designed for uplift modeling are applied to real marketing data and compared with traditional response models and with other uplift modeling techniques described in the literature. The experiments show that the proposed approaches outperform existing uplift modeling algorithms and demonstrate significant advantages of uplift modeling over traditional, response-based targeting.
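To make the contrast between response modeling and uplift modeling concrete, the sketch below estimates uplift with a simple two-model approach (one classifier for the treated group, one for the control group). The synthetic data, column layout, and use of scikit-learn are assumptions for illustration; the paper itself builds dedicated tree-based uplift classifiers.

```python
# Minimal two-model uplift sketch on synthetic data (illustrative assumption,
# not the paper's tree-based uplift method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 5))                  # customer features (synthetic)
treated = rng.integers(0, 2, size=n)         # 1 = received the campaign
# Synthetic purchase outcome: the campaign helps only when feature 0 is positive.
p_buy = 0.1 + 0.2 * treated * (X[:, 0] > 0)
y = rng.binomial(1, p_buy)

# Fit one response model per group.
m_t = RandomForestClassifier(random_state=0).fit(X[treated == 1], y[treated == 1])
m_c = RandomForestClassifier(random_state=0).fit(X[treated == 0], y[treated == 0])

# Uplift = predicted purchase probability if treated minus if not treated.
uplift = m_t.predict_proba(X)[:, 1] - m_c.predict_proba(X)[:, 1]
print("customers with highest estimated uplift:", np.argsort(uplift)[-5:])
```

Targeting the customers with the highest estimated uplift, rather than the highest predicted response, is exactly the shift in objective that the abstract describes.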
2
Stability of two natural homologous proteins with different folds
100%
EN
The applicability of a model for simulating the protein folding process is demonstrated using, as a test case, two homologous proteins with different folds: helical in 3BD1 and β-structural in 2PIJ [L. van Dorn, T. Newlove, S. Chang, W. Ingram, M. Cordes. Biochemistry 45, 10542 (2006)]. The folding process is assumed to be directed by a hydrophobic core, which drives the hydrophobic residues toward the center of the molecule and exposes the hydrophilic residues on the surface. The "fuzzy oil drop" model is expressed by a three-dimensional Gauss function that mimics the external force field; the value of the Gauss function is interpreted as the hydrophobicity density at any point within the protein body. The agreement between the idealized and observed hydrophobicity distributions (the latter calculated according to the Levitt function), measured using the Kullback-Leibler divergence entropy, is good for both homologous proteins despite their different folds. The structural differences turn out to be easily explainable on the basis of the "fuzzy oil drop" model.
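The comparison step of the "fuzzy oil drop" model reduces, in essence, to a Kullback-Leibler divergence between the observed and the idealized (Gaussian-derived) hydrophobicity profiles. The sketch below shows that computation for an arbitrary residue-wise profile; the profiles are synthetic placeholders, not data from 3BD1 or 2PIJ.

```python
# Kullback-Leibler divergence between observed and idealized hydrophobicity
# distributions -- a sketch with synthetic residue profiles.
import numpy as np

def kl_divergence(p, q):
    """D(p || q) in bits for two discrete distributions over residues."""
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

n_residues = 60
observed = np.abs(np.random.default_rng(1).normal(1.0, 0.3, n_residues))  # placeholder profile
positions = np.linspace(-3, 3, n_residues)
idealized = np.exp(-positions**2 / 2)  # 1D slice of the Gaussian "drop"

print("D(observed || idealized) =", kl_divergence(observed, idealized), "bits")
```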
EN
The problem considered in this article is the construction of an evaluation model that could subsequently be used in modeling and risk management. The work culminates in the construction of a new model based on observations of models used for risk management and on knowledge from information theory, machine learning, and artificial neural networks. The developed tools are trained online, exploiting their ability to automatically deduce rules from data while the model is being applied to evaluation tasks. The model consequently changes the data analysis stage, limits the amount of domain expertise required in the area where the assessment model is used and, to some extent, makes the shape of the model independent of the currently available data. These features increase its ability to generalize and to cope with data of previously undefined classes, and improve its resistance to gaps in the data. The performance of the model presented in this paper is tested and verified on real-life data resembling a potential practical application. Preliminary tests performed within the scope of this work indicate that the developed model can form a starting point for further research, as some of the mechanisms used show fairly high efficiency and flexibility.
4
Video Transmission Using Network Coding
100%
EN
Network coding is a relatively new technique in the field of information theory and coding theory. This emerging technique offers great benefits in communication networks, such as increased throughput and energy savings. In this paper, we evaluate network coding for video transmission scenarios. In these scenarios, the source nodes encode the video packets, while the intermediate nodes apply network coding before forwarding the encoded packets to the end nodes. Finally, the end nodes decode the received packets in order to recover the original video. H.264/MPEG-4 AVC is used as the video compression standard in this work, and the network simulator NS-2 is used for the simulations. Our results show that network coding improves the system throughput, reduces packet loss, and improves the video quality at the end nodes.
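A minimal sketch of the idea applied at an intermediate node: instead of forwarding two packets separately, the node sends their XOR, and an end node that already holds one packet can recover the other. The packet contents here are arbitrary byte strings, not H.264 data, and this is only the simplest network-coding primitive, not the full scheme evaluated in the paper.

```python
# Simplest network-coding primitive: an intermediate node XORs two packets;
# a receiver holding one of them recovers the other. (Illustrative only.)
def xor_packets(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"video-packet-A"
p2 = b"video-packet-B"

coded = xor_packets(p1, p2)            # sent by the intermediate node
recovered_p2 = xor_packets(coded, p1)  # end node already has p1
assert recovered_p2 == p2
print("recovered:", recovered_p2)
```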
5
100%
EN
It is demonstrated how to obtain the least biased description of single-particle spectra measured in all multiparticle production processes by using an information theory approach (also known as the MaxEnt approach). The case of e+e− annihilation into hadrons is discussed in more detail as an example. A comparison between the MaxEnt approach and a simple dynamical model based on a cascade process is presented as well.
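For context, the MaxEnt prescription referred to here yields the standard exponential form of the least biased distribution. The sketch below uses generic constraint functions f_k, not the specific observables of the paper.

```latex
% Maximize S = -\sum_i p_i \ln p_i subject to \sum_i p_i = 1 and the measured
% mean values \langle f_k \rangle = \sum_i p_i f_k(x_i); Lagrange multipliers give
\[
  p_i = \frac{1}{Z(\lambda_1,\dots,\lambda_m)}
        \exp\!\Big(-\sum_{k=1}^{m}\lambda_k f_k(x_i)\Big),
  \qquad
  Z = \sum_i \exp\!\Big(-\sum_{k=1}^{m}\lambda_k f_k(x_i)\Big),
\]
% with each \lambda_k fixed by \langle f_k \rangle = -\,\partial \ln Z / \partial \lambda_k .
```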
EN
Carl von Weizsäcker analyzes the concept of information in the framework of his own philosophy. He emphasizes, on the one hand, the irreversibility inherent in the structure of time and, on the other, the dependence of information on temporal relations. In his view, information constitutes an abstract, albeit real, element of the world related to form, structure, and order. Its quantitative and dynamical character assimilates it to matter and energy. Information does not exist outside material processes and human consciousness. Its dynamical aspect is closely connected with communication and with evolutionary and entropic processes. In Weizsäcker's view, information becomes a philosophical concept. His approach could hardly be regarded as a full theory of time and information; it is rather an attempt to grasp their mutual dependencies and meaning.
7
Informational Analysis of Security and Integrity
100%
EN
Formalisms for the analysis of systems of various natures specified by process algebras are proposed. They allow us to formalize security properties based on the absence of information flow, as well as properties of system integrity. The resulting properties are compared and discussed. We also present a quantification of these properties by means of information theory.
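One standard, textbook way to quantify such an information flow (not necessarily the exact measure adopted in the paper) is the mutual information between a confidential input H and a publicly observable output L:

```latex
\[
  I(H;L) \;=\; \mathcal{H}(L) - \mathcal{H}(L \mid H)
         \;=\; \sum_{h,\,l} p(h,l)\,\log_2\frac{p(h,l)}{p(h)\,p(l)},
\]
% absence of information flow (noninterference) corresponds to I(H;L) = 0.
```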
EN
This paper presents a methodology for the assessment and optimization of a groundwater quality monitoring network that takes into account evaluation criteria derived from Shannon information theory. The fundamental criteria derived from this theory are: (1) the marginal information entropy, which measures the amount of information contained in the data recorded at a sampling point, and (2) the transinformation (mutual information), which measures the amount of information shared between two sampling points. Transinformation can be interpreted as an index of the stochastic dependence between the random variables corresponding to groundwater quality data recorded at different sampling points of the monitoring network; it expresses the reduction of the uncertainty in one variable due to knowledge of the other. In the optimization problem, an objective function involving the transinformation of the investigated water quality parameters (Cl, Cu, Na) is minimized. To minimize the objective function, a simulated annealing algorithm, which allows a satisfactory sub-optimal solution to be found, was used. The proposed methodology was applied to optimize the groundwater monitoring network of the Żelazny Most contaminant reservoir, one of the world's biggest industrial waste disposal sites, which collects post-flotation contaminants originating from copper ore treatment. The results show an increase in the effectiveness of the monitoring network: the number of sampling points is reduced while an acceptable amount of information available in the network is maintained.
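The two criteria named above, marginal entropy of a sampling point and transinformation between two points, can be estimated directly from binned concentration records. A minimal sketch under those assumptions (synthetic data, simple histogram estimators, not the paper's exact estimators) follows.

```python
# Histogram estimators of marginal entropy H(X) and transinformation I(X;Y)
# for two sampling points of a monitoring network (synthetic data).
import numpy as np

def entropy(x, bins=10):
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def transinformation(x, y, bins=10):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

rng = np.random.default_rng(2)
cl_point_a = rng.normal(50, 5, 200)              # Cl concentration, point A
cl_point_b = cl_point_a + rng.normal(0, 2, 200)  # correlated record at point B

print("H(A)   =", entropy(cl_point_a), "bits")
print("I(A;B) =", transinformation(cl_point_a, cl_point_b), "bits")
```

A point whose record shares high transinformation with another contributes little new information, which is why minimizing transinformation lets the network shed redundant sampling points.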
9
Maximizing T-complexity
88%
EN
We investigate Mark Titchener's T-complexity, an algorithm which measures the information content of finite strings. After introducing the T-complexity algorithm, we turn our attention to a particular class of "simple" finite strings. By exploiting special properties of simple strings, we obtain a fast algorithm to compute the maximum T-complexity among strings of a given length, and our estimates of these maxima show that T-complexity differs asymptotically from Kolmogorov complexity. Finally, we examine how closely de Bruijn sequences resemble strings with high T-complexity.
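For reference, the T-complexity obtained from Titchener's T-decomposition of a string into T-prefixes with copy exponents k_1, ..., k_n is usually written as below; the paper's exact notation may differ.

```latex
% T-complexity from the T-decomposition of a string s (usual T-code definition):
\[
  C_T(s) \;=\; \sum_{i=1}^{n} \log_2\big(k_i + 1\big).
\]
```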
10
A Secure Non-monotonic Soft Concurrent Constraint Language
88%
EN
We present a fine-grained security model to enforce access control on the shared constraint store in Concurrent Constraint Programming (CCP) languages. We show the model for a non-monotonic version of Soft CCP (SCCP), which is an extension of CCP in which the constraints have a preference level associated with them; crisp constraints can be modeled in the same framework as well. In the considered non-monotonic soft version (NmSCCP), it is also possible to remove constraints from the store. The language can be used for coordinating agents on a common store of information that represents the set of shared resources. In such scenarios, it is clearly important to enforce integrity and confidentiality rights on the resources, for instance to hide part of the information from some agents or to prevent an agent from consuming too many resources. Finally, we present a bisimulation relation to check the equivalence of two programs written in this language.
EN
We assess the predictability limits of the large-scale cloud patterns in the boreal summer intraseasonal variability (BSISO), which are measured by the infrared brightness temperature, a proxy for convective activity. A recently developed nonlinear data analysis technique, nonlinear Laplacian spectral analysis (NLSA), is applied to the brightness temperature data, defining two highly intermittent spatial modes associated with the BSISO time series. A recently developed data-driven, physics-constrained low-order modeling strategy is then applied to these time series. The result is a four-dimensional system with two observed BSISO variables and two hidden variables involving correlated multiplicative noise through a nonlinear energy-conserving interaction. With the optimal parameters calibrated by information theory, the non-Gaussian fat-tailed probability distribution functions (PDFs), the autocorrelations, and the power spectrum of the model signals almost perfectly match those of the observed data. An ensemble prediction scheme incorporating an effective online data assimilation algorithm for determining the initial ensemble of the hidden variables shows useful prediction skill of at least 30 days in non-El Niño years, reaching 55 days in years with regular oscillations, while skillful prediction lasts for 18 days in the strong El Niño year (1998). Furthermore, the ensemble spread succeeds in indicating the forecast uncertainty. Although a reduced linear model with time-periodic stable-unstable damping is able to capture the non-Gaussian fat-tailed PDFs, it is less skillful in forecasting the BSISO in years with irregular oscillations, and the failure of its ensemble spread to include the truth indicates a failure to quantify the uncertainty. In addition, without the energy-conserving nonlinear interactions, the linear model is sensitive to parameter variations. Finally, a twin experiment with the nonlinear stochastic model shows skill comparable to that obtained with the observed data, suggesting that the nonlinear stochastic model is well suited to determining the predictability limits of the large-scale cloud patterns of the BSISO.
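The "physics-constrained low-order model with energy-conserving nonlinear interactions and multiplicative noise" mentioned above belongs to a well-known family of stochastic models; a schematic generic form is sketched below, without reproducing the paper's specific BSISO variables or coefficients.

```latex
% Schematic form of a physics-constrained low-order stochastic model with
% energy-conserving nonlinear interactions and multiplicative (state-dependent) noise:
\[
  \frac{d\mathbf{u}}{dt} = (L + D)\,\mathbf{u} + B(\mathbf{u},\mathbf{u})
  + \mathbf{F}(t) + \sigma(\mathbf{u})\,\dot{\mathbf{W}}(t),
  \qquad
  \mathbf{u}\cdot B(\mathbf{u},\mathbf{u}) = 0,
\]
% where L is skew-symmetric, D provides damping, the bilinear term B conserves
% energy, and \sigma(\mathbf{u}) couples the noise multiplicatively to the state.
```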
12
Generalizations of Rough Set Tools Inspired by Graph Theory
88%
EN
We introduce and study new generalizations of some rough set tools, namely the extended core, the generalized discernibility function, the discernibility space, and the maximum partitioner. All these concepts were first introduced during the application of rough set theory to graphs; here we show that they also have an interesting and useful interpretation in the general setting. Indeed, among other results, we prove that reducts can be computed in incremental polynomial time, we give conditions under which a partition coincides with an indiscernibility partition of a given information table, and we give conditions under which a discernibility matrix corresponds to an information table.
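To make the rough-set vocabulary above concrete, the sketch below computes an indiscernibility partition of a small information table for a chosen attribute subset; the table and attribute names are invented for illustration and are not taken from the paper.

```python
# Indiscernibility partition of an information table for a subset of attributes:
# objects with identical values on the chosen attributes fall into one block.
from collections import defaultdict

table = {                      # object -> attribute values (illustrative data)
    "x1": {"colour": "red",  "size": "small", "weight": "low"},
    "x2": {"colour": "red",  "size": "small", "weight": "high"},
    "x3": {"colour": "blue", "size": "large", "weight": "low"},
    "x4": {"colour": "blue", "size": "large", "weight": "low"},
}

def indiscernibility_partition(table, attributes):
    blocks = defaultdict(list)
    for obj, values in table.items():
        key = tuple(values[a] for a in attributes)
        blocks[key].append(obj)
    return list(blocks.values())

print(indiscernibility_partition(table, ["colour", "size"]))
# [['x1', 'x2'], ['x3', 'x4']] -- x1, x2 indiscernible w.r.t. colour and size
```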
13
Deterministic One-Way Turing Machines with Sublinear Space
88%
EN
Deterministic one-way Turing machines with sublinear space bounds are systematically studied. We distinguish among the notions of strong, weak, and restricted space bounds; the latter is motivated by the study of P automata, where the space available on the work tape depends on the number of input symbols read so far instead of on the entire input. The class of functions space-constructible by such machines is investigated, and it is shown that every function f that is space-constructible by a deterministic two-way Turing machine is also space-constructible by a strongly f-space-bounded deterministic one-way Turing machine. We prove that the restricted mode coincides with the strong mode for space-constructible functions. The known infinite, dense, and strict hierarchy of strong space complexity classes is derived for the weak mode as well, by Kolmogorov complexity arguments. Finally, closure properties under AFL operations, Boolean operations, and reversal are shown.
14
Verification of Scenarios in Petri Nets Using Compact Tokenflows
88%
EN
In this paper we tackle the problem of verifying whether a scenario is executable in a Petri net. In contrast to sequentially ordered runs, a scenario includes both information about dependencies and independencies of events. Consequently, a scenario allows a precise and intuitive specification of a run of a concurrent or distributed system. In this paper we consider Petri nets with arc weights, namely marked place/transition-nets (p/t-nets) and p/t-nets with inhibitor arcs (pti-nets). A scenario of a p/t-net is a labelled partial order (lpo). A scenario of a pti-net is a labelled stratified order structure (lso). Accordingly, the question is either whether a given lpo is in the language of a given p/t-net or whether an lso is in the language of a given pti-net. Different approaches exist to define the partial language of a Petri net. Each definition yields a different verification algorithm, but existing algorithms perform quite poorly in terms of runtime for most examples. We introduce a new compact characterization of the partial language of a Petri net. This characterization is optimized with respect to the verification problem. The paper is a revised and extended version of the conference paper [10].
15
Semantic Sparse Representation of Disease Patterns
88%
EN
Sparse data representation is discussed in the context of the fundamentals underlying semantic content description and information extraction. Disease patterns, as semantic information extracted from medical images, are emphasized because of the discussed application to computer-aided diagnosis. Compressive sensing rules were adjusted to the requirements of diagnostic pattern recognition. The proposed methodology of sparse disease-pattern representation considers the accuracy of the sparse representation in estimating the target content for detailed analysis. The semantics of the sparse representation were modeled by morphological content analysis; subtle or hidden components were extracted and displayed to increase the completeness of the information. The usefulness of sparsity was verified for computer-aided diagnosis of stroke based on brain CT scans, with the implemented method relying on a selective and sparse representation of subtle hypodensity to improve diagnosis. The visual expression of disease signatures was tailored to radiologists' requirements, domain knowledge, and experimental analysis issues. The suitability of the diagnosis assistance was confirmed by experimental subjective rating and automatic recognition.
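The core operation behind a compressive-sensing / sparse-representation pipeline of this kind is the recovery of a few dictionary coefficients for a given signal. Below is a minimal sketch with a random dictionary and Orthogonal Matching Pursuit from scikit-learn; the data are synthetic stand-ins, not CT image patches, and the dictionary is not the one used in the paper.

```python
# Sparse representation of a signal in an overcomplete random dictionary
# using Orthogonal Matching Pursuit (synthetic stand-in for image patches).
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)
n_features, n_atoms, sparsity = 64, 256, 5

D = rng.normal(size=(n_features, n_atoms))   # overcomplete dictionary
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
true_coef = np.zeros(n_atoms)
true_coef[rng.choice(n_atoms, sparsity, replace=False)] = rng.normal(size=sparsity)
signal = D @ true_coef                       # exactly 5-sparse signal

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity).fit(D, signal)
print("non-zero atoms recovered:", np.flatnonzero(omp.coef_))
print("non-zero atoms true:     ", np.flatnonzero(true_coef))
```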
17
Binary Analysis based on Symbolic Execution and Reversible x86 Instructions
88%
EN
We present a binary analysis framework based on symbolic execution with the distinguishing capability to execute stepwise forward and also backward through the execution tree. It was developed internally at Bitdefender and code-named RIVER. The framework provides components such as a taint engine, a dynamic symbolic execution engine, and integration with Z3 for constraint solving. In this paper we will provide details on the framework and give an example of analysis on binary code.
18
75%
EN
This paper describes a model which allows estimation of a readability factor for texts written in natural language or for programs written in the syntax of programming languages. Only font styles are considered in this model. The purpose of the model is to improve readability, which can be achieved by changing the font style. Several samples of text written in natural language were used to estimate the readability factor; these factors were then increased or reduced through intentional changes of font style. The studies have shown that deliberately changing the font style has a visible effect, either improving readability or significantly lowering it.
19
75%
EN
This paper describes a method which allows estimation of information entropy in the sense of Shannon. The method is suitable for estimating which of two samples has the higher information entropy. Several algorithms were used to estimate the entropy, selected on the assumption that they compute it faster. Each algorithm calculated this value for several text samples, and the analysis then verified which comparisons between pairs of samples were correct. It was found that the probabilistic algorithm is the fastest and the most effective at returning the estimated value of the entropy.
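As a concrete reference point for the comparisons described above, the empirical (plug-in) Shannon entropy of a text sample can be computed directly from character frequencies. The sketch below uses this straightforward estimator rather than the faster probabilistic algorithm favoured by the paper.

```python
# Plug-in estimate of Shannon entropy (bits per character) of a text sample.
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sample_a = "abababababababab"
sample_b = "the quick brown fox jumps over the lazy dog"
print(shannon_entropy(sample_a))   # low entropy: only two symbols
print(shannon_entropy(sample_b))   # higher entropy: richer alphabet
```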
PL
The article presents an attempt to assess the information content of the results of adjusting measurement data collected on intersecting legs of a ship's course, on which the ship's positions were determined with high accuracy. Based on the selected adjustment method and the definition of the amount of information, the information content of the adjusted result is assessed.
EN
This article presents a first attempt at estimating the information content of the adjustment of errors in crossing observations. Based on the adjustment method and the definitions of information theory, an estimate of the information content of the adjusted result is given.