Search results
Searched in keywords: information theory
Results found: 42
1
Information in the decision-making process
EN
The thesis that the variability of conditions in which organizations operate has an impact on the decisions made about their functioning is the starting point for achieving the research objective and presenting the research findings. The objective of the research was to demonstrate the importance of the information system in the organization and to visualize the impact of information on the decision-making process, also in terms of shaping the future. The analysis of a number of theoretical studies has led to conclusions that unequivocally prove that every decision should be based on reliable information. This is strongly emphasized in the article. The main methods used were literature analysis, synthesis, mathematical modeling and desk research. In addition, the paper presents tools for describing the uncertainty associated with the occurrence of a random phenomenon, i.e., for assessing the amount of information conveyed by the observation of a random phenomenon and for comparing two variables with different information potential. The analyses conducted clearly indicate that decision making and information must be closely linked. The results of the research inquiries in the publication also indicate the role to be played by the information system in the organization. It contains the statement that no decision can be made without information. The approach adopted in the article allowed for the presentation of essential research findings, while providing a basis for further, extended research on this extremely important and topical issue, especially in the context of the internationalization of a number of phenomena and processes.
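The tools the abstract mentions, assessing the amount of information conveyed by observing a random phenomenon and comparing two variables with different information potential, can be illustrated with Shannon entropy. This is a minimal sketch of the general idea, not the authors' actual model:

```python
import math

def shannon_entropy(probs):
    """Entropy H(X) = -sum(p * log2(p)) in bits, for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin conveys more information per observation than a biased one,
# so it has the higher "information potential" of the two variables.
fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # about 0.469 bits
print(fair > biased)
```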
PL
The thesis that the variability of the conditions in which organizations operate does not leave the decisions concerning their functioning unaffected is the starting point for achieving the stated objective and presenting the research results. The aim of the research was to demonstrate the importance of the information system in an organization and to illustrate the impact of information on the decision-making process, also with regard to shaping the future. The analysis of numerous theoretical studies, including those containing results of conducted research, led to many conclusions. These unequivocally prove that reliable information should be the basis of every decision. This is strongly emphasized in the article. The need was also shown to undertake actions aimed at acquiring indispensable information reasonably quickly, in order to create a basis for decisions favourable to companies operating under changing conditions. High information competences of managers are also indispensable. The analyses conducted clearly indicate that decision making and information must be closely linked. The results of the scientific inquiries contained in the publication also point to the role the information system is to play in an organization; although it consists of three stages, they are logically connected, and these phases are characterized by scientific pragmatism. The article states that without information there is no decision. The approach adopted in the article made it possible to obtain a cross-sectional presentation of the research results, while providing a basis for further, extended research on this extremely important and topical issue, especially in the context of the internationalization of many phenomena and processes.
EN
This paper describes a model which allows estimating a readability factor for texts written in natural language or for programs coded in the syntax of programming languages. Only font styles are considered in this model. The purpose of the model is to improve readability, which can be achieved through changes of font style. Several samples of text written in natural language were used to estimate the readability factor. These factors were then increased or reduced for the given texts through intentional changes of font style. The studies have shown that deliberately changing the font style has a visible effect, either improving readability or significantly lowering it.
EN
This paper describes a method which allows an estimation of information entropy in the sense of Shannon. The method is suitable for estimating which of two samples has the higher value of information entropy. Several algorithms have been used to estimate entropy, selected on the assumption that they do so quickly. Each algorithm has calculated this value for several text samples. An analysis then verified which comparisons of the two samples were correct. It has been found that the probabilistic algorithm is the fastest and most effective in returning the estimated value of entropy.
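A plug-in (maximum-likelihood) character-frequency estimator is one simple way to rank two text samples by entropy in Shannon's sense; the paper does not specify its algorithms, so this is only an illustrative baseline:

```python
from collections import Counter
import math

def empirical_entropy(text):
    """Plug-in estimate of per-character entropy in bits."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Comparing two samples: the sample with more varied symbols scores higher.
low = empirical_entropy("aaaaabbbbb")   # 2 equiprobable symbols -> 1 bit
high = empirical_entropy("abcdefghij")  # 10 distinct symbols -> log2(10) bits
print(high > low)
```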
EN
Counting in natural language presupposes that we can successfully identify what counts as one, which, as we argue, relies on how and whether one can balance two pressures on learning nominal predicates, which we formalise in probabilistic and information theoretic terms: individuation (establishing a schema for judging what counts as one with respect to a predicate); and reliability (establishing a reliable criterion for applying a predicate). This hypothesis has two main consequences. First, the mass/count distinction in natural language is a complex phenomenon that is partly grounded in a theory of individuation, which we contend must integrate particular qualitative properties of entities, among which a key role is played by those that rely on our spatial perception. Second, it allows us to predict when we can expect the puzzling variation in mass/count lexicalization, cross- and intralinguistically: namely, exactly when the two learning pressures of individuation and reliability conflict.
5
Binary Analysis based on Symbolic Execution and Reversible x86 Instructions
EN
We present a binary analysis framework based on symbolic execution with the distinguishing capability to execute stepwise forward and also backward through the execution tree. It was developed internally at Bitdefender and code-named RIVER. The framework provides components such as a taint engine, a dynamic symbolic execution engine, and integration with Z3 for constraint solving. In this paper we will provide details on the framework and give an example of analysis on binary code.
6
Generalizations of Rough Set Tools Inspired by Graph Theory
EN
We introduce and study new generalizations of some rough set tools, namely the extended core, the generalized discernibility function, the discernibility space and the maximum partitioner. All these concepts were first introduced during the application of rough set theory to graphs; here we show that they also have an interesting and useful interpretation in the general setting. Indeed, among other results, we prove that reducts can be computed in incremental polynomial time, we give conditions under which a partition coincides with an indiscernibility partition of a given information table, and we give conditions under which a discernibility matrix corresponds to an information table.
7
Verification of Scenarios in Petri Nets Using Compact Tokenflows
EN
In this paper we tackle the problem of verifying whether a scenario is executable in a Petri net. In contrast to sequentially ordered runs, a scenario includes both information about dependencies and independencies of events. Consequently, a scenario allows a precise and intuitive specification of a run of a concurrent or distributed system. In this paper we consider Petri nets with arc weights, namely marked place/transition-nets (p/t-nets) and p/t-nets with inhibitor arcs (pti-nets). A scenario of a p/t-net is a labelled partial order (lpo). A scenario of a pti-net is a labelled stratified order structure (lso). Accordingly, the question is either whether a given lpo is in the language of a given p/t-net or whether an lso is in the language of a given pti-net. Different approaches exist to define the partial language of a Petri net. Each definition yields a different verification algorithm, but existing algorithms perform quite poorly in terms of runtime for most examples. We introduce a new compact characterization of the partial language of a Petri net. This characterization is optimized with respect to the verification problem. The paper is a revised and extended version of the conference paper [10].
8
Maximizing T-complexity
EN
We investigate Mark Titchener’s T-complexity, an algorithm which measures the information content of finite strings. After introducing the T-complexity algorithm, we turn our attention to a particular class of “simple” finite strings. By exploiting special properties of simple strings, we obtain a fast algorithm to compute the maximum T-complexity among strings of a given length, and our estimates of these maxima show that T-complexity differs asymptotically from Kolmogorov complexity. Finally, we examine how closely de Bruijn sequences resemble strings with high T-complexity.
9
Deterministic One-Way Turing Machines with Sublinear Space
EN
Deterministic one-way Turing machines with sublinear space bounds are systematically studied. We distinguish among the notions of strong, weak, and restricted space bounds. The latter is motivated by the study of P automata. The space available on the work tape depends on the number of input symbols read so far, instead of the entire input. The class of functions space constructible by such machines is investigated, and it is shown that every function f that is space constructible by a deterministic two-way Turing machine, is space constructible by a strongly f space-bounded deterministic one-way Turing machine as well. We prove that the restricted mode coincides with the strong mode for space constructible functions. The known infinite, dense, and strict hierarchy of strong space complexity classes is derived also for the weak mode by Kolmogorov complexity arguments. Finally, closure properties under AFL operations, Boolean operations and reversal are shown.
EN
Discretization is one of the most important parts of decision table preprocessing. Transforming continuous values of attributes into discrete intervals influences further analysis using data mining methods. In particular, the accuracy of generated predictions is highly dependent on the quality of discretization. The paper contains a description of three new heuristic algorithms for discretization of numeric data, based on Boolean reasoning. Additionally, an entropy-based evaluation of discretization is introduced to compare the results of the proposed algorithms with the results of leading university software for data analysis. Considering the discretization as a data compression method, the average compression ratio achieved for databases examined in the paper is 8.02 while maintaining the consistency of databases at 100%.
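As a rough illustration of entropy-based evaluation of discretization, here is a generic equal-width binning, not the Boolean-reasoning heuristics proposed in the paper; lower entropy of the interval labels corresponds to stronger compression of the attribute:

```python
from collections import Counter
import math

def equal_width_bins(values, k):
    """Discretize numeric values into k equal-width intervals (a baseline method,
    not the paper's Boolean-reasoning algorithms)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / k or 1.0
    return [min(int((v - lo) / width), k - 1) for v in values]

def entropy(labels):
    counts = Counter(labels)
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

vals = [0.1, 0.4, 0.35, 2.0, 2.1, 1.9, 4.0, 4.2]
bins = equal_width_bins(vals, 3)
print(bins)           # interval index assigned to each value
print(entropy(bins))  # entropy of the discretized attribute, in bits
```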
11
Some lower bounds on the Shannon capacity
EN
In the paper we present a measure of a discrete noisy channel, called the Shannon capacity, which is described in the language of graph theory. Unfortunately, the Shannon capacity C0 is difficult to calculate, so we try to estimate the value of C0 for specific classes of graphs, e.g. circular graphs.
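A standard lower bound of the kind the abstract refers to is that the Shannon capacity of a graph is at least its independence number α(G), since an independent set gives symbols that can never be confused; for the circular graph C5 this yields a bound of 2, while the true capacity (by Lovász) is √5. A brute-force sketch for small graphs, with the graph encoding invented here for illustration:

```python
from itertools import combinations

def independence_number(n, edges):
    """Brute-force the size of a largest independent set of a small n-vertex graph."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(n, 0, -1):
        for subset in combinations(range(n), size):
            # Independent: no pair of chosen vertices is an edge.
            if all(frozenset(p) not in edge_set for p in combinations(subset, 2)):
                return size
    return 0

# The pentagon C5: alpha(C5) = 2, so its Shannon capacity is at least 2.
c5 = [(i, (i + 1) % 5) for i in range(5)]
print(independence_number(5, c5))
```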
EN
The paper presents the project of a multimedia system providing wide and fast information on transport routes and their current conditions. The system has a modular structure and is composed of four modules, which constitute the subject of the presented discussion and conclusions. A considerable part of the article is devoted to the method of applying switching circuit properties to the creation of an algorithm for selecting a transport route according to a defined criterion, leading to the automation of a continuous information process.
13
A Secure Non-monotonic Soft Concurrent Constraint Language
EN
We present a fine-grained security model to enforce access control on the shared constraint store in Concurrent Constraint Programming (CCP) languages. We show the model for a non-monotonic version of Soft CCP (SCCP), which is an extension of CCP where the constraints have a preference level associated with them. Crisp constraints can be modeled in the same framework as well. In the considered non-monotonic soft version (NmSCCP), it is also possible to remove constraints from the store. The language can be used for coordinating agents on a common store of information that represents the set of shared resources. In such scenarios, it is clearly important to enforce integrity and confidentiality rights on the resources, in order, for instance, to hide part of the information from some agents, or to prevent an agent from consuming too many resources. Finally, we present a bisimulation relation to check equivalence between two programs written in this language.
14
EN
In this paper we tackle the problem of the reconstruction of object-based images, specifically images formed by a set of circles inside a ring. By analyzing the projections of the image, we are able to determine coordinates corresponding to interest points that give significant information about features of the image and aid in the reconstruction. Our approach yields promising results in comparison to other methods in the literature. Finally, we discuss how a similar approach could be extended to more complex problems deriving from tomographic applications, in order to develop an efficient method exploiting the prior knowledge assumed on an image.
15
Holistic Entropy Reduction for Collaborative Filtering
EN
We propose a collaborative filtering (CF) method that uses behavioral data provided as propositions having the RDF-compliant form of (user X, likes, item Y ) triples. The method involves the application of a novel self-configuration technique for the generation of vector-space representations optimized from the information-theoretic perspective. The method, referred to as Holistic Probabilistic Modus Ponendo Ponens (HPMPP), enables reasoning about the likelihood of unknown facts. The proposed vector-space graph representation model is based on the probabilistic apparatus of quantum Information Retrieval and on the compatibility of all operators representing subjects, predicates, objects and facts. The dual graph-vector representation of the available propositional data enables the entropy-reducing transformation and supports the compositionality of mutually compatible representations. As shown in the experiments presented in the paper, the compositionality of the vector-space representations allows an HPMPP-based recommendation system to identify which of the unknown facts having the triple form (user X, likes, item Y ) are the most likely to be true in a way that is both effective and, in contrast to methods proposed so far, fully automatic.
EN
The problem considered in this article involves the construction of an evaluation model, which could subsequently be used in the field of modeling and risk management. The research work is finalized by the construction of a new model on the basis of observations of the models used for risk management and knowledge of information theory, machine learning and artificial neural networks. The developed tools are trained online, using their ability to automatically deduce rules from data, during application of the model to evaluation tasks. The model consequently changes the data analysis stage, limits the scope of the necessary expertise in the area where the assessment model can be used and, to some extent, makes the shape of the model independent from the current range of available data. These features increase its ability to generalize and to cope with data of previously undefined classes, as well as improve its resistance to gaps occurring in the data. The performance of the model presented in this paper is tested and verified on the basis of real-life data resembling a potentially real practical application. Preliminary tests performed within the scope of this work indicate that the developed model can form a starting point for further research, as some of the mechanisms used have fairly high efficiency and flexibility.
PL
The article presents an attempt to assess the information content of the results of adjusting measurement data on intersecting ship tracks, on which the ship's positions were determined with high accuracy. Based on a selected adjustment method and the definition of the amount of information, the information content of the adjusted result was assessed.
EN
This article presents a first attempt at estimating the information content of the adjustment of errors in crossing observations. Based on the adjustment methods and the definitions of information theory, an estimate of the information content of the adjusted result is given.
PL
The article consists of two parts. The first part reviews the methods of ensuring data confidentiality and integrity used in wireless networks based on the IEEE 802.11 family of standards. The weaknesses of the WEP and WPA protocols are discussed, and the WPA2 protocol based on the IEEE 802.11i standard is presented. The second part of the article presents findings of information theory aimed at eliminating the possibility of information being intercepted by an intruder over the radio channel, focusing on LDPC (Low Density Parity-Check) codes. The limitations of the applicability of these codes to transmission with optical visibility between antennas (Line-of-Sight) are also discussed.
EN
The following article is composed of two parts. The first part comprises an overview of methods providing data confidentiality and integrity in wireless LANs based on the IEEE 802.11 family of standards. The weaknesses of the WEP and WPA protocols are discussed, and the WPA2 protocol based on IEEE 802.11i is presented. The second part of the article presents the newest discoveries in information theory which seem to eliminate the possibility of capturing messages sent via radio waves. LDPC codes are focused on in detail. Additionally, the limited applicability of these codes to Line-of-Sight transmission is discussed.
19
Rough Net Structures: Example of Information System
EN
An information system of net structures based on their calculus (a distributive lattice) is introduced and, in this context, basic notions of rough set theory are reformulated and exemplified.
EN
Pawlak's flowgraph has been applied as a suitable data structure for the description and analysis of human behaviour in an area supervised by a multi-camera video surveillance system. Information contained in the flowgraph can easily be used to predict consecutive movements of a particular object. Moreover, utilization of the flowgraph can support reconstructing an object's route from past video images. However, such a flowgraph, with its accumulative nature, needs a certain period of time to adapt to changes in the behaviour of objects, which can be caused, e.g., by closing a door or placing another obstacle forcing people to pass it by. In this paper a method for reducing the time needed for flowgraph adaptation is presented. Additionally, a distance measure between flowgraphs is introduced in order to determine whether carrying out the adaptation process is needed.
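The prediction step this abstract describes can be sketched as a toy flowgraph of accumulated transition counts between surveillance zones; the zone names and observed route are invented for illustration, and Pawlak's full flowgraph calculus (certainty, coverage and strength coefficients) is not reproduced here:

```python
from collections import defaultdict

# Accumulated transition counts between camera zones, built from an observed route.
counts = defaultdict(lambda: defaultdict(int))
route = ["lobby", "hall", "exit", "lobby", "hall", "office", "lobby", "hall", "exit"]
for a, b in zip(route, route[1:]):
    counts[a][b] += 1

def predict_next(zone):
    """Most probable consecutive movement: the outgoing branch with the
    highest accumulated count (i.e. highest empirical certainty)."""
    outgoing = counts[zone]
    return max(outgoing, key=outgoing.get)

print(predict_next("hall"))  # 'exit' was observed more often than 'office'
```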