Search results
Searched in keywords: Shannon's entropy
Results found: 5
EN
Phonocardiogram (PCG) recordings contain valuable information about the functioning and state of the heart that is useful in the diagnosis of cardiovascular diseases. The first heart sound (S1) and the second heart sound (S2), produced by the closing of the atrioventricular valves and the closing of the semilunar valves, respectively, are the fundamental sounds of the heart. The similarity in morphology and duration of these heart sounds, and their superposition in the frequency domain, make it difficult to use them in computer systems for automatic diagnosis. Therefore, in this paper we analyzed these heart sounds in the domain of intrinsic mode functions (IMF) obtained from two time-frequency decomposition techniques, empirical mode decomposition (EMD) and the improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN), with the aim of retrieving useful information on an expanded basis. Decomposing PCG recordings into IMFs represents the fundamental heart sounds as several oscillating components, thus increasing the observability of the system. Moreover, the time-frequency representation of PCG recordings could provide valuable information for automatically detecting heart sounds and diagnosing pathologies from characteristic patterns of these heart sounds in the IMFs. The analysis was performed using the variance and Shannon's entropy of the heart sounds, observed in time windows across the different IMFs. In addition, we determined the frequency ranges of the IMFs obtained from the decomposition of the PCG recordings with both techniques. Because the frequency contents of S1 and S2 are different yet overlap, and the durations of these sounds also differ, these heart sounds were represented in different IMFs with different variances and entropies under both techniques; however, ICEEMDAN offered a more consistent decomposition of S1 and S2 (they were concentrated in IMFs 4-6).
The decomposition of PCG signals into IMFs has allowed us to identify the frequency components of the IMFs in which these sounds are found.
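The window length, overlap and histogram bin count below are illustrative assumptions, and the IMFs themselves would come from an EMD/ICEEMDAN implementation (e.g. the PyEMD package), which is not shown; this is a minimal numpy sketch of the windowed variance-and-entropy analysis the abstract describes:

```python
import numpy as np

def shannon_entropy(window, bins=32):
    """Histogram-based Shannon entropy (in bits) of one signal window."""
    hist, _ = np.histogram(window, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins: 0*log(0) := 0
    return float(-np.sum(p * np.log2(p)))

def windowed_features(imf, win=256, step=128):
    """Variance and Shannon entropy of one IMF in overlapping time windows."""
    feats = []
    for start in range(0, len(imf) - win + 1, step):
        w = imf[start:start + win]
        feats.append((float(np.var(w)), shannon_entropy(w)))
    return feats
```

Computed per IMF, these (variance, entropy) pairs are the kind of features that distinguish the windows containing S1 and S2 from the rest of the recording.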
EN
Brain tumor is one of the most severe diseases affecting humans and is usually diagnosed with medical imaging procedures. Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) are the regularly used non-invasive methods for acquiring images of brain abnormalities for medical study. Due to its importance, a significant number of image assessment and decision-making procedures exist in the literature. This article proposes a two-stage image assessment tool to examine brain MR images acquired using the Flair and DW modalities. A combination of Social Group Optimization (SGO) and Shannon's entropy (SE) supported multi-thresholding is implemented to pre-process the input images. The image post-processing includes several procedures, such as Active Contour (AC), Watershed and region-growing segmentation, to extract the tumor section. Finally, an ANFIS-based classifier is implemented to categorize the tumor under analysis as benign or malignant. Experimental investigation was executed on benchmark datasets such as ISLES and BRATS, as well as clinical MR images obtained with the Flair/DW modality. The outcome of this study confirms that AC offers enhanced results compared with the other segmentation methods.
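The abstract does not detail the SGO/SE formulation; as a simplified sketch, bi-level Shannon (Kapur-style) entropy thresholding can be written in numpy with an exhaustive search standing in for the SGO optimizer, which in the paper searches the same kind of objective over multiple thresholds:

```python
import numpy as np

def kapur_threshold(image, levels=256):
    """Bi-level Shannon/Kapur entropy thresholding by exhaustive search.
    The chosen threshold maximizes the sum of the Shannon entropies of the
    two gray-level classes it induces (a stand-in for the SGO search)."""
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()

    def class_entropy(probs):
        # Shannon entropy of the class's renormalized gray-level distribution
        total = probs.sum()
        if total <= 0:
            return 0.0
        q = probs[probs > 0] / total
        return float(-np.sum(q * np.log(q)))

    scores = [class_entropy(p[:t + 1]) + class_entropy(p[t + 1:])
              for t in range(levels - 1)]
    return int(np.argmax(scores))
```

On a bimodal image, the maximizing threshold falls between the two modes, separating background from the brighter (e.g. tumor) region.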
EN
This paper describes a method that allows estimation of information entropy in Shannon's sense. The method is suited to estimating which of two samples has the higher information entropy. Several algorithms were used to estimate entropy, on the assumption that estimation is faster than exact computation. Each algorithm calculated this value for several text samples, and the analysis then verified which pairwise comparisons of samples were correct. It was found that the probabilistic algorithm is the fastest and most effective at returning the estimated value of entropy.
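The paper's probabilistic estimator is not specified in the abstract; the plug-in (frequency-based) character entropy below is a minimal baseline sketch of the comparison task, i.e. deciding which of two text samples has the higher entropy:

```python
from collections import Counter
from math import log2

def text_entropy(text):
    """Character-level Shannon entropy (bits/char) via the plug-in
    estimate: relative character frequencies as probabilities."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * log2(c / n) for c in counts.values())

def higher_entropy(a, b):
    """Return the sample estimated to carry more information per character."""
    return a if text_entropy(a) > text_entropy(b) else b
```

A faster probabilistic algorithm, as in the paper, would only need its estimates to be accurate enough to preserve the ordering of the two samples.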
Złożoność danych pomiarowych i metody jej określania (Complexity of measurement data and methods for determining it)
EN
In this paper the problem of assessing measurement data complexity is presented, together with a review of a certain class of algorithms for complexity calculation. The Kolmogorov complexity, Shannon entropy and sample entropy are discussed. The aim of this work was to analyse the chosen algorithms as applied to assessing the complexity of signals generated by an increasingly complex system. The investigations were performed as simulation experiments. Their results show that the measure of complexity of data generated by the system under investigation can be related to the complexity of both the system's structure and its internal processes. This is a preliminary step towards a method for monitoring changes in complex systems by assessing the complexity of the signals they generate. Such systems include, e.g., the respiratory system monitored during sleep apnoea.
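Of the three measures discussed, sample entropy is the least standard; a minimal numpy sketch, with the commonly used parameter choices m = 2 and r = 0.2·σ as assumptions, is:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts pairs of
    length-m templates within Chebyshev distance r (scaled by the signal's
    standard deviation) and A does the same for length m+1.
    Lower values indicate a more regular signal."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(length):
        # Use n - m templates for both lengths so the counts are comparable
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf
```

A periodic signal yields a low sample entropy, while white noise yields a high one, which is exactly the regularity-versus-complexity distinction the experiments above rely on.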
Entropy and Gibbs distribution in image processing: an historical perspective
EN
This paper presents an historical overview of entropy and its applications to the solution of inferential statistical problems in image processing. The survey covers some of the more important entropy-based research approaches. A brief introduction to the mathematical details and foundations of the basic concepts of Markov Random Fields (MRF) and the related Gibbs sampling is also given. The information entropy is a mathematical measure of information or uncertainty derived from a probabilistic model. Starting from the seminal works of C. Shannon, of E. T. Jaynes, and of S. Geman and D. Geman, the paper discusses results obtained using different related techniques in image restoration, analysis and synthesis of textures, and saliency map construction. The paper moreover gives useful suggestions about trends in future research.
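As a concrete illustration of Gibbs sampling on an MRF (a generic textbook example, not taken from the paper), a minimal sampler for a binary Ising-type field with 4-neighbourhoods is:

```python
import numpy as np

def gibbs_ising(shape=(16, 16), beta=1.0, sweeps=50, rng=None):
    """Gibbs sampling of a binary (Ising) MRF: each site is resampled from
    its conditional distribution given its 4-neighbourhood, with coupling
    strength beta and free boundary conditions."""
    rng = np.random.default_rng(rng)
    s = rng.choice([-1, 1], size=shape)
    H, W = shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                # sum of the 4 neighbours (missing neighbours contribute 0)
                nb = ((s[i - 1, j] if i > 0 else 0)
                      + (s[i + 1, j] if i < H - 1 else 0)
                      + (s[i, j - 1] if j > 0 else 0)
                      + (s[i, j + 1] if j < W - 1 else 0))
                # P(s_ij = +1 | neighbours) from the Gibbs distribution
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                s[i, j] = 1 if rng.random() < p_up else -1
    return s
```

In the image-restoration setting surveyed by the paper, the same scheme is run with an additional data term in each conditional, so the sampler explores the posterior over the clean image rather than the prior alone.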