Results found: 4

Search results
Searched for:
in keywords: biomedical cybernetics
1
EN
In general, a theoretical Computerized Tomography (CT) imaging problem can be formulated as a system of linear equations. The discrete inverse problem of reconstructing finite subsets of the n-dimensional integer lattice Z^n that are accessible only via their line sums (discrete X-rays), in a finite set of lattice directions, results in an even more ill-posed problem when the data are noisy. Because of background noise in the data, the reconstruction process is more difficult, since the system of equations easily becomes inconsistent. Unfortunately, with every kind of CT, as with many contemporary advanced instrumentation systems, one is always faced with an additional problem of experimental data noise reduction. By using an Information Geometry (IG) and Geometric Science of Information (GSI) approach, it is possible to extend traditional statistical noise-reduction concepts and to develop new algorithms that overcome many previous limitations. On the other hand, over the past five decades the trend in Systems Theory, in this specialized research area, has shifted from the classic single-domain information channel transfer function approach (Shannon's noisy channel) to the more structured ODR Functional Sub-domain Transfer Function Approach (Observation, Description and Representation), according to the computational information conservation theory (CICT) Infocentric Worldview model (theoretically, virtually noise-free data). CICT succeeds in bringing classical and quantum information theory together in a single framework, by considering information not only on the statistical manifold of model states but also from empirical measures. In fact, to grasp a more reliable representation of experimental reality and to obtain stronger physical and biological system correlates, researchers and scientists need two intelligently articulated hands: stochastic and combinatorial approaches synergically articulated by natural coupling.
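The ill-posedness the abstract describes can be seen in miniature: two different binary images can have identical line sums, so the linear system Ax = b determined by the projections cannot have a unique solution. The example below is ours, for illustration only, and is not taken from the reviewed paper.

```python
# Minimal sketch of discrete-tomography ill-posedness: two distinct
# binary 2x2 images share the same row and column sums (line sums),
# so the projection system Ax = b is underdetermined.
# (Illustrative example; not from the reviewed paper.)

def line_sums(img):
    """Row sums followed by column sums of a 2-D list of 0/1 pixels."""
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows + cols

a = [[1, 0],
     [0, 1]]
b = [[0, 1],
     [1, 0]]

print(line_sums(a))  # [1, 1, 1, 1]
print(line_sums(b))  # [1, 1, 1, 1] -- same projections, different image
```

The two images differ by a so-called switching component; with noisy data the situation only worsens, since then even an inconsistent system may have to be solved approximately.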
As a matter of fact, the properties of the traditional rational number system Q allow an irreducible co-domain to be generated for every computational operative domain used. Then the computational information usually lost by using only the classic LTR computational approach, based on the traditional noise-affected-data stochastic representation (with a high-level perturbation computational model under either an additive or a multiplicative perturbation hypothesis), can be captured and fully recovered to arbitrary precision, step by step, by a corresponding complementary co-domain. In a previous paper, we already saw that CICT can supply us with Optimized Exponential Cyclic numeric Sequences (OECS) co-domains perfectly tuned to low-level multiplicative noise source generators, related to the experimental high-level overall perturbation. Now, the polynomially structured information of the associated OECS co-domain can be used to evaluate any computed result at arbitrary scale, and to compensate so as to achieve multi-scale computational information conservation.
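The cyclic numeric sequences that CICT builds on have a classic elementary instance in Q arithmetic: the repetend of 1/7 is the cyclic number 142857, and the expansions of k/7 for k = 1..6 are exactly its rotations. The sketch below shows only this underlying fact; the paper's OECS construction is far more general, and `repetend` is our own helper name.

```python
# Hedged sketch of the cyclic sequences the abstract alludes to:
# the decimal repetend of 1/7 is cyclic, and every k/7 (k = 1..6)
# is a rotation of it. This illustrates the Q-arithmetic fact only,
# not the paper's full OECS construction.

def repetend(num, den, digits):
    """First `digits` decimal digits of num/den, by LTR long division."""
    out = []
    r = num % den
    for _ in range(digits):
        r *= 10
        out.append(r // den)
        r %= den
    return ''.join(map(str, out))

base = repetend(1, 7, 6)                              # '142857'
rotations = {base[i:] + base[:i] for i in range(6)}
multiples = {repetend(k, 7, 6) for k in range(1, 7)}
print(base, rotations == multiples)                   # 142857 True
```

Because the whole infinite expansion is generated by the pair (1, 7), the sequence can be regenerated to arbitrary precision from a fixed, finite description.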
2
How Random is Your Tomographic Noise? A Number Theoretic Transform (NTT) Approach
EN
Discrete Tomography (DT), unlike GT and CT, focuses on the case where only a few specimen projections are known and the images contain a small number of different colours (e.g. black and white). A concise review of the main contemporary physical and mathematical CT system problems is offered. Stochastic vs. combinatorially optimized noise generation is compared and presented through two visual examples, to emphasise a major double-bind problem at the core of the most advanced contemporary instrumentation systems. Automatically tailoring denoising procedures to real dynamic system characteristics and performance can get closer to an ideal self-registering and self-linearizing system that generates a virtually uniform and robust probing field during its whole designed service life-cycle. The first attempt to develop basic principles for automatic characterization, profiling and identification of system background low-level noise sources by CICT, from discrete system parameters, is presented. As a matter of fact, CICT can supply us with cyclic numeric sequences perfectly tuned to their low-level multiplicative source generators, related to the experimental high-level overall perturbation (according to the high-level classic perturbation computational model under either an additive or a multiplicative perturbation hypothesis). Numeric examples are presented. Furthermore, a practical NTT example is given. Specifically, advanced CT systems, HRO and Mission Critical Project (MCP) systems for very low Technological Risk (TR), and Crisis Management (CM) systems will benefit greatly from the CICT infocentric worldview. The presented framework, concepts and techniques can be used quite conveniently to boost the development of next-generation algorithms and advanced applications.
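For readers unfamiliar with the transform in the title: a Number Theoretic Transform is a DFT taken over the integers mod a prime, so every step is exact integer arithmetic with no floating-point rounding noise. The sketch below is a minimal length-4 NTT; the modulus p = 17 and the primitive 4th root of unity omega = 4 are our illustrative choices, not parameters from the paper.

```python
# Minimal Number Theoretic Transform (NTT): a DFT over Z mod p, so all
# arithmetic is exact. p = 17 and omega = 4 (a primitive 4th root of
# unity mod 17, since 4**2 = -1 and 4**4 = 1 mod 17) are chosen purely
# for illustration; the paper's CT application is not reproduced here.

P, OMEGA, N = 17, 4, 4

def ntt(a, omega):
    """Naive O(N^2) transform sum_j a[j] * omega**(i*j) mod P."""
    return [sum(a[j] * pow(omega, i * j, P) for j in range(N)) % P
            for i in range(N)]

def intt(a):
    """Inverse transform: use omega^-1 and scale by N^-1 mod P."""
    inv_n = pow(N, P - 2, P)       # modular inverse of N (Fermat)
    inv_w = pow(OMEGA, P - 2, P)   # modular inverse of omega
    return [(x * inv_n) % P for x in ntt(a, inv_w)]

x = [3, 1, 4, 1]
assert intt(ntt(x, OMEGA)) == x    # exact round-trip, no rounding error
print(ntt(x, OMEGA))
```

The exactness of the round-trip is the point: unlike a floating-point FFT, an NTT introduces no numerical noise of its own, which is what makes it a clean probe when asking how random the measured noise actually is.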
3
Discrete Tomography Data Footprint Reduction via Natural Compression
EN
In Discrete Tomography (DT) by electron microscopy, 2-D projection images are acquired from various angles by tilting the sample, generating new challenges associated with the formation, acquisition, compression, transmission, and analysis of enormous quantities of data. Data Footprint Reduction (DFR) is the process of employing one or more techniques to store a given set of data in less storage space. Modern lossless compressors use classical probabilistic models only, and are unable to match high-end application requirements like "Arbitrary Bit Depth" (ABD) resolution and information "Dynamic Upscale Regeneration" (DUR). Traditional Q arithmetic can be regarded as a highly sophisticated open logic: a powerful and flexible bidirectional (LTR and RTL) formal language of languages, according to the brand-new "Information Conservation Theory" (ICT). This new awareness can offer a competitive approach to guide the development and application of more convenient algorithms for combinatorial lossless compression, which we named "Natural Compression" (NC). To check practical implementation performance, a first raw example is presented, benchmarked against the standard, more sophisticated lossless JPEG2000 algorithm, and critically discussed. NC's raw overall lossless compression performance compares quite well to the standard one, while offering true ABD and DUR at no extra computational cost.
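The ABD/DUR requirements can be made concrete with a toy case: when a digit stream is the expansion of a rational p/q, storing just the pair (p, q) is lossless and lets the stream be regenerated at any requested resolution. This sketch only illustrates that principle under our own naming; the actual NC algorithm and its benchmark are not reproduced here.

```python
# Hedged sketch of the "Arbitrary Bit Depth" / "Dynamic Upscale
# Regeneration" idea: a rational generator is a tiny, lossless
# footprint from which the digit stream can be re-expanded at any
# resolution. Not the paper's NC algorithm; names are ours.

from fractions import Fraction

def expand(frac, digits):
    """First `digits` decimal digits of frac's fractional part."""
    out, r = [], frac.numerator % frac.denominator
    for _ in range(digits):
        r *= 10
        out.append(str(r // frac.denominator))
        r %= frac.denominator
    return ''.join(out)

gen = Fraction(1, 7)              # two-integer "footprint"
short = expand(gen, 12)           # '142857142857'
long_ = expand(gen, 48)           # same generator, higher resolution
assert long_.startswith(short)    # upscale is consistent, not re-encoded
print(short)
```

A classical probabilistic compressor stores a fixed-resolution bit stream; here the stored object is the generator itself, so "decompressing to more digits" costs nothing but more division steps.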
4
Discrete Tomography Data Footprint Reduction by Information Conservation
EN
The first impact of Discrete Tomography (DT) applied to nanoscale technology has been to generate enormous quantities of data. Data Footprint Reduction (DFR) is the process of employing one or more techniques to store a given set of data in less storage space. The very best modern lossless compressors use classical probabilistic models only, and are unable to match high-end application requirements, like "Arbitrary Bit Depth" (ABD) resolution and "Dynamic Upscale Regeneration" (DUR), with full information conservation. This paper explores, at the core level, the basic properties and relationships of Q arithmetic needed to achieve full numeric information conservation and regeneration algorithmically. That knowledge shows strong connections to modular group theory and combinatorial optimization. Traditional Q arithmetic can even be regarded as a highly sophisticated open logic: a powerful and flexible LTR and RTL formal numeric language of languages, with self-defining consistent words and rules, starting from an elementary generator and relation. This new awareness can guide the development of successful, more convenient algorithms and applications.
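The RTL direction of the "regeneration" claim also has an elementary Q-arithmetic instance: a purely periodic decimal repetend d of length L encodes exactly the fraction d / (10^L - 1), so the generator can be recovered without loss from the repeating block. The helper below is our illustration, not the paper's construction.

```python
# Minimal illustration of exact regeneration in Q arithmetic (RTL
# direction): a purely periodic repetend of length L determines the
# fraction d / (10**L - 1), so the rational generator is recovered
# exactly. Our example, not the paper's algorithm.

from fractions import Fraction

def from_repetend(digits):
    """Exact fraction whose decimal expansion repeats `digits` forever."""
    L = len(digits)
    return Fraction(int(digits), 10**L - 1)   # Fraction auto-reduces

print(from_repetend('142857'))   # 1/7
print(from_repetend('076923'))   # 1/13
```

Together with long division in the LTR direction, this gives a lossless round trip between a finite repeating block and its rational generator, which is the elementary sense in which numeric information is conserved.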