Search results
Searched in keywords: algorithms
Found 152 results (page 1 of 8)
PL
This article presents a system for simulating and analyzing the states of switching fabrics. The system's main feature is that the computations are carried out in dedicated hardware: modules with Xilinx Spartan-3 FPGAs serve as the computing units. A dozen or so such modules are connected in a serial chain and operate under the control of a web application, which communicates with the computing nodes via a Raspberry Pi acting as a proxy between conventional software and the programmable hardware.
EN
This paper presents a system for hardware-accelerated simulation of blocking states in optical switching fabrics. A model of such a fabric is presented and the method of its analysis is described. Xilinx Spartan-3 FPGA chips perform the fast calculations, while a Raspberry Pi, a small PC, serves as the interface between the PC and the electronic part of the system. The system searches for blocking states in hardware and analyzes them in software through a GUI on the PC. Its main elements are: a web-based GUI, scripts and a database for storing results, a subsystem controlling the FPGA chips (the controller is implemented on the Raspberry Pi and its GPIOs), and 18 or more FPGA modules acting as computing engines.
EN
Algorithms based on singleton arc consistency (SAC) show considerable promise for improving backtrack search algorithms for constraint satisfaction problems (CSPs). The drawback is that even the most efficient of them is still comparatively expensive. Even when limited to preprocessing, they give overall improvement only when problems are quite difficult to solve with more typical procedures such as maintained arc consistency (MAC). The present work examines a form of partial SAC and neighbourhood SAC (NSAC) in which a subset of the variables in a CSP are chosen to be made SAC-consistent or neighbourhood-SAC-consistent. Such consistencies, despite their partial character, are still well-characterized in that algorithms have unique fixpoints. Heuristic strategies for choosing an effective subset of variables are described and tested, the best being choice by highest degree and a more complex strategy of choosing by constraint weight after random probing. Experimental results justify the claim that these methods can be nearly as effective as the corresponding full version of the algorithm in terms of values discarded or problems proven unsatisfiable, while significantly reducing the effort required to achieve this.
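The "choice by highest degree" heuristic mentioned in the abstract above can be sketched in a few lines. This is an illustrative fragment, not the authors' code; the constraint representation and the function name are assumptions:

```python
from collections import defaultdict

def highest_degree_subset(constraints, k):
    """Pick the k variables with the most constraint occurrences.

    constraints: iterable of (var_a, var_b) binary constraint scopes.
    Returns the chosen subset, highest degree first (ties broken by name).
    """
    degree = defaultdict(int)
    for a, b in constraints:
        degree[a] += 1
        degree[b] += 1
    return sorted(degree, key=lambda v: (-degree[v], v))[:k]
```

The returned subset would then be made SAC- or NSAC-consistent while the remaining variables are left at ordinary arc consistency.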
PL
The article discusses computer image-analysis methods known from aircraft gunsights (concealed by last century's military scientists within psychology and research on the linguistic description of images), from the analysis of images from a welding robot's camera, and from computer-aided microscopy. To build algorithms for recognizing microscopic structures of steel, the article applies a method known from linguistics: converting metaphorical statements into vectors, i.e., into a mathematical representation. The vector analysis takes into account inherited and learned types of abilities. An analysis result that is identical for the algorithms corresponding to the three ability types guarantees unambiguity; in other cases, additional knowledge about the technology by which the examined microscopic structures were formed is required.
4
Content available remote Implementacja środowiska testowego w języku R (Implementation of a test environment in the R language)
PL
The paper presents libraries and their use in implementing a test environment for swarm algorithms. The modules used make it possible to prepare a fully functional tool quickly. The test environment presented in this article was prepared to support research on swarm algorithms and can be used both as a network resource and as a locally available script. The methods used here can also serve to build test environments for many other scenarios unrelated to swarm algorithms.
EN
The work presents libraries and their application in implementing a test environment for swarm algorithms. The modules used make it possible to prepare a fully functional tool quickly. The test environment presented in this article was prepared to support research on swarm algorithms and can be used both as a network resource and as a locally available script. The methods used in this article can also be applied to build test environments for many other scenarios not related to swarm algorithms.
EN
In this article, we introduce Moschovakis' higher-order type theory of acyclic recursion, Lλar. We present the potential of Lλar for incorporating different reduction systems, with corresponding reduction calculi. First, we introduce the original reduction calculus of Lλar, which reduces Lλar-terms to their canonical forms. This calculus determines the relation of referential, i.e., algorithmic, synonymy between Lλar-terms with respect to a chosen semantic structure. Our contribution is the definition of a (γ) rule and the extension of the reduction calculus of Lλar and its referential synonymy to γ-reduction and γ-synonymy, respectively. The γ-reduction is very useful for simplifying terms in canonical form, as it reduces subterms that carry superfluous λ-abstractions and the corresponding functional applications. Typically, such extra λ-abstractions are introduced by the λ-rule of the reduction calculus of Lλar.
6
Content available remote Cloud Brokering with Bundles: Multi-objective Optimization of Services Selection
EN
Cloud computing has become one of the major computing paradigms. Not only has the number of offered cloud services grown exponentially, but many different providers now compete with very similar offerings. This situation should ultimately benefit customers, but because these services differ slightly both functionally and non-functionally (e.g., in performance, reliability, security), consumers may be confused and unable to make an optimal choice. Cloud service brokers have emerged to address this issue: a broker gathers information about services from providers and about the needs and requirements of customers, with the final goal of finding the best match. In this paper, we formalize and study a novel problem that arises in the area of cloud brokering. In its simplest form, brokering is a trivial assignment problem, but in more complex and realistic cases this no longer holds. The novelty of the presented problem lies in considering services that can be sold in bundles. Bundling is a common business practice in which a set of services is sold together for a lower price than the sum of the prices of the included services. This work introduces a multi-criteria optimization problem which can help customers determine the best IT solutions according to several criteria. The Cloud Brokering with Bundles (CBB) problem models the different IT packages (or bundles) found on the market while minimizing (or maximizing) different criteria. A proof of complexity is given for the single-objective case, and experiments were conducted for a special case with two criteria: the first being cost, the second artificially generated. We also designed and developed a benchmark generator based on real data gathered from 19 cloud providers. The problem is solved using an exact optimizer relying on a dichotomic search method.
The results show that dichotomic search can be successfully applied to small instances corresponding to typical cloud-brokering use cases, returning results within seconds. For larger problem instances solving times are not prohibitive, and solutions for large corporate clients could be obtained within minutes.
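As an illustration of the bundling idea described above (not the paper's dichotomic optimizer), a brute-force, single-objective sketch that picks the cheapest set of bundles covering a customer's required services might look like this; all names are hypothetical, and the exhaustive enumeration is only viable for small instances:

```python
from itertools import combinations

def cheapest_cover(bundles, required):
    """Find the cheapest combination of bundles covering all required services.

    bundles: list of (price, set_of_services) pairs.
    required: set of service names the customer needs.
    Returns (total_cost, chosen_bundles) or None if no combination covers.
    Exponential in len(bundles); fine for illustration only.
    """
    best = None
    for r in range(1, len(bundles) + 1):
        for combo in combinations(bundles, r):
            covered = set().union(*(services for _, services in combo))
            if required <= covered:
                cost = sum(price for price, _ in combo)
                if best is None or cost < best[0]:
                    best = (cost, combo)
    return best
```

A bundle can win even when it contains services the customer did not ask for, which is exactly the effect that makes the assignment non-trivial.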
PL
In recent years PSE (the Polish TSO) has commissioned several new special protection schemes of a load-relieving nature to protect transmission elements against overload. This type of automation is called load-shedding automation and typically cooperates with generating units. Its task is to detect the overload condition of network elements and to reduce the power output of the units, or to trip them, in order to obtain the required relief. The schemes are built with programmable logic controllers, and their decision algorithms are developed in a PLC environment that makes it easy to simulate and verify their operation in a clear way. The article describes functional solutions used in load-shedding automation.
EN
In recent years PSE, the Polish TSO, has launched several new Special Protection Schemes (SPS) to protect transmission elements against overload. This kind of automation is called load-shedding automation and cooperates with generating units in power plants. The task of such an SPS is to recognize the overload condition of network elements and to reduce the power generation of the units, or to trip them, in order to obtain the appropriate relief. The schemes are built on programmable logic controllers. The SPS algorithms are developed in a PLC environment, which allows their operation to be simulated and verified in an easy and transparent way. The article describes functional solutions used in load-shedding automation.
PL
The aim of the article was to compare three selected multi-criteria optimization algorithms for planning 802.11b/g WLANs in an indoor environment with infrastructure. The Zero Unitarization Method (MUZ) was proposed for selecting the algorithm and its operating parameters, as well as the best solution.
EN
The aim of the article was to compare three selected algorithms for multi-criteria planning of indoor 802.11b/g WLANs with infrastructure. The Zero Unitarization Method (MUZ) was proposed both for choosing the algorithm and its operating parameters and for selecting the best solution.
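The Zero Unitarization Method (MUZ) referred to above normalizes each criterion to [0, 1] and aggregates the normalized scores. A minimal sketch, assuming non-constant criterion columns and an unweighted sum (function names are illustrative):

```python
def muz_normalize(values, kind="stimulant"):
    """Zero Unitarization: map one criterion's values onto [0, 1].

    Assumes the values are not all equal (span > 0).
    """
    lo, hi = min(values), max(values)
    span = hi - lo
    if kind == "stimulant":          # larger is better
        return [(v - lo) / span for v in values]
    else:                            # destimulant: smaller is better
        return [(hi - v) / span for v in values]

def muz_rank(alternatives, kinds):
    """Return the index of the best alternative.

    alternatives: rows of raw criterion values, one row per alternative.
    kinds: "stimulant"/"destimulant" per criterion column.
    """
    cols = list(zip(*alternatives))
    norm = [muz_normalize(col, kind) for col, kind in zip(cols, kinds)]
    scores = [sum(col[i] for col in norm) for i in range(len(alternatives))]
    return scores.index(max(scores))
```

In practice the criteria would also carry weights; the unweighted sum here keeps the sketch minimal.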
PL
The article presents the results of simulation studies of the properties of selected digital frequency-estimation algorithms under the operating conditions of instruments for accurate measurements of sinusoidal voltage. A newly developed frequency-measurement algorithm, based on measuring the phase shift using the DFT (Discrete Fourier Transform), was applied, along with three spectrum-interpolation algorithms known from the literature and a total least squares (TLS) algorithm. The results indicate that the developed algorithm has the best properties.
EN
The article presents the results of simulations of the properties of selected digital frequency-estimation algorithms under the operating conditions of instruments for accurate measurements of sinusoidal voltage. The developed algorithm, based on a phase-shift measurement using the DFT (Discrete Fourier Transform), was compared with four algorithms known from the literature: three with spectral interpolation and a total least squares (TLS) algorithm. The research results indicate that the developed algorithm is characterized by the best properties.
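The phase-shift approach to frequency estimation can be illustrated with a simple two-frame DFT estimator: the frequency is recovered from the phase advance of the peak bin between two overlapping frames. This is a generic sketch, not the article's algorithm; the frame and hop sizes are arbitrary choices:

```python
import numpy as np

def dft_phase_freq(x, fs, n=1024, hop=256):
    """Estimate the frequency of a sinusoid from the DFT phase advance
    between two frames separated by `hop` samples."""
    win = np.hanning(n)
    X1 = np.fft.rfft(x[:n] * win)
    X2 = np.fft.rfft(x[hop:hop + n] * win)
    k = np.argmax(np.abs(X1))              # peak bin of the first frame
    dphi = np.angle(X2[k]) - np.angle(X1[k])
    expected = 2 * np.pi * k * hop / n     # phase advance of bin k over `hop`
    # deviation of the measured advance from the bin's own, wrapped to [-pi, pi)
    dev = (dphi - expected + np.pi) % (2 * np.pi) - np.pi
    return (expected + dev) * fs / (2 * np.pi * hop)
```

For a clean tone this resolves the frequency well below the bin spacing fs/n, which is the point of phase-based estimation over plain peak picking.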
10
Content available remote Method for deisotoping based on fuzzy inference systems
EN
Proteins are highly significant molecules that can constitute the fingerprint of cancer. When dealing with large molecules such as proteins, the crucial issue is their reliable and precise identification. In the majority of cases, mass spectrometry is used to identify the protein. Processing the data gathered in a mass-spectrometry experiment consists of several steps, one of which is deisotoping. It is an essential part of preprocessing, because some peaks in the spectrum are not unique compounds but members of an isotopic envelope. Several deisotoping methods exist, but none of them is general enough to be used in arbitrary experimental settings. To address this, we propose a new algorithm based on fuzzy inference systems. The method was tested on data provided by the Institute of Oncology in Gliwice, gathered in MALDI experiments under two different settings on head and neck cancer tissue samples. A comparison between the developed fuzzy-based algorithm and the mMass method revealed that the proposed method identified isotopic envelopes more consistent with the expert annotation.
PL
The paper presents a new algorithm for identifying isotopic envelopes in MALDI-ToF proteomic spectra. In recent years proteomics, together with genetics and transcriptomics, has strongly supported the diagnostics of cancer. Precise identification of the proteins present in the cancer region is very important, as it allows the process of carcinogenesis to be understood and an appropriate therapy to be planned. Mass spectrometry, specifically the technique called MALDI-ToF (Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry), is commonly used to acquire mass spectra, which carry information about the number of ions with a given mass-to-charge ratio. The signal preprocessing stage requires, among other things, noise removal, baseline removal, and normalization. Identification of isotopic envelopes is a further preprocessing step that removes redundancy and reduces the dimensionality of the data. Many envelope-identification algorithms exist, but each is designed for a different mass-spectrometry technique (MALDI, LC-MS, ESI, etc.) or for a specific kind of molecule. The proposed algorithm is based on the theory of fuzzy systems, and its inference rules were developed in cooperation with a team of mass-spectrometry experts. It was tested on head-and-neck cancer data obtained from the Maria Skłodowska-Curie Institute of Oncology in Gliwice. The results of the authors' envelope-identification algorithm were compared with one of the existing identification methods.
PL
Part I of the article presents the algorithms used to determine adjustment coefficients based on two criteria: maximum utilization of the cross-section in the ultimate limit state and maximum deflection in the serviceability limit state.
EN
The first part of the article presents the algorithms used to determine the adjustment coefficients based on two criteria: maximum utilization of the cross-section in the ultimate limit state and maximum deflection in the serviceability limit state.
12
EN
The paper discusses the need for recommendations and the basic recommendation systems and algorithms. The second part presents the design and implementation of a recommender system for an online art gallery (photos, drawings, and paintings). The customized recommendation algorithm is based on a collaborative filtering technique using the similarity between objects, improved with information from the user profile. Finally, conclusions on the performance of the algorithm are formulated.
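Item-to-item collaborative filtering of the kind described above typically predicts a rating for an unseen item from the user's ratings of similar items, with cosine similarity as the weight. A minimal sketch; the data layout and function names are assumptions, and the paper's user-profile refinement is omitted:

```python
import math

def cosine(u, v):
    """Cosine similarity of two rating vectors (0 = unrated)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(ratings, user, k=1):
    """Rank the user's unrated items by similarity-weighted predicted rating.

    ratings: dict item -> list of ratings indexed by user (0 = unrated).
    """
    items = list(ratings)
    seen = [i for i in items if ratings[i][user] > 0]
    unseen = [i for i in items if ratings[i][user] == 0]
    scores = {}
    for cand in unseen:
        sims = [(cosine(ratings[cand], ratings[s]), ratings[s][user])
                for s in seen]
        num = sum(sim * r for sim, r in sims)
        den = sum(abs(sim) for sim, _ in sims)
        scores[cand] = num / den if den else 0.0
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Treating unrated entries as zeros is a simplification; a production system would compute similarities over co-rated users only.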
EN
This article presents a combinatorial algorithm to find a shortest triangular path (STP) between two points inside a digital object imposed on a triangular grid; it runs in O(n/g log n/g) time, where n is the number of pixels on the contour of the object and g is the grid size. Initially, the inner triangular cover that maximally inscribes the object is constructed to ensure that the path lies within the object. An appropriate bounding parallelogram is considered, with the two points in diagonally opposite corners, and one of the semi-perimeters of the parallelogram is traversed. Combinatorial rules, formulated from the properties of the triangular grid, are applied during the traversal whenever required to shorten the triangular path. A shortest triangular path between two points may not be unique, so a second combinatorial algorithm is presented that finds the family of shortest triangular paths (FSTP), i.e., the region containing all possible shortest triangular paths between two given points inside a digital object, also in O(n/g log n/g) time. Experimental results verify the correctness, robustness, and efficacy of the algorithms. STP and FSTP can be useful for shape analysis of digital objects and for determining shape signatures.
EN
In this paper we introduce a new ranking algorithm, called Collaborative Judgement (CJ), that takes into account peer opinions of agents and/or humans on objects (e.g. products, exams, papers) as well as peer judgements over those opinions. The combination of these two types of information has not been studied in previous work in order to produce object rankings. Here we apply Collaborative Judgement to the use case of scientific paper assessment and we validate it over simulated data. The results show that the rankings produced by our algorithm improve current scientific paper ranking practice, which is based on averages of opinions weighted by their reviewers’ self-assessments.
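The core idea above, weighting each opinion by how peers judge its author, can be sketched as follows. This is an illustrative reduction of the idea, not the CJ algorithm from the paper:

```python
def cj_score(opinions, judgements):
    """Aggregate opinions on one object, weighted by peer judgements.

    opinions: reviewer -> score given to the object.
    judgements: reviewer -> list of peer judgements (0..1) of that
                reviewer's opinions; their mean becomes the weight.
    """
    weight = {r: sum(j) / len(j) for r, j in judgements.items()}
    total = sum(weight[r] for r in opinions)
    return sum(weight[r] * s for r, s in opinions.items()) / total
```

Contrast this with the practice the paper criticizes, where the weights come from reviewers' own self-assessments rather than from peer judgements.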
EN
Predicting the secondary structure of a protein using a lattice model is one of the most studied computational problems in bioinformatics. Here the secondary, or three-dimensional, structure of a protein is predicted from its amino acid sequence; the secondary structure refers to the local sub-structures of the protein. Simplified energy models based on the interactions of amino acid residues have been proposed in the literature; we focus on the well-researched Hydrophobic-Polar (HP) energy model. In this paper, we propose the hexagonal prism lattice with diagonals, which can overcome the problems of other lattice structures, e.g., the parity problem. We give two approximation algorithms for protein folding on this lattice under the HP model. Our first algorithm yields a structure similar to the helix structure commonly found in proteins; this motivates our second algorithm, which has a better approximation ratio. Finally, we analyze the algorithms in terms of the intensity of the chemical forces along the different types of edges of the hexagonal prism lattice with diagonals.
EN
Computed Tomography (CT) is an imaging technique that reconstructs volumetric information about the analyzed objects from their projections. The most popular reconstruction technique is Filtered Back Projection (FBP). It has the advantage of being the fastest technique available, but the disadvantage of requiring a large number of projections to produce good-quality reconstructions. In this article we propose a segmentation method for tomographic volumes composed of a few materials. Our method combines existing high-quality variational segmentation frameworks with the data-consistency approach used in tomography and discrete tomography. We show that our algorithm performs well under a high noise level and with a moderately low number of projections, and that the data consistency significantly improves the segmentation, at the cost of only one FBP reconstruction and one forward projection.
EN
The one-dimensional Φ4 Model generalizes a harmonic chain with nearest-neighbor Hooke’s-Law interactions by adding quartic potentials tethering each particle to its lattice site. In their studies of this model Kenichiro Aoki and Dimitri Kusnezov emphasized its most interesting feature: because the quartic tethers act to scatter long-wavelength phonons, Φ4 chains exhibit Fourier heat conduction. In his recent Snook-Prize work Aoki also showed that the model can exhibit chaos on the three-dimensional energy surface describing a two-body two-spring chain. That surface can include at least two distinct chaotic seas. Aoki pointed out that the model typically exhibits different kinetic temperatures for the two bodies. Evidently few-body Φ4 problems merit more investigation. Accordingly, the 2018 Prizes honoring Ian Snook (1945-2013) will be awarded to the author(s) of the most interesting work analyzing and discussing few-body Φ4 models from the standpoints of dynamical systems theory and macroscopic thermodynamics, taking into account the model’s ability to maintain a steady-state kinetic temperature gradient as well as at least two coexisting chaotic seas in the presence of deterministic chaos.
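The Φ4 chain described above, a harmonic nearest-neighbour coupling plus a quartic on-site tether, is commonly written (up to choices of the coupling constants, which the abstract does not fix) as:

```latex
H \;=\; \sum_i \left[ \frac{p_i^2}{2} \;+\; \frac{(q_{i+1}-q_i)^2}{2} \;+\; \frac{q_i^4}{4} \right]
```

The quartic terms break the translational symmetry of the harmonic chain, scattering long-wavelength phonons; this is what restores Fourier heat conduction.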
18
Content available remote Ergodic Isoenergetic Molecular Dynamics for Microcanonical-Ensemble Averages
EN
Considerable research has led to ergodic isothermal dynamics that can replicate Gibbs' canonical distribution for simple (small) dynamical problems. Adding one or two thermostat forces to the Hamiltonian motion equations can give an ergodic isothermal dynamics to a harmonic oscillator, to a quartic oscillator, and even to the "Mexican-Hat" (double-well) potential problem. We consider here a time-reversible dynamical approach to Gibbs' "microcanonical" (isoenergetic) distribution for simple systems. To enable isoenergetic ergodicity we add occasional random rotations to the velocities. This idea conserves energy exactly and can be made to cover the entire energy shell with an ergodic dynamics. We entirely avoid the Poincaré-section holes and island chains typical of Hamiltonian chaos. We illustrate this idea with the simplest possible two-dimensional example: a single particle moving in a periodic square-lattice array of scatterers, the "cell model".
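The random velocity rotations mentioned above conserve kinetic energy exactly, because a rotation preserves the norm of the velocity vector. A two-dimensional sketch (the function name is illustrative):

```python
import math
import random

def random_rotation(vx, vy, rng=random):
    """Rotate the 2D velocity by a uniformly random angle.

    Kinetic energy is unchanged: |v'| == |v| for any rotation angle.
    """
    theta = rng.uniform(0.0, 2.0 * math.pi)
    c, s = math.cos(theta), math.sin(theta)
    return c * vx - s * vy, s * vx + c * vy
```

Applying such a rotation occasionally between Hamiltonian steps redistributes the direction of motion without touching the energy shell.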
EN
In this paper two recent methods of solving the repeatable inverse kinematics task are compared. The methods differ substantially, although both are rooted in optimization techniques: the first is based on the paradigm of continuation methods, while the second takes advantage of consecutive approximations. The methods are compared on the quality of the results they provide and on other quantitative and qualitative factors. To obtain a statistically meaningful comparison, data were collected from simulations performed on pendulum robots with different paths to follow, initial configurations, and degrees of redundancy.
EN
This paper presents the outcome of a pre-project that resulted in an initial version (prototype) of an automated assessment algorithm for a specific maritime operation. The prototype is based on identified control requirements that human operators must meet to conduct safe navigation. Current methods of assessing navigation in simulators involve subject-matter experts, whose evaluations unfortunately have limitations related to reproducibility and consistency; automated assessment algorithms may address these limitations. As a prototype, our algorithm showed a strong correlation with evaluations performed by subject-matter experts in the assessment of navigation routes. The results indicate that further research on automated assessment of maritime navigation has merit. The algorithm can be a stepping stone toward a consistent, unbiased, and transparent assessment module for evaluating maritime navigation performance.