Results found: 85

Search results
Searched for:
in keywords: data fusion
EN
70 species of the grass family (Poaceae), from the genera Agrostis, Alopecurus, Anthoxanthum, Apera, Arrhenatherum, Avena, Brachypodium, Briza, Bromus, Calamagrostis, Corynephorus, Cynosurus, Dactylis, Danthonia, Deschampsia, Digitaria, Echinochloa, Elymus, Eragrostis, Festuca, Glyceria, Helictotrichon, Hierochloe, Holcus, Hordeum, Koeleria, Leymus, Lolium, Milium, Molinia, Nardus, Panicum, Phalaris, Phleum, Phragmites, Poa, Saccharum and Setaria, collected mostly from natural stands in Poland during the 2020 season, were subjected to GC-MS fingerprinting of the headspace volatile fraction above the dried material. The obtained mass spectrometry data were analyzed by principal component analysis (PCA) and hierarchical cluster analysis (HCA). Five species, Glyceria maxima (Hartm.) Holmb., Lolium multiflorum Lam., Hordeum jubatum L., Bromus tectorum L. and Bromus secalinus L., were identified as outliers, which is consistent with our earlier analysis by thin layer chromatography. These species deserve a closer look, and their outlier status is orthogonal to coumarin content, which was independently observed for odorant grass species.
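For illustration, a minimal sketch of how such a PCA/HCA screening could look, assuming a hypothetical feature matrix X of integrated peak areas (70 species by 120 volatile features); the paper's actual preprocessing and peak table are not reproduced:

```python
# Sketch: PCA + hierarchical clustering of GC-MS headspace fingerprints.
# The feature matrix X (species x integrated peak areas) is a placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.random((70, 120))            # 70 species, 120 volatile features (placeholder)
X_std = StandardScaler().fit_transform(X)

scores = PCA(n_components=2).fit_transform(X_std)   # PCA score-plot coordinates

Z = linkage(X_std, method="ward")    # hierarchical cluster analysis (HCA)
clusters = fcluster(Z, t=5, criterion="maxclust")

# Species far from the bulk of the PCA scores can be flagged as outliers.
dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
outliers = np.argsort(dist)[-5:]
print(outliers, clusters[outliers])
```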
EN
To evaluate the health degree of a vehicle transmission system, this paper establishes a prediction model based on a multi-level data fusion method. The model applies the PSO (Particle Swarm Optimization)-BP (Back Propagation) neural network algorithm and calculates the whole-machine health degree and the respective weights of each module from the test data. On this basis, the error between the model-calculated health degree and the theoretical health degree is analyzed, verifying the validity and accuracy of the prediction model. The health degree obtained by fusing the feature parameters of a single module is compared with the vehicle transmission system health degree and is found to be less effective than the three-level fusion. Analysis of the multi-parameter feature weights of the vehicle transmission system shows that the mechanical module accounts for the largest damage rate, and that the three modules influence the transmission system health degree in the order of mechanical module, hydraulic module, and electric control module. The study provides guidance for the health management of complex equipment.
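As a rough illustration of the fusion idea only, the sketch below estimates module weights and a fused whole-machine health degree by ordinary least squares on synthetic data; the paper itself uses a PSO-optimized BP neural network, which is not reproduced here:

```python
# Sketch: estimate module weights from test data and fuse module health
# degrees into a whole-machine health degree. Least squares stands in for
# the paper's PSO-BP network; all data below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
H_modules = rng.uniform(0.6, 1.0, size=(50, 3))    # mechanical, hydraulic, electric control
true_w = np.array([0.5, 0.3, 0.2])
H_machine = H_modules @ true_w + rng.normal(0, 0.01, 50)   # "theoretical" health degree

w, *_ = np.linalg.lstsq(H_modules, H_machine, rcond=None)  # fitted module weights
H_pred = H_modules @ w
print("weights:", w.round(3), "MAE:", np.abs(H_pred - H_machine).mean().round(4))
```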
EN
The feature-extraction step is a major and crucial step in analyzing and understanding raw data, as it has a considerable impact on system accuracy. Despite the very acceptable results that have been obtained by many handcrafted methods, these can unfortunately have difficulty representing features in the cases of large databases or with strongly correlated samples. In this context, we attempt to examine the discriminability of texture features by proposing a novel, simple, and lightweight method for deep feature extraction to characterize the discriminative power of different textures. We evaluated the performance of our method by using a palm-print-based biometric system, and the experimental results (using the CASIA multispectral palm-print database) demonstrate the superiority of the proposed method over the latest handcrafted and deep methods.
EN
The development of technology is progressing in every area of our lives. Each of us has an electronic device that emits electromagnetic energy. The radiation of radio signals is an essential element of wireless communication, satellite navigation and real-time monitoring. Modern armies have technological solutions based on wireless technologies that improve operational efficiency, increase situational awareness and allow faster decision making; at the same time, in the electromagnetic spectrum, equipment using electromagnetic energy can be compared to a flashlight: the enemy can observe “glowing” points on spectrograms and frequency-domain images. This makes it easy to detect and localise the target and then to eliminate it. This is a domain where radio-electronic reconnaissance techniques work well, a type of military reconnaissance that uses the electromagnetic spectrum to gather information about the enemy. One of the methods is spectrum sensing, based on the analysis of received radio signals. Currently, it is increasingly argued that information from one sensor is not enough. It is necessary to collect reconnaissance products from many devices and then to perform effective data fusion. DF (Data Fusion) algorithms allow cooperative sensing of the electromagnetic spectrum, which translates into a higher probability of signal detection. It is worth considering the introduction of software-defined radio and unmanned aerial vehicle solutions in order to miniaturise reconnaissance systems and to increase their range through the use of flying platforms. Implementations of unmanned systems and artificial intelligence algorithms, capable of making quick and accurate decisions, will help to avoid human losses.
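A minimal sketch of the kind of cooperative spectrum sensing the abstract refers to, assuming a simple energy detector at each sensor and a majority-vote fusion rule at a fusion centre; the threshold and signal model are illustrative only:

```python
# Sketch: cooperative spectrum sensing with hard-decision fusion.
# Each sensor runs an energy detector; the fusion centre applies a
# k-out-of-N (here: majority) rule. All values are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_sensors, n_samples = 5, 1024
signal_present = True

decisions = []
for _ in range(n_sensors):
    noise = rng.normal(0, 1, n_samples)
    x = noise + (0.3 * rng.normal(0, 1, n_samples) if signal_present else 0)
    energy = np.mean(x ** 2)
    decisions.append(energy > 1.05)          # per-sensor energy-detection threshold

fused = sum(decisions) >= (n_sensors // 2 + 1)   # majority fusion rule
print("local decisions:", decisions, "fused decision:", fused)
```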
EN
To achieve comprehensive analyses, the presentation of combined geophysical results usually involves separate imaging and the combination of various results. At present, few studies have considered the degree of correlation and unified imaging of different types of geophysical data. We establish a set of data fusion imaging methods for multiple geophysical data based on their reflection coefficients. As geophysical exploration results are primarily provided through waveform and resistivity sections, waveform and resistivity data were selected for fusion and converted into reflection coefficients, with ground-penetrating radar (GPR) and surface electrical resistivity tomography (ERT) taken as examples. Re-sampling and feature reconstruction were performed to unify the data in space and resolution. Finally, principal component analysis was used to calculate the correlation of the reconstructed reflection coefficients and to perform data fusion; this led to unified imaging based on the reflection coefficient of the considered geophysical data. Numerical simulation analyses and field experiments proved the efficacy of this method for producing unified imaging of multiple geophysical data. In summary, we provide a novel method for the unified interpretation of multiple geophysical data that enhances the ability to identify geological interfaces and anomaly distributions.
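A minimal sketch of the fusion idea, assuming two co-located 1-D sections already converted to reflection coefficients (the placeholder arrays rc_gpr and rc_ert); the paper's feature-reconstruction step is reduced to simple resampling before PCA:

```python
# Sketch: fuse two co-located geophysical sections via their reflection
# coefficients and PCA. Grids and values are placeholders.
import numpy as np
from sklearn.decomposition import PCA

depth_gpr = np.linspace(0, 20, 200)
depth_ert = np.linspace(0, 20, 50)
rc_gpr = np.sin(depth_gpr / 3) * 0.10         # placeholder GPR reflection coefficients
rc_ert = np.sin(depth_ert / 3) * 0.12         # placeholder ERT reflection coefficients

# Re-sample both sections onto a common depth axis.
depth = np.linspace(0, 20, 100)
rc_gpr_i = np.interp(depth, depth_gpr, rc_gpr)
rc_ert_i = np.interp(depth, depth_ert, rc_ert)

X = np.column_stack([rc_gpr_i, rc_ert_i])
fused = PCA(n_components=1).fit_transform(X)[:, 0]   # first principal component as fused trace
print(fused[:5])
```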
EN
Total electron content (TEC) maps are chosen as the elementary structures for providing ionospheric corrections that improve positional accuracy for Global Navigation Satellite System (GNSS) users. The availability of total electron content data from a multi-constellation of satellite systems and various ground-based instruments provides the ability to monitor, nowcast and forecast the behavior of the ionosphere. However, combining ionospheric TEC data from different temporal and spatial scales is a difficult task when augmenting the accuracy of either ground- or space-based ionospheric models. Hence, a method such as data fusion is essential to illustrate ionospheric variability and to improve the accuracy of ionospheric models under equatorial ionization anomaly (EIA) conditions. This paper presents a weighted least squares data fusion method applied to multi-instrument TEC data to analyze EIA TEC structures in the low-latitude Indian region. Both ground-based (GPS TEC from 26 stations in the Indian region) and space-based (FORMOSAT-3/COSMIC RO and the SWARM mini satellite constellation) observations are used for the analysis. A spherical harmonic function (SHF) model of order 2, which gives nine SHF coefficients, is implemented. The analysis illustrates that the SHF coefficients, followed by TEC data fusion, are useful for investigating the entry, occupancy and exit TEC structures of the EIA during geomagnetic storm conditions.
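A minimal sketch of a weighted least-squares fit of a degree-2, nine-coefficient surface harmonic model to TEC observations, using an unnormalized real spherical-harmonic basis and synthetic data; the paper's actual instrument weighting is not reproduced:

```python
# Sketch: weighted least-squares estimation of nine degree-2 surface
# harmonic coefficients from synthetic multi-instrument TEC observations.
import numpy as np

def shf_basis(lat_deg, lon_deg):
    t = np.radians(90.0 - lat_deg)            # colatitude
    p = np.radians(lon_deg)
    return np.column_stack([
        np.ones_like(t),
        np.cos(t), np.sin(t) * np.cos(p), np.sin(t) * np.sin(p),
        0.5 * (3 * np.cos(t) ** 2 - 1),
        np.sin(t) * np.cos(t) * np.cos(p), np.sin(t) * np.cos(t) * np.sin(p),
        np.sin(t) ** 2 * np.cos(2 * p), np.sin(t) ** 2 * np.sin(2 * p),
    ])

rng = np.random.default_rng(3)
lat = rng.uniform(0, 30, 200)                 # low-latitude region (placeholder)
lon = rng.uniform(70, 90, 200)
tec = 30 + 10 * np.sin(np.radians(lat)) + rng.normal(0, 2, 200)   # synthetic TEC (TECU)
w = 1.0 / np.full(200, 2.0) ** 2              # inverse-variance observation weights

A = shf_basis(lat, lon)
W = np.diag(w)
coeffs = np.linalg.solve(A.T @ W @ A, A.T @ W @ tec)   # nine SHF coefficients
print(coeffs.round(3))
```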
EN
Information fusion approaches have been commonly used in multi-sensor environments to fuse and group data from various sensors, which is then used to draw a meaningful interpretation of the data. Traditional information fusion methods have limitations such as the high time complexity of the fusion process and a poor recall rate. In this work, a new multi-channel nano sensor information fusion method based on a neural network has been designed. After analyzing the principles of information fusion methods, a back-propagation neural network (BP-NN) is devised. Based on the design of the relevant algorithm flow, information is collected, processed, and normalized. The algorithm is then trained, and output is generated to achieve the fusion of multi-channel nano sensor information. Moreover, an error function is utilized to reduce the fusion error. The results of the present study show that, compared with conventional methods, the proposed method achieves quicker fusion (integration of relevant data) and a higher recall rate, indicating higher efficiency and reliability. The proposed method can be applied in many applications to integrate data for further analysis and interpretation.
EN
Longwall areas are among the most hazardous places in mines. They emit a considerable amount of methane into the ventilation air. The emission depends on many, mostly known, factors. The article presents the research results on changes in methane concentration along the longwall excavations and the longwall itself. The distributions were obtained from a measurement experiment at the ZG Brzeszcze mine in Poland. The author’s research aimed to determine experimentally the methane concentration as a function of excavation length for the longwall excavations and the longwall. As a result, methane concentration trends along the excavations were obtained. The conclusions show the pros and cons of the method used and allow setting the right direction for the development of measurement systems and sensors.
EN
Efficient energy use in wireless sensor networks (WSNs) is one of the major challenges due to limited battery energy and computation capacity. Therefore, in this paper, we propose combining a chain-based routing scheme with data fusion of sensor information (CRSDF for short). CRSDF comprises two major components. First, the chain-based routing method connects sensor nodes into a chain in which each node transmits only to its nearest neighbor, using the remaining energy and the distance between nodes as the parameters that determine which node is selected as the chain leader. Second, we fuse and compress one or more data packets into a single small result packet based on the Slepian-Wolf and Dempster-Shafer theories. The simulation results show that the energy efficiency of the proposed protocol can be improved by 40%, 20%, and 15% compared to low-energy adaptive clustering hierarchy (LEACH), power-efficient gathering in sensor information systems (PEGASIS), and an improved energy-efficient PEGASIS-based protocol, respectively.
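A minimal sketch of Dempster's rule of combination for two sensor mass functions over a two-element frame {A, B}; the Slepian-Wolf compression side of the protocol is not reproduced, and the mass values are illustrative:

```python
# Sketch: Dempster's rule of combination for two basic belief assignments.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for s1, s2 in product(m1, m2):
        inter = s1 & s2
        mass = m1[s1] * m2[s2]
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mass
        else:
            conflict += mass                       # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

m1 = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.1, frozenset({"A", "B"}): 0.3}
m2 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.2, frozenset({"A", "B"}): 0.3}
print(dempster_combine(m1, m2))
```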
EN
Travel time estimation for freeways has attracted much attention from researchers and traffic management departments. Because of various uncertain factors, travel time on a freeway is stochastic. To obtain accurate travel time estimates for a freeway, this paper proposes two traffic sensor location models that minimize the error of travel time estimation and maximize the collected traffic flow. First, a dynamic optimal location model for mobile sensors is proposed under the assumption that there are no traffic sensors on the freeway. Next, a dynamic optimal combinatorial model for adding mobile sensors, which takes account of fixed sensors already on the freeway, is presented. It should be pointed out that data fusion technology is adopted in the second optimization model to handle the data collected from multiple sensors. A simulated annealing algorithm is then designed to solve the two proposed optimization models. Numerical examples demonstrate that dynamic optimization of mobile sensor locations gives more accurate freeway travel time estimates than the conventional location model.
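A minimal sketch of a generic simulated-annealing loop for placing mobile sensors along a freeway segment; the cost function below (largest gap between sensors) is only a placeholder for the paper's travel-time estimation error objective:

```python
# Sketch: simulated annealing over candidate sensor positions on a 20 km road.
import math
import random

random.seed(4)
N_SENSORS, ROAD_KM = 4, 20.0

def cost(positions):
    # Placeholder objective: penalise uneven spacing of sensors.
    pts = sorted(positions) + [ROAD_KM]
    gaps = [b - a for a, b in zip([0.0] + sorted(positions), pts)]
    return max(gaps)

state = [random.uniform(0, ROAD_KM) for _ in range(N_SENSORS)]
best, best_cost, T = list(state), cost(state), 5.0
while T > 1e-3:
    cand = [min(ROAD_KM, max(0.0, x + random.gauss(0, 0.5))) for x in state]
    d = cost(cand) - cost(state)
    if d < 0 or random.random() < math.exp(-d / T):   # accept better or, sometimes, worse moves
        state = cand
        if cost(state) < best_cost:
            best, best_cost = list(state), cost(state)
    T *= 0.995                                        # cooling schedule
print(sorted(round(x, 2) for x in best), round(best_cost, 2))
```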
EN
Purpose: The term data fusion is used in various technologies in which a significant element is the ability to combine data of different types coming from diverse sources. Currently, the field of DF is developing into an interdisciplinary area connected with 'agile' synthesis of data (information) concerning phenomena and objects. An optimal environment for carrying out data fusion is sensor networks (SN), in which the DF process is carried out at the data stage, most often automatically, with the use of probabilistic data association algorithms. The purpose of this article was the implementation of a neural network and its adaptation to the process of data fusion and to solving a value prediction problem. Design/methodology/approach: The conducted experiment concerned modelling an artificial neural network to form the radiation beam of a microstrip antenna. The MATLAB environment was used in the research. Findings: The conducted experiment shows that, depending on the type of output data set and the task given to the ANN, the effect of the neural network's learning depends on the type of activation function. The described and implemented network learns effectively for different activation functions, predicts results, and is able to generalize facts on the basis of the patterns learnt. Research limitations/implications: Without doubt, it is possible to improve the network model and provide better results than those presented in the paper by modifying the number of hidden layers, the number of neurons, the learning-step value, or the learning algorithm itself. Originality/value: The paper presents the implementation of a sensor network in the context of the data fusion process and solution prediction. The paper should be read by persons whose research interests are focused on decision support by information and communication technologies.
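A minimal sketch of the activation-function comparison, using a stand-in one-dimensional regression target; the original experiment was carried out in MATLAB on microstrip antenna beam data, which is not reproduced here:

```python
# Sketch: the same small feed-forward network trained with different
# activation functions on a placeholder regression task.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, (300, 1))
y = np.sinc(3 * X[:, 0]) + rng.normal(0, 0.02, 300)   # placeholder "beam" shape

for act in ("logistic", "tanh", "relu"):
    net = MLPRegressor(hidden_layer_sizes=(20, 20), activation=act,
                       max_iter=5000, random_state=0)
    net.fit(X, y)
    print(act, round(net.score(X, y), 4))             # R^2 on the training set
```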
EN
The recent rapid improvement of nautical equipment functionality allows one to better observe and predict the dangers related to seamanship. However, these new features come with added complexity, and large amounts of information can overwhelm vessel crews and fleet operation centers; current state-of-the-art tools cannot filter the data down to only what is most important for a given time and location. This paper presents the concepts and the algorithms of a software suite that provides the user with problem-oriented advice about a particular risk endangering a vessel and its crew. Based on the calculated navigational dangers and their predicted development, actionable guidance is proposed in easy-to-understand human language. The quality of good seamanship is improved by a holistic approach to vessel installation, automated priority queuing at fleet operation centers, and the evaluation of crew performance during simulator training and daily operations. Both the software user interface and the insights provided by the algorithm are discussed.
EN
Digital signal processing, such as filtering, information extraction, and fusion of various results, is currently an integral part of advanced medical therapies. It is especially important in neurosurgery during deep-brain stimulation procedures. In such procedures, the surgical target is accessed using special electrodes while not being directly visible. This requires very precise identification of brain structures in 3D space throughout the surgery. In the case of deep-brain stimulation surgery for Parkinson’s disease (PD), the target area—the subthalamic nucleus (STN)—is located deep within the brain. It is also very small (just a few millimetres across), which makes this procedure even more difficult. For this reason, various signals are acquired, filtered, and finally fused, to provide the neurosurgeon with the exact location of the target. These signals come from preoperative medical imaging (such as MRI and CT), and from recordings of brain activity carried out during surgery using special brain-implanted electrodes. Using the method described in this paper, it is possible to construct a decision-support system that, during surgery, analyses signals recorded within the patient’s brain and classifies them as recorded within the STN or not. The constructed classifier discriminates signals with a sensitivity of 0.97 and a specificity of 0.96. The described algorithm is currently used for deep-brain stimulation surgeries among PD patients.
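A minimal sketch of how the quoted sensitivity and specificity of an STN / non-STN classifier are derived from a confusion matrix, using synthetic labels and predictions; the paper's signal features and classifier are not reproduced:

```python
# Sketch: sensitivity and specificity from a binary confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(6)
y_true = rng.integers(0, 2, 500)                     # 1 = signal recorded within the STN
flip = rng.random(500) < 0.04                        # ~4% of cases misclassified (synthetic)
y_pred = np.where(flip, 1 - y_true, y_true)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                         # true-positive rate
specificity = tn / (tn + fp)                         # true-negative rate
print(round(sensitivity, 3), round(specificity, 3))
```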
14. Radar data fusion in the STRADAR system
EN
The main task of the Polish Border Guard is the protection of the country’s border, which requires the utilization of multimedia surveillance systems that automatically gather, process and share various data. The paper presents such a system, developed for the Maritime Division of the Polish Border Guard within the STRADAR project, and the problem of fusion of radar data in this system. The system, apart from providing communication means, gathers data from AIS, GPS and radar receivers: ARPA and SCANTER 2001. The paper describes the concept of radar data gathering in the STRADAR system, with a detailed presentation of the radar servers, the Radar INT modules and a reduplication (fusion) module, and proposes an algorithm for radar data fusion.
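A minimal, hypothetical sketch of radar plot deduplication using a nearest-neighbour distance gate; this is only a stand-in for the STRADAR reduplication (fusion) module, whose actual algorithm is not reproduced here:

```python
# Sketch: merge plots of the same target reported by two radars using a
# simple distance gate; positions and gate size are assumptions.
import numpy as np

GATE_M = 150.0                                        # association gate (assumed)
arpa = np.array([[1000.0, 2000.0], [5000.0, 800.0]])  # x, y in metres (placeholder)
scanter = np.array([[1040.0, 1980.0], [9000.0, 3000.0]])

fused, used = [], set()
for p in arpa:
    d = np.linalg.norm(scanter - p, axis=1)
    j = int(np.argmin(d))
    if d[j] < GATE_M and j not in used:
        fused.append((p + scanter[j]) / 2.0)          # average the duplicated plots
        used.add(j)
    else:
        fused.append(p)
fused += [q for j, q in enumerate(scanter) if j not in used]
print(np.round(np.array(fused), 1))
```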
EN
The paper presents the results of original research on the application of a neural network using deep learning techniques to the task of identity recognition on the basis of facial images acquired in both the visible and thermal radiation ranges. In the research, a database containing images acquired under varying but controlled conditions was used. On the basis of the obtained results, it can be concluded that both investigated spectral ranges provide distinctive and mutually complementary details about the identity of the examined person.
EN
Earth’s atmosphere is monitored by a multitude of sensors. It is the troposphere that is of crucial importance for human activity, as it is there that the weather phenomena take place. Weather observations are performed by surface sensors monitoring, inter alia, humidity, temperature and winds. In order to observe the developments taking place in the atmosphere, especially in the clouds, weather radars are commonly used. They monitor severe weather that is associated with storm clouds, cumulonimbuses, which create precipitation visible on radar screens. Therefore, radar images can be utilized to track storm clouds in a data fusion system. In this paper an algorithm is developed for the extraction of blobs (interesting areas in radar imagery) used within data fusion systems to track storm cells. The algorithm has been tested with the use of real data sourced from a weather radar network. 100% of convection cells were detected, with 90% of them being actual thunderstorms.
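A minimal sketch of blob extraction by thresholding and connected-component labelling on a synthetic reflectivity image; the 40 dBZ threshold and image values are assumptions, not the paper's settings:

```python
# Sketch: extract blobs (candidate storm cells) from a radar reflectivity
# field by thresholding and connected-component labelling.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
refl = rng.uniform(0, 30, (100, 100))                 # background reflectivity (dBZ)
refl[20:30, 40:55] = 48.0                             # synthetic convective cell
refl[70:78, 10:18] = 52.0

mask = refl >= 40.0                                   # storm-cell threshold (assumed)
labels, n_blobs = ndimage.label(mask)
centroids = ndimage.center_of_mass(mask, labels, range(1, n_blobs + 1))
sizes = ndimage.sum(mask, labels, range(1, n_blobs + 1))
print(n_blobs, [tuple(round(c, 1) for c in xy) for xy in centroids], sizes)
```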
17. IoT sensing networks for gait velocity measurement
EN
Gait velocity has been considered the sixth vital sign. It can be used not only to estimate the survival rate of the elderly, but also to predict the tendency to fall. Unfortunately, gait velocity is usually measured on a specially designed walk path, which has to be done at clinics or health institutes. Wearable tracking services using an accelerometer or an inertial measurement unit can measure the velocity for a certain time interval, but not continuously, due to the lack of a sustainable energy source. To tackle the shortcomings of wearable sensors, this work develops a framework to measure gait velocity using distributed tracking services deployed indoors. Two major challenges are tackled in this paper. The first is to minimize the sensing errors caused by thermal noise and overlapping sensing regions. The second is to minimize the data volume to be stored or transmitted. Given the numerous errors caused by remote sensing, the framework takes into account the temporal and spatial relationships among tracking services to calibrate the services systematically. Consequently, gait velocity can be measured without wearable sensors and with higher accuracy. The developed method is built on top of WuKong, an intelligent IoT middleware, to enable location- and temporal-aware data collection. In this work, we present an iterative method to reduce the data volume collected by thermal sensors. The evaluation results show that the file size is up to 25% of that of the JPEG format when the RMSE is limited to 0.5°.
EN
A prominent characteristic of clinical data is their heterogeneity—such data include structured examination records and laboratory results, unstructured clinical notes, raw and tagged images, and genomic data. This heterogeneity poses a formidable challenge when constructing diagnostic and therapeutic decision models, which are currently based on single modalities and are not able to use data in different formats and structures. This limitation may be addressed using data fusion methods. In this paper, we describe a case study in which we aimed to develop data fusion models resulting in various therapeutic decision models for predicting the type of treatment (surgical vs. non-surgical) for patients with bone fractures. We considered six different approaches to integrating clinical data: one fusion model based on combination of data (COD) and five models based on combination of interpretation (COI). Experimental results showed that the decision models constructed following the COI fusion approach are more accurate than the decision model employing COD. Moreover, statistical analysis using the one-way ANOVA test revealed that the constructed decision models formed two groups, each containing three different models. The results highlighted that the behavior of models within a group can be similar, although it may vary between groups.
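A minimal sketch contrasting combination of data (COD) with one combination-of-interpretation (COI) variant on synthetic two-modality data; the clinical modalities and model choices of the paper are not reproduced:

```python
# Sketch: COD (feature concatenation, one model) vs. COI (one model per
# modality, predictions combined) on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(8)
n = 400
y = rng.integers(0, 2, n)                              # surgical vs. non-surgical (synthetic)
X1 = y[:, None] + rng.normal(0, 1.0, (n, 5))           # modality 1 (e.g. structured records)
X2 = y[:, None] + rng.normal(0, 1.2, (n, 8))           # modality 2 (e.g. image-derived features)

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# COD: concatenate raw features, train a single model.
cod = LogisticRegression(max_iter=1000).fit(np.hstack([X1, X2])[idx_tr], y[idx_tr])
acc_cod = accuracy_score(y[idx_te], cod.predict(np.hstack([X1, X2])[idx_te]))

# COI: train one model per modality, then average their predicted probabilities.
m1 = LogisticRegression(max_iter=1000).fit(X1[idx_tr], y[idx_tr])
m2 = LogisticRegression(max_iter=1000).fit(X2[idx_tr], y[idx_tr])
p = (m1.predict_proba(X1[idx_te])[:, 1] + m2.predict_proba(X2[idx_te])[:, 1]) / 2
acc_coi = accuracy_score(y[idx_te], p > 0.5)
print(round(acc_cod, 3), round(acc_coi, 3))
```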
EN
Land use/land cover (LULC) maps are important datasets in various environmental projects. Our aim was to demonstrate how the GEOBIA framework can be used for integrating different data sources and classification methods in the context of LULC mapping. We present a multi-stage, semi-automated GEOBIA classification workflow created for LULC mapping of the Tuszyma Forestry Management area, based on multi-source, multi-temporal and multi-resolution input data, such as a 4-band aerial orthophoto, a LiDAR-derived nDSM, Sentinel-2 multispectral satellite images and ancillary vector data. Various classification methods were applied, i.e. rule-based and Random Forest supervised classification. This approach allowed us to focus on the classification of each class individually, taking advantage of all useful information from the various input data, expert knowledge, and advanced machine-learning tools. In the first step, twelve classes were assigned in a two-step rule-based classification approach that was either vector-based, ortho- and vector-based, or ortho- and LiDAR-based. Then, supervised classification was performed with the Random Forest algorithm. Three agriculture-related LULC classes with alternating vegetation conditions were assigned based on the aerial orthophoto and Sentinel-2 information. For the classification of 15 LULC classes we obtained an overall accuracy of 81.3% and a kappa coefficient of 0.78. The visual evaluation and class coverage comparison showed that the generated LULC layer differs from the existing land cover maps, especially in the relative cover of agriculture-related classes. Generally, the created map can be considered superior to the existing data in terms of the level of detail and correspondence to the actual environmental and vegetation conditions that can be observed in RS images.
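A minimal sketch of the supervised step, assuming a hypothetical table of per-object features; it trains a Random Forest and reports overall accuracy and Cohen's kappa, the two measures quoted above:

```python
# Sketch: Random Forest classification of image objects with overall
# accuracy and Cohen's kappa; features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(9)
n_objects, n_features, n_classes = 2000, 12, 15
X = rng.normal(0, 1, (n_objects, n_features))          # per-object features (placeholder)
y = rng.integers(0, n_classes, n_objects)              # LULC class labels (placeholder)
X += y[:, None] * 0.3                                  # make classes partly separable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
y_hat = rf.predict(X_te)
print("OA:", round(accuracy_score(y_te, y_hat), 3),
      "kappa:", round(cohen_kappa_score(y_te, y_hat), 3))
```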
20. Fast multispectral deep fusion networks
EN
Most current state-of-the-art computer vision algorithms use images captured by cameras operating in the visible spectral range as input data. Thus, image recognition systems built on top of those algorithms cannot provide acceptable recognition quality in poor lighting conditions, e.g. during nighttime. Another significant limitation of such systems is their high demand for computational resources, which makes them impossible to use on low-powered embedded systems without GPU support. This work attempts to create an algorithm for pattern recognition that consolidates data from the visible and infrared spectral ranges and allows near real-time performance on embedded systems with infrared and visible sensors. First, we analyze existing methods of combining data from different spectral ranges for the object detection task. Based on the analysis, an architecture of a deep convolutional neural network is proposed for the fusion of multi-spectral data. This architecture is based on the single-shot multibox detection (SSD) algorithm. A comparative analysis of the proposed architecture with previously proposed solutions for the multi-spectral object detection task shows comparable or better detection accuracy and a significant improvement in running time on embedded systems. This study was conducted in collaboration with Philips Lighting Research Lab, and solutions based on the proposed architecture will be used in image recognition systems for the next generation of intelligent lighting systems. Thus, the main scientific outcomes of this work include an algorithm for multi-spectral pattern recognition based on convolutional neural networks, as well as a modification of detection algorithms for working on embedded systems.
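A minimal sketch of the fusion idea, showing a small two-branch convolutional block that concatenates visible and infrared feature maps before a shared head; layer sizes are illustrative and this is not the SSD-based architecture of the paper:

```python
# Sketch: two-branch feature-level fusion of RGB and IR inputs in PyTorch.
import torch
import torch.nn as nn

class MultispectralFusion(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.rgb_branch = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                                        nn.MaxPool2d(2))
        self.ir_branch = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                                       nn.MaxPool2d(2))
        self.head = nn.Sequential(nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                                  nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(32, n_classes))

    def forward(self, rgb, ir):
        # Concatenate the per-modality feature maps along the channel axis.
        fused = torch.cat([self.rgb_branch(rgb), self.ir_branch(ir)], dim=1)
        return self.head(fused)

model = MultispectralFusion()
logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 1, 64, 64))
print(logits.shape)                                   # torch.Size([2, 2])
```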