Results found: 175

Search results
Search query:
keywords: data analysis
1
EN
The aim of this paper is to develop a container ship traffic model for port simulation studies. Such a model is essential for terminal design analyses and for testing the performance of optimization algorithms. Studies of this kind require accurate information about the ship stream in order to build test scenarios and benchmark instances. A statistical model of ship traffic is developed on the basis of container ship arrivals in eight world ports. The model provides three parameters of the arriving ships: ship size, arrival time and service time. The stream of ships is divided into classes according to vessel size. For each class, service time distributions and mixtures of return time distributions are provided. A model of aperiodic arrivals is also proposed. Moreover, the results achieved are used to compare port-specific features.
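A class-based ship stream of the kind described can be sketched as a small generator. This is a toy counterpart only: the class names, probabilities and exponential distributions below are hypothetical illustrations, not the paper's fitted parameters.

```python
import random

# Hypothetical parameters: class probability, mean interarrival time and
# mean service time (hours) per ship-size class; not the paper's values.
CLASSES = {
    "feeder":       {"p": 0.5, "mean_interarrival": 6.0,  "mean_service": 10.0},
    "panamax":      {"p": 0.3, "mean_interarrival": 12.0, "mean_service": 18.0},
    "post-panamax": {"p": 0.2, "mean_interarrival": 24.0, "mean_service": 30.0},
}

def generate_ship_stream(n_ships, seed=0):
    """Generate (ship_class, arrival_time, service_time) tuples.

    Ship class is drawn from a categorical distribution; interarrival
    and service times are drawn from per-class exponentials - a
    simplified stand-in for the mixtures used in the paper.
    """
    rng = random.Random(seed)
    names = list(CLASSES)
    weights = [CLASSES[c]["p"] for c in names]
    t = 0.0
    stream = []
    for _ in range(n_ships):
        cls = rng.choices(names, weights=weights)[0]
        t += rng.expovariate(1.0 / CLASSES[cls]["mean_interarrival"])
        service = rng.expovariate(1.0 / CLASSES[cls]["mean_service"])
        stream.append((cls, t, service))
    return stream

stream = generate_ship_stream(100)
```

A stream like this can serve as a benchmark instance generator for berth allocation or terminal simulation experiments.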
EN
In this paper, we make an attempt to use the QHY174M-GPS camera for photometric studies of fast-rotating artificial objects, including debris, satellites and rocket bodies. This device is well suited to imaging occultations, eclipses, meteors and similar events thanks to the highly precise, GPS-based recording of the time and location of the observation on every frame, and to the fast readout of its CMOS detector. The precision of time registration by the QHY174M-GPS camera is at the level of microseconds. All light curves of artificial satellites presented in this work were obtained with the studied camera at the Derenivka Observatory of Uzhhorod National University, Ukraine. The photometric system built around the QHY174M-GPS camera as the detector and a reflector telescope with parameters D=120 mm, F=114 mm, FOV=2.82°×1.76° was calibrated. SharpCap software was used for target observations. For photometric processing, the ccd_phot software was developed in the Python 3.8 programming language using the astropy and photutils packages. Photometric observations of artificial Earth satellites and standard stars were carried out, and over 80 light curves of artificial satellites were obtained. Comparing synchronous observations from two sites separated by 15 km, we conclude that photometry with the QHY174M-GPS camera yields the same light-curve shape while offering additional advantages, such as precise exposure timing and simplicity of use.
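The core step of such a light-curve pipeline is aperture photometry on each frame. The ccd_phot package itself builds on astropy and photutils; the sketch below is a simplified numpy stand-in (no background subtraction, synthetic frame, hypothetical function names) that shows the idea.

```python
import numpy as np

def aperture_flux(image, x0, y0, radius):
    """Sum pixel values inside a circular aperture centred at (x0, y0).

    A simplified stand-in for the aperture photometry that packages
    such as photutils perform; background subtraction is omitted.
    """
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return float(image[mask].sum())

def instrumental_magnitude(flux, exposure_s):
    """Convert a flux to an instrumental magnitude, normalized by exposure."""
    return -2.5 * np.log10(flux / exposure_s)

# Synthetic frame: flat background with one bright "satellite" blob.
frame = np.ones((64, 64))
frame[30:33, 30:33] += 100.0

flux = aperture_flux(frame, 31, 31, 5)
mag = instrumental_magnitude(flux, exposure_s=0.01)
```

Repeating this per frame, with the GPS time stamp attached to each measurement, produces the light curve of the rotating object.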
EN
Cybersecurity has benefitted from Artificial Intelligence (AI) technologies for attack detection. However, recent advances in AI techniques, in tandem with their misuse, have outpaced parallel advancements in cyberattack classification methods achieved through academic and industry-led efforts. We describe the shift in the evolution of AI techniques, and we show how recent AI approaches can effectively help an adversary attain their cyberattack objectives. We also discuss how the current architecture of computer communications enables the development of AI-based adversarial threats against heterogeneous computing platforms and infrastructures.
EN
Nowadays, data analysis of any kind is becoming something of an art. The same is true of the teaching of statistics, econometrics and related courses. This is not only because we are forced to teach online or in a hybrid mode. Students expect more than the theoretical part of a course and a few practical examples solved together with the instructor: they want to see a variety of tools, tutorials, interactive laboratory instructions, books and online exams. In this study I present the latest technical solutions for instructors using R and/or Python in their online data analysis labs.
PL
Nowadays, data analysis in any environment is usually a small work of art in itself. The same applies to the teaching process in this field (e.g. statistics, econometrics, etc.). The reason is not only that teaching relies on e-learning, distance learning, or online tools that merely supplement traditional instruction. The complexity of the problems and of their technical and programming solutions poses a considerable challenge. Hence students' expectations are higher than before, and a traditional theoretical introduction to analytical problems (the lecture) together with solving them (the laboratory or classes) is no longer sufficient. In laboratory work, the instructor should use a range of interactive tools, i.e. tasks, tutorials, instructions and online exams. This paper presents the most popular and most recent technical e-learning solutions for data analysis using R (and/or Python).
5
The impact of bearing condition on the maintenance of production processes
PL
Contemporary aspects of Industry 4.0 focus on data analysis, but also on making appropriate business production decisions and on areas related to predicting failures of those machine components that determine the continuity of production processes.
EN
Assessment of the seismic vulnerability of urban infrastructure is a pressing problem, since the damage caused by earthquakes is quite significant. Despite the complexity of such tasks, today’s machine learning methods allow “fast” methods of assessing seismic vulnerability to be used. The article proposes a methodology for assessing the characteristics of typical urban objects that affect their seismic resistance, using classification and clustering methods. For the analysis, we use the kmeans and hkmeans clustering methods, with the Euclidean distance as the measure of proximity. The optimal number of clusters is determined using the elbow method. A decision-making model on the seismic resistance of an urban object is presented, and the most important variables, which have the greatest impact on the seismic resistance of an urban object, are identified. The study shows that the clustering results coincide with expert estimates and that the characteristics of typical urban objects can be determined by modelling the data with clustering algorithms.
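The k-means/elbow workflow mentioned above can be sketched in a few lines. This is a minimal illustration with synthetic data, not the article's analysis: a plain Lloyd's algorithm (with farthest-point initialization for stability) and the inertia curve that the elbow method inspects.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm with farthest-point initialization.

    Returns (centroids, labels, inertia); inertia is the within-cluster
    sum of squared distances that the elbow method inspects.
    """
    rng = np.random.default_rng(seed)
    # Farthest-point init: start from a random point, then repeatedly
    # add the point farthest from all centroids chosen so far.
    centroids = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d = ((X[:, None, :] - np.array(centroids)[None, :, :]) ** 2).sum(2).min(1)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids, dtype=float)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(2)
        labels = d.argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(0)
    labels = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(2).argmin(1)
    inertia = float(((X - centroids[labels]) ** 2).sum())
    return centroids, labels, inertia

# Three synthetic, well-separated clusters of "urban object" features.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(c, 0.3, size=(40, 2)) for c in (0.0, 5.0, 10.0)])

# Elbow method: inertia stops dropping sharply at the true cluster count.
inertias = {k: kmeans(X, k)[2] for k in range(1, 7)}
```

Plotting inertia against k shows a sharp "elbow" at k = 3, the true number of clusters in this synthetic example.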
7
Using TeX Markup Language for 3D and 2D Geological Plotting
EN
The paper presents a technical application of TeX, a high-level descriptive markup language, to processing a geological dataset from a soil laboratory. The geotechnical measurements included equivalent soil cohesion, the absolute deformation index, the soil compressibility coefficient by time of immersion depth, exposure time to compressive strength of the samples, and physical and mechanical properties (humidity, density). The dataset was obtained from laboratory-based experimental tests of the physical and mechanical properties of soils. The data were converted to a CSV table and processed with LaTeX. The methodology is based on the LaTeX packages {tikz}, {tikz-3dplot}, {tikzpicture}, {pgfplots}, {filecontents} and {spy} for 3D plotting showing correlations between variables and for descriptive statistical analysis based on processing the data array. The results demonstrate LaTeX scripts and graphics: 2D and 3D scatterplots, ternary diagrams, bar charts, boxplots, zooming techniques detailing a fragment of a plot, and a flowchart. The research novelty consists in the technical approach of applying the TeX language to geological data processing and graphical visualization. The engineering graphics produced with TeX are demonstrated together with screenshots of the code used for plotting.
EN
The aim of the thesis is to create a model defining the style of play of a team playing in the Polish Ekstraklasa. The restriction to the highest Polish league is dictated by differences in playing style between leagues. The model is to be created on the basis of data about the team's play. To build the model, supervised and unsupervised learning techniques will be used and compared in order to find the relationship between a team's statistics and its playing style.
EN
For many years, satellite systems have seen widespread use in a variety of technical applications, as well as in operations related to setting-out and the exploitation of track infrastructure. Their main applications include inventories of the trackage course and detecting changes in its position. In both of these tasks, the most important element determining the quality of the analyses is the high accuracy of the determinations being carried out. Satellite surveying techniques are not always sufficiently accurate, and in such cases it is necessary to employ other land surveying methods to process the surveying data. This article presents the authors’ considerations regarding the possibility of applying one of the most common land surveying adjustment methods, the parametric method, to operations related to an inventory of the tram infrastructure in Gdańsk. The results are based on surveys carried out during a surveying campaign in the autumn of 2018. The considerations presented in the article concern a small part of the research conducted under project No. POIR.04.01.01-00-0017/17, entitled “Development of an innovative method for determining the precise trajectory of a railborne vehicle”, which is being implemented by a consortium of Gdansk University of Technology and Gdynia Maritime University.
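The parametric adjustment named above amounts to solving the weighted least-squares normal equations x = (AᵀPA)⁻¹AᵀPl. The sketch below illustrates the computation on hypothetical data (a straight track axis fitted to noisy survey points); it is not the article's dataset or software.

```python
import numpy as np

def parametric_adjustment(A, l, P):
    """Weighted least-squares (parametric) adjustment.

    Solves the normal equations (A^T P A) x = A^T P l and returns the
    estimated parameters x and the residuals v = A x - l.
    """
    N = A.T @ P @ A
    x = np.linalg.solve(N, A.T @ P @ l)
    v = A @ x - l
    return x, v

# Hypothetical example: fit a straight track axis y = a*s + b to noisy
# survey points; the weights reflect each point's stated accuracy.
s = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
y_obs = np.array([0.02, 1.03, 1.98, 3.01, 4.00])   # true line: y = 0.1*s
A = np.column_stack([s, np.ones_like(s)])
P = np.diag([1.0, 1.0, 1.0, 4.0, 4.0])             # last points measured more precisely
x, v = parametric_adjustment(A, y_obs, P)
a, b = x
```

The residual vector v is what reveals points whose position has changed between surveying campaigns.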
EN
The article presents methods of mapping the data, information and knowledge value stream in order to identify the flow of key processes and to trace knowledge exchange in the examined company. One large production company was audited. The amount of information entering the company and being processed there is so large that methods improving its flow had to be implemented. Problems with knowledge exchange, associated with high staff turnover, motivated the analysis of the problem.
EN
In the first article in this series, the research methodology for analysing inspection errors on the basis of an MSA attribute study data set for improvement purposes was presented. In this final article in the series, the methodology is applied in practice step by step. Instructions for the correct performance of the analysis, in compliance with the author's procedure, are given. Both the advantages and the disadvantages of the developed approach are highlighted.
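An MSA attribute study compares appraiser accept/reject decisions against a reference judgement. The sketch below computes the standard attribute-study metrics (effectiveness, miss rate, false-alarm rate) on hypothetical data; it is an illustration of the metric definitions, not the author's procedure.

```python
def attribute_msa_scores(reference, decisions):
    """Effectiveness, miss rate and false-alarm rate for one appraiser.

    reference: true part states (True = good part)
    decisions: appraiser decisions (True = accept)
    Standard MSA attribute-study metrics; the example data below are
    hypothetical.
    """
    pairs = list(zip(reference, decisions))
    correct = sum(r == d for r, d in pairs)
    bad = [(r, d) for r, d in pairs if not r]
    good = [(r, d) for r, d in pairs if r]
    miss = sum(d for r, d in bad) / len(bad)               # bad parts accepted
    false_alarm = sum(not d for r, d in good) / len(good)  # good parts rejected
    return correct / len(pairs), miss, false_alarm

reference = [True, True, True, False, False, True, False, True]
decisions = [True, True, False, False, True, True, False, True]
eff, miss, fa = attribute_msa_scores(reference, decisions)
```

Low effectiveness or a high miss rate flags the inspection step, rather than the process, as the improvement target.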
12
Digital assets for project-based studies and data-driven project management
EN
Projects offer learning opportunities and digital data that can be analyzed through a multitude of theoretical lenses. They are key vehicles for economic and social action, and they are also a primary source of innovation, research, and organizational change. This research involves a survey of digital assets available through a project; specifically, it identifies sources of data that can be used for practicing data-driven, context-specific project management, or for project-based academic research. It identifies four categories of data sources - communications, reports/records, model representations, and computer systems - and 48 digital assets. The digital assets can serve as inputs for creating project artifacts and as sources for monitoring and controlling project activities and for sense-making in retrospectives or lessons learned. Moreover, this categorization is useful for developing decision support and artificial intelligence system models that require real-world data.
EN
The complexity of managing the capacities of large IT infrastructures is constantly increasing as more network devices are connected. This task can no longer be performed manually, so the system must be monitored at runtime and estimates of future conditions must be made automatically. However, since a single forecasting method typically performs poorly, this paper presents a framework for forecasting univariate network device workload traces using multiple forecasting methods. First, the time series are preprocessed by imputing missing data and removing anomalies. Then, different features are derived from the univariate time series, depending on the type of forecasting method. In addition, a recommendation approach is proposed for selecting, for each time series, the most suitable forecasting method from this set of algorithms based only on its historical values. For this purpose, the performance of the forecasting methods is approximated using the historical data of the respective time series under consideration. The framework was used in the FedCSIS 2020 Challenge and shows good forecasting quality, with an average R² score of 0.2575 on the small test data set.
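The recommendation idea (approximate each method's future performance from the series' own history) can be sketched with a historical holdout. The three candidate forecasters below (naive, mean, drift) are simple textbook baselines chosen for illustration; they are not the paper's method set.

```python
import numpy as np

def naive(history, h):
    """Repeat the last observed value."""
    return np.full(h, history[-1])

def mean_forecast(history, h):
    """Forecast the historical mean."""
    return np.full(h, history.mean())

def drift(history, h):
    """Extrapolate the average historical step."""
    step = (history[-1] - history[0]) / (len(history) - 1)
    return history[-1] + step * np.arange(1, h + 1)

def recommend(series, methods, h):
    """Pick the method with the lowest MSE on a historical holdout.

    Approximates each method's future performance using only the past
    values of the series, as in the recommendation approach described.
    """
    train, holdout = series[:-h], series[-h:]
    errors = {name: ((f(train, h) - holdout) ** 2).mean()
              for name, f in methods.items()}
    return min(errors, key=errors.get)

methods = {"naive": naive, "mean": mean_forecast, "drift": drift}
trend = np.arange(50, dtype=float)   # steadily growing workload trace
best_trend = recommend(trend, methods, h=5)
```

For a linearly growing workload the drift method wins the holdout comparison, so it would be recommended for forecasting that device's future load.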
PL
One of the methods of maintaining a controlled, high level of serviceability of mining machines is their continuous monitoring. Measuring sensors and data-transmission devices are installed on selected key groups of machines. The data read out are loaded into a data warehouse and interpreted with the available software. The volume of these data, together with the constantly growing knowledge of machine operation, is shifting interpretation methods towards multidimensional analysis of production data. The article discusses the logical schemes of data collection and the expected development trends in interpreting the obtained information with modern IT tools.
EN
One of the methods of maintaining the high efficiency of mining machinery is monitoring. Measuring sensors are installed on selected groups of machines. The data read out are entered into a data warehouse, and the available software enables their correct interpretation. The article also discusses issues related to the proper collection of data.
PL
The roll-out of smart grid and smart metering technologies is a necessity of the modern electricity market. Smart electricity meters are gradually being installed at residential and business customers. This article presents a sample solution in which a beacon-type device reads measurement data from electronic meters equipped with an optical port - not necessarily smart ones. Continuous monitoring of the meter state gives both the consumer and the supplier the possibility of analysing loads and tracking the consumer's energy usage. Using test installations at two households, the article shows that even very simple analyses of energy consumption can provide useful information.
EN
The roll-out of smart grids and smart metering technologies is a must for today's electricity markets. Smart meters are gradually being installed at households and business consumers. This article presents a sample solution in which a beacon device reads energy consumption data through the optical ports of electronic meters, even non-smart ones. Near real-time read-out of an electricity meter's counter allows both the consumer and the provider to analyse the power load and energy consumption of the end user. Using an example of test installations at two households, the article shows that even simple analyses of energy consumption data can provide valuable information.
PL
Modern power systems very often use hardware and software solutions taken from ICT systems that have been in service for many years. These include, among others, metering and billing systems that acquire data directly from power substation devices, digital control systems, and systems supporting data processing. The ICT infrastructure is thus a communication platform for all energy market participants and, at the same time, for all levels of the technical infrastructure. Given such strong integration of ICT systems with power systems, it should be noted that as the complexity of smart power systems grows, so does the risk of a system failure or cyberattack, which in the case of such a complex and distributed system can have disastrous consequences. Consequently, protecting data and access to one's own infrastructure becomes indispensable and constitutes a large and often difficult task [1].
EN
Modern power systems often apply hardware and software solutions that have been used for many years in Information and Communication Technologies (ICT), among others: metering and billing systems that collect data directly from energy substation devices, digital control systems, and systems supporting data processing. ICT systems are communication platforms for all energy market entities and, at the same time, for all layers of the technical infrastructure. Bearing in mind such strong integration of ICT systems with power systems, one should note that as the complexity of Smart Grids increases, so does the risk of a cyberattack, which can have severe effects in the case of such complex and distributed systems. Therefore, securing data and access to one's own infrastructure is becoming ever more necessary and is often a hard task to realize [1].
EN
The study area is focused on the Kuril–Kamchatka Trench, North Pacific Ocean. This region is geologically complex, notable for lithosphere activity, tectonic plate subduction and active volcanism. The submarine geomorphology is complicated by terraces, slopes, seamounts and erosional processes. Understanding the geomorphic features of such a region requires precise modelling and effective visualization of high-resolution data sets. Therefore, the current research presents a Generic Mapping Tools (GMT) based algorithm offering a solution for effective data processing and precise mapping: iterative, module-based scripting for automated digitizing and modelling. The methodology consists of the following steps: topographic mapping of the raster grids, marine gravity and geoid; semi-automatic digitizing of orthogonal cross-section profiles; modelling geomorphic trends of the gradient slopes; and computing raster surfaces from the xyz data sets with the nearneighbor and xyz2grd modules. Several types of cartographic projections were used: oblique Mercator, Mercator cylindrical, conic equal-area Albers, and conic equidistant. The cross-section geomorphic profiles in a perpendicular direction across the two selected segments of the trench were automatically digitized. The developed algorithm for semi-automated digitizing of the profiles made it possible to visualize the gradients of the slope steepness of the trench. The data were then modelled to show gradient variations in its two segments. The results of the comparative geomorphic analysis of the northern and southern transects revealed variations between different parts of the trench. The presented research provides more quantitative insights into the structure and setting of the submarine landforms of the hadal trench, which still remains an open question for marine geology. The research demonstrated the effectiveness of GMT: a variety of modules, approaches and tools that can be used to produce high-quality mapping and graphics. The GMT listings are provided for repeatability.
PL
An analysis of the application of Industry 4.0 elements in modelling innovation processes in the chemical industry is presented. The analysis identified areas which, in innovation processes, enable the application of Industry 4.0 solutions but also require additional measures, e.g. controls, at an early stage of their implementation. The analysis also indicated both the expected and the initially achieved effects of applying Industry 4.0 solutions in the surveyed enterprises.
EN
The readiness of Polish chemical enterprises to implement the elements of Industry 4.0 in their industrial practice was tested by a poll. Eleven representatives of 6 enterprises answered the inquiry. The analysis of operational effects was found to be the most important tool for increasing production efficiency. An improvement in resource utilization and a reduction of production costs were the most commonly expected results of the Industry 4.0 implementation.
20
Basic software for network analysis
PL
Traditional data analysis focuses on the attributes of an actor. However, since an actor's behaviour is not always independent, it is necessary to observe actors from the perspective of their relationships within a network. There is a traditional Korean saying, recorded in the book Myungshimbogam, which in English reads: "Show me your friends, and I will tell you who you are." We quote these words to shift the point of view from the attributes of a single actor to the network. Big Data analysis from this perspective can be carried out with network analysis programs available on the market. We discuss several of the most popular ones, including UCINET (University of California Irvine NETwork), NetMiner, R, Gephi and NodeXL. UCINET and NetMiner are comprehensive programs in which various network analysis techniques can be applied. NodeXL and R are statistical computing software, while Gephi is used mainly for visualization. Basic analysis and visualization of data in NodeXL can be performed by entering network data into an Excel template.
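The shift from actor attributes to network relations can be illustrated with degree centrality, the simplest measure of an actor's position that tools such as UCINET or Gephi report. The sketch below is plain Python (not one of the listed packages); the actors and ties are hypothetical.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality for an undirected network.

    centrality(v) = deg(v) / (n - 1), where n is the number of actors;
    the basic measure reported by network analysis software.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# Hypothetical friendship network: "Show me your friends..."
edges = [("Ana", "Bo"), ("Ana", "Cy"), ("Ana", "Dee"), ("Bo", "Cy")]
centrality = degree_centrality(edges)
```

Here the actor with the most ties stands out immediately, information that no analysis of that actor's individual attributes alone could reveal.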