Title variants
Publication languages
Abstracts
Graphics processing units (GPUs) have become the foundation of artificial intelligence. Before their adoption, machine learning was slow, inaccurate, and inadequate for many of today's applications. The use of GPUs made a remarkable difference for large neural networks: the many cores of a GPU allow machine learning engineers to train complex models on large datasets relatively quickly. The ability to perform many computations in parallel is what makes GPUs so effective; with such a processor, a model can make statistical predictions about very large amounts of data. GPUs are widely used in machine learning because they offer more computational throughput than CPUs. In this paper, we show the use of a GPU for solving a scheduling problem. The results show that this approach is useful, especially for large optimization problems.
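The abstract describes offloading a scheduling problem to a GPU but gives no algorithmic detail. As an illustration of the problem family only (this is not the paper's method), the sketch below implements Graham's classic list-scheduling heuristic from reference 2 of the bibliography, which greedily assigns each task to the currently least-loaded machine; all names are illustrative.

```python
import heapq

def list_schedule(durations, machines):
    """Graham's list scheduling: put each task on the least-loaded machine.

    Returns (assignment, makespan); the makespan is provably within a
    factor (2 - 1/m) of the optimum (Graham, 1966).
    """
    heap = [(0.0, m) for m in range(machines)]   # (current load, machine id)
    assignment = []
    for d in durations:
        load, m = heapq.heappop(heap)            # least-loaded machine so far
        assignment.append(m)
        heapq.heappush(heap, (load + d, m))
    makespan = max(load for load, _ in heap)     # busiest machine's finish time
    return assignment, makespan

# Five tasks on two machines: the greedy choice yields makespan 7.0,
# while the optimum here is 6 ({3, 3} on one machine, {2, 2, 2} on the other).
assignment, makespan = list_schedule([3, 3, 2, 2, 2], machines=2)
```

On a GPU, one would instead evaluate many candidate assignments in parallel rather than build a single schedule sequentially; that bulk-parallel evaluation is the pattern the abstract alludes to.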
Year
Volume
Pages
81--96
Physical description
Bibliography: 17 items, figures, tables, charts
Authors
author
- University of Siedlce, Faculty of Exact and Natural Sciences, Institute of Computer Science, piotr.switalski@uws.edu.pl
author
- University of Siedlce, Faculty of Exact and Natural Sciences, Institute of Computer Science, karolina.siwiak@vp.pl
Bibliography
- 1. Drabas T.: Scikit-learn Tutorial - Beginner's Guide to GPU Accelerated ML Pipelines, available at https://developer.nvidia.com/blog/scikit-learn-tutorial-beginners-guide-to-gpu-accelerated-ml-pipelines, accessed: 01.09.2023.
- 2. Graham R. L.: Bounds for certain multiprocessing anomalies, Bell System Technical Journal 45(9), 1966, pp. 1563-1581. DOI: https://doi.org/10.1002/j.1538-7305.1966.tb01709.x.
- 3. Hearty J.: Zaawansowane uczenie maszynowe z językiem Python, Helion, 2017.
- 4. Qiu J., Wu Q., Ding G., Xu Y., Feng S.: A survey of machine learning for big data processing, EURASIP Journal on Advances in Signal Processing, 2016.
- 5. Li Y., Carabelli S., Fadda E. et al.: Machine learning and optimization for production rescheduling in Industry 4.0, The International Journal of Advanced Manufacturing Technology 110, 2020, pp. 2445-2463. DOI: https://doi.org/10.1007/s00170-020-05850-5.
- 6. Marr B.: How Much Data Do We Create Every Day? The Mind-Blowing Stats Everyone Should Read, available at https://bernardmarr.com/how-much-data-do-we-create-every-day-the-mind-blowing-stats-everyone-should-read, accessed: 01.09.2023.
- 7. Park J., Chun J., Kim S. H., Kim Y., Park J.: Learning to schedule job-shop problems: Representation and policy learning using graph neural network and reinforcement learning, International Journal of Production Research, 2021. DOI: https://doi.org/10.1080/00207543.2020.1870013.
- 8. Parmentier A., T'kindt V.: Learning to solve the single machine scheduling problem with release times and sum of completion times, arXiv, 2021.
- 9. Raschka S., Mirjalili V.: Python: uczenie maszynowe, Helion, 2019.
- 10. Yamashita R., Nishio M., Do R. K. G., Togashi K.: Convolutional neural networks: an overview and application in radiology, Insights into Imaging, Springer Nature, 2018, pp. 611-629.
- 11. Shetty C., Sarojadevi H.: Framework for Task Scheduling in Cloud using Machine Learning Techniques, Fourth International Conference on Inventive Systems and Control (ICISC), Coimbatore, India, 2020, pp. 727-731. DOI: 10.1109/ICISC47916.2020.9171141.
- 12. Weise T.: An Introduction to Optimization Algorithms, Institute of Applied Optimization (IAO), School of Artificial Intelligence and Big Data, Hefei University, Hefei, Anhui, China, 2018-2019, available at http://thomasweise.github.io/aitoa.
- 13. Zang Z., Wang W., Song Y., Lu L., Li W., Wang Y., Zhao Y.: Hybrid Deep Neural Network Scheduler for Job-Shop Problem Based on Convolution Two-Dimensional Transformation, Computational Intelligence and Neuroscience, 2019. DOI: https://doi.org/10.1155/2019/7172842.
- 14. https://www.tensorflow.org, accessed: 01.09.2023.
- 15. https://pytorch.org, accessed: 01.09.2023.
- 16. https://mxnet.apache.org, accessed: 01.09.2023.
- 17. https://developer.nvidia.com/tensorrt, accessed: 01.09.2023.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-eb1abceb-32dd-41df-ad41-957aad637383