2022 | Vol. 30 | 109--116
Article title

NiaNet: A framework for constructing Autoencoder architectures using nature-inspired algorithms

Title variants
Conference
Federated Conference on Computer Science and Information Systems (17 ; 04-07.09.2022 ; Sofia, Bulgaria)
Publication languages
EN
Abstracts
EN
Autoencoder (AE), an hourglass-shaped deep neural network capable of learning data representations in a lower dimension, has performed well in various applications. However, developing a high-quality AE system for a specific task relies heavily on human expertise, limiting its widespread application. On the other hand, there has been a gradual increase in automated machine learning for developing deep learning systems without human intervention. However, there is a shortage of methods for automatically designing particular deep neural networks such as AEs. This study presents the NiaNet method and the corresponding software framework for designing AE topology and hyper-parameter settings. Our findings show that it is possible to discover the optimal AE architecture for a specific dataset without the requirement for human expert assistance. The future potential of the proposed method is also discussed in this paper.
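The abstract describes encoding an autoencoder's topology and hyper-parameters as a solution vector that a nature-inspired algorithm searches for the lowest reconstruction error. The sketch below is only an illustration of that idea, not the NiaNet API: the genotype layout (layer count, bottleneck width, learning rate), the decoding scheme, and the training budget are assumptions, while NiaPy 2.x [25], PyTorch [31], scikit-learn [29], and the diabetes dataset [27] from the reference list are used as stand-ins.

"""Minimal sketch (NOT the NiaNet code): search autoencoder topology and
hyper-parameters with a nature-inspired algorithm (NiaPy's Differential
Evolution). Genotype->phenotype mapping below is an illustrative assumption."""
import numpy as np
import torch
from torch import nn
from sklearn.datasets import load_diabetes
from sklearn.preprocessing import StandardScaler
from niapy.problems import Problem
from niapy.task import Task
from niapy.algorithms.basic import DifferentialEvolution

# Diabetes data, standardized (the dataset referenced in [27]).
X = StandardScaler().fit_transform(load_diabetes().data).astype(np.float32)
X = torch.from_numpy(X)
n_features = X.shape[1]

def build_autoencoder(n_layers, bottleneck):
    # Hourglass topology: shrink linearly to the bottleneck, then mirror it.
    sizes = np.linspace(n_features, bottleneck, n_layers + 1, dtype=int)
    enc, dec = [], []
    for a, b in zip(sizes[:-1], sizes[1:]):
        enc += [nn.Linear(int(a), int(b)), nn.ReLU()]
    for a, b in zip(sizes[::-1][:-1], sizes[::-1][1:]):
        dec += [nn.Linear(int(a), int(b)), nn.ReLU()]
    dec[-1] = nn.Identity()  # linear output layer
    return nn.Sequential(*enc, *dec)

class AEArchitectureProblem(Problem):
    """Genotype (assumed): [n_layers in 1..3, bottleneck in 2..8, lr in 1e-4..1e-1]."""
    def __init__(self):
        super().__init__(dimension=3, lower=0.0, upper=1.0)

    def _evaluate(self, x):
        n_layers = 1 + int(x[0] * 2.999)
        bottleneck = 2 + int(x[1] * 6.999)
        lr = 10 ** (-4 + 3 * x[2])
        model = build_autoencoder(n_layers, bottleneck)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(50):  # short training budget per candidate
            opt.zero_grad()
            loss = loss_fn(model(X), X)
            loss.backward()
            opt.step()
        return float(loss.detach())  # fitness = reconstruction MSE

task = Task(problem=AEArchitectureProblem(), max_evals=60)
best_x, best_fitness = DifferentialEvolution(population_size=10).run(task)
print("best genotype:", best_x, "reconstruction MSE:", best_fitness)

Any of the other population-based algorithms listed in the references (PSO [32], jDE [35], a genetic algorithm [36]) could be swapped in for DifferentialEvolution, since they all operate on the same real-valued genotype.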
Publisher

Year
2022
Volume
Vol. 30
Pages
109--116
Physical description
Bibliography: 36 items, illustrations, tables.
Authors
  • Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia, saso.pavlic@student.um.si
  • Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia, saso.karakatic@um.si
  • Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia, iztok.fister1@um.si
Bibliography
  • 1. F. Yu, Z. Qin, C. Liu, D. Wang, and X. Chen, “REIN the RobuTS: Robust DNN-Based Image Recognition in Autonomous Driving Systems,” IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 40, no. 6, pp. 1258–1271, Jun. 2021.
  • 2. Y. Wu, M. Schuster, Z. Chen, Q. V. Le, M. Norouzi, W. Macherey, M. Krikun, Y. Cao, Q. Gao, K. Macherey, J. Klingner, A. Shah, M. Johnson, X. Liu, Ł. Kaiser, S. Gouws, Y. Kato, T. Kudo, H. Kazawa, K. Stevens, G. Kurian, N. Patil, W. Wang, C. Young, J. Smith, J. Riesa, A. Rudnick, O. Vinyals, G. Corrado, M. Hughes, and J. Dean, “Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation,” arXiv:1609.08144 [cs], Oct. 2016. [Online]. Available: http://arxiv.org/abs/1609.08144
  • 3. S. Shekhar, A. Singh, and A. K. Gupta, “A Deep Neural Network (DNN) Approach for Recommendation Systems,” in Advances in Computational Intelligence and Communication Technology, ser. Lecture Notes in Networks and Systems, X.-Z. Gao, S. Tiwari, M. C. Trivedi, P. K. Singh, and K. K. Mishra, Eds. Singapore: Springer, 2022, pp. 385–396.
  • 4. J. Jumper, R. Evans, A. Pritzel, T. Green, M. Figurnov, O. Ronneberger, K. Tunyasuvunakool, R. Bates, A. Žídek, A. Potapenko, A. Bridgland, C. Meyer, S. A. A. Kohl, A. J. Ballard, A. Cowie, B. Romera-Paredes, S. Nikolov, R. Jain, J. Adler, T. Back, S. Petersen, D. Reiman, E. Clancy, M. Zielinski, M. Steinegger, M. Pacholska, T. Berghammer, S. Bodenstein, D. Silver, O. Vinyals, A. W. Senior, K. Kavukcuoglu, P. Kohli, and D. Hassabis, “Highly accurate protein structure prediction with AlphaFold,” Nature, vol. 596, no. 7873, pp. 583–589, Aug. 2021. [Online]. Available: https://www.nature.com/articles/s41586-021-03819-2
  • 5. Z. Li, M. Pan, T. Zhang, and X. Li, “Testing DNN-based Autonomous Driving Systems under Critical Environmental Conditions,” in Proceedings of the 38th International Conference on Machine Learning. PMLR, Jul. 2021, pp. 6471–6482, ISSN: 2640-3498. [Online]. Available: https://proceedings.mlr.press/v139/li21r.html
  • 6. J. N. K. Liu, Y. Hu, Y. He, P. W. Chan, and L. Lai, “Deep Neural Network Modeling for Big Data Weather Forecasting,” in Information Granularity, Big Data, and Computational Intelligence, ser. Studies in Big Data, W. Pedrycz and S.-M. Chen, Eds. Cham: Springer International Publishing, 2015, pp. 389–408. [Online]. Available: https://doi.org/10.1007/978-3-319-08254-7_19
  • 7. P. Dhar, “The carbon impact of artificial intelligence,” Nature Machine Intelligence, vol. 2, no. 8, pp. 423–425, 2020.
  • 8. E.-G. Talbi, “Automated Design of Deep Neural Networks: A Survey and Unified Taxonomy,” ACM Computing Surveys, vol. 54, no. 2, pp. 34:1–34:37, Mar. 2021. [Online]. Available: https://doi.org/10.1145/3439730
  • 9. G. Vrbančič, I. Fister Jr., and V. Podgorelec, Designing Deep Neural Network Topologies with Population-Based Metaheuristics, Sep. 2018.
  • 10. L. Pečnik and I. Fister, “NiaAML: AutoML framework based on stochastic population-based nature-inspired algorithms,” Journal of Open Source Software, vol. 6, no. 61, p. 2949, May 2021. [Online]. Available: https://joss.theoj.org/papers/10.21105/joss.02949
  • 11. V. K. Ojha, A. Abraham, and V. Snášel, “Metaheuristic design of feedforward neural networks: A review of two decades of research,” Engineering Applications of Artificial Intelligence, vol. 60, pp. 97–116, Apr. 2017. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0952197617300234
  • 12. R. Miikkulainen, “Neuroevolution.”
  • 13. K. O. Stanley and R. Miikkulainen, “Evolving neural networks through augmenting topologies,” Evolutionary Computation, vol. 10, no. 2, pp. 99–127. [Online]. Available: https://direct.mit.edu/evco/article/10/2/99-127/1123
  • 14. A. Conradie, R. Miikkulainen, and C. Aldrich, “Intelligent process control utilising symbiotic memetic neuro-evolution,” in Proceedings of the 2002 Congress on Evolutionary Computation. CEC’02 (Cat. No.02TH8600), vol. 1, pp. 623–628 vol.1.
  • 15. A. Hara, J.-i. Kushida, K. Kitao, and T. Takahama, “Neuroevolution by particle swarm optimization with adaptive input selection for controlling platform-game agent,” in 2013 IEEE International Conference on Systems, Man, and Cybernetics, pp. 2504–2509, ISSN: 1062-922X.
  • 16. E. Galván and P. Mooney, “Neuroevolution in Deep Neural Networks: Current Trends and Future Challenges,” IEEE Transactions on Artificial Intelligence, vol. 2, no. 6, pp. 476–493, Dec. 2021.
  • 17. C. Broni-Bediako, “Automated Deep Neural Networks with Gene Expression Programming of Cellular Encoding - Towards the Applications in Remote Sensing Image Understanding-,” Mar. 2022. [Online]. Available: https://soka.repo.nii.ac.jp/index.php?active_action=repository_view_main_item_detail&page_id=13&block_id=68&item_id=40743&item_no=1
  • 18. T. Elsken, J. H. Metzen, and F. Hutter, “Neural Architecture Search: A Survey,” arXiv:1808.05377 [cs, stat], Apr. 2019. [Online]. Available: http://arxiv.org/abs/1808.05377
  • 19. X. Yao, “Evolving artificial neural networks,” Proceedings of the IEEE, vol. 87, no. 9, pp. 1423–1447, Sep. 1999.
  • 20. T. Elsken, J. H. Metzen, and F. Hutter, “Neural Architecture Search: A Survey,” arXiv:1808.05377 [cs, stat], Apr. 2019. [Online]. Available: http://arxiv.org/abs/1808.05377
  • 21. M. Scanagatta, A. Salmerón, and F. Stella, “A survey on Bayesian network structure learning from data,” Progress in Artificial Intelligence, vol. 8, no. 4, pp. 425–439, Dec. 2019. [Online]. Available: https://doi.org/10.1007/s13748-019-00194-y
  • 22. X. He, K. Zhao, and X. Chu, “AutoML: A survey of the state-of-the-art,” Knowledge-Based Systems, vol. 212, p. 106622, Jan. 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0950705120307516
  • 23. B. Evans, “Population-based Ensemble Learning with Tree Structures for Classification,” thesis, Open Access Te Herenga Waka-Victoria University of Wellington, Jan. 2019. [Online]. Available: https://openaccess.wgtn.ac.nz/articles/thesis/Population-based_Ensemble_Learning_with_Tree_Structures_for_Classification/17136296/1
  • 24. J. Meehan, N. Tatbul, C. Aslantas, and S. Zdonik, “Data ingestion for the connected world,” p. 11.
  • 25. G. Vrbančič, L. Brezočnik, U. Mlakar, D. Fister, and I. Fister, “NiaPy: Python microframework for building nature-inspired algorithms,” Journal of Open Source Software, vol. 3, no. 23, p. 613, Mar. 2018. [Online]. Available: https://joss.theoj.org/papers/10.21105/joss.00613
  • 26. C. M. Bishop, Neural Networks for Pattern Recognition. USA: Oxford University Press, Inc., 1995, p. 332.
  • 27. Diabetes data. [Online]. Available: https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html
  • 28. “NiaNet/autoencoder.py at 408b7fe0f4634439eb69e75f6b0c5afb18ce0702 · SasoPavlic/NiaNet.” [Online]. Available: https://github.com/SasoPavlic/NiaNet
  • 29. “scikit-learn: machine learning in Python — scikit-learn 1.0.2 documentation.” [Online]. Available: https://scikit-learn.org/stable/
  • 30. “NumPy.” [Online]. Available: https://numpy.org/
  • 31. PyTorch. [Online]. Available: https://www.pytorch.org
  • 32. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of ICNN’95-international conference on neural networks, vol. 4. IEEE, 1995, pp. 1942–1948.
  • 33. R. Storn and K. Price, “Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces,” Journal of global optimization, vol. 11, no. 4, pp. 341–359, 1997.
  • 34. X.-S. Yang, Nature-inspired metaheuristic algorithms. Luniver press, 2010.
  • 35. J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, “Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems,” IEEE transactions on evolutionary computation, vol. 10, no. 6, pp. 646–657, 2006.
  • 36. J. H. Holland, Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence. MIT press, 1992.
Notes
1. Track 1: 17th International Symposium on Advanced Artificial Intelligence in Applications
2. Record developed with funds of the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2022-2023).
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-0a1e5716-1991-432e-b5c5-dcea336682be