Article title

Distributed accelerated projection-based consensus decomposition

Authors
Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
With the development of machine learning and Big Data, linear and non-linear optimization techniques are becoming increasingly valuable to many quantitative disciplines. Problems of this nature are typically solved with dedicated optimization algorithms, iterative methods, or heuristics. A new variant of the Accelerated Projection-Based Consensus (APC) iterative method is proposed that is faster than its classical version when handling large sparse matrices in distributed settings. The algorithm is described and its implementation in a high-level programming language is presented. Convergence tests measuring acceleration factors on real-world datasets were performed, and their results are promising. The results of this research can serve as an alternative approach to solving numerical optimization problems.
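The record does not reproduce the algorithm itself, but the classical APC iteration it builds on (Azizan-Ruhi et al., reference [7] below) can be sketched to show the idea: each of m machines holds a block of rows of a consistent system Ax = b, keeps its iterate on its own local solution set, and a master node averages the local iterates with a momentum term. The sketch below is an illustrative serial simulation, not the paper's implementation: the function name `apc_solve`, the fixed step sizes `gamma` and `eta` (which the original paper tunes), and the dense pseudoinverse-based projectors (the paper targets large sparse matrices) are all assumptions made for clarity.

```python
import numpy as np

def apc_solve(A, b, m=2, gamma=1.0, eta=1.0, iters=2000):
    """Serial simulation of projection-based consensus for a consistent A x = b.

    The rows of (A, b) are split among m virtual machines. Machine i keeps an
    iterate x_i on its local solution set {x : A_i x = b_i}; each round it moves
    toward the consensus estimate along the null space of A_i (so it stays
    feasible), and the master mixes the new average with the old one (eta = 1
    reduces to plain averaging).
    """
    n = A.shape[1]
    row_blocks = np.array_split(np.arange(A.shape[0]), m)
    X, P = [], []
    for idx in row_blocks:
        Ai, bi = A[idx], b[idx]
        X.append(np.linalg.pinv(Ai) @ bi)              # x_i(0): a local solution
        P.append(np.eye(n) - np.linalg.pinv(Ai) @ Ai)  # projector onto null(A_i)
    xbar = np.mean(X, axis=0)
    for _ in range(iters):
        # Local step: move toward consensus without leaving the local solution set.
        X = [x + gamma * (Pi @ (xbar - x)) for x, Pi in zip(X, P)]
        # Consensus step with momentum mixing.
        xbar = eta * np.mean(X, axis=0) + (1.0 - eta) * xbar
    return xbar
```

With gamma = eta = 1 this reduces to the method of averaged projections onto the machines' affine solution sets, whose unique intersection point is the solution of the full system; the acceleration in APC comes from tuning the two step sizes.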
Year
Pages
32–38
Physical description
Bibliography: 25 items, figures, tables.
Creators
author
  • Gdansk University of Technology, Centre of Informatics – Tricity Academic Supercomputer Network
Bibliography
  • [1] Alexander Burton. OLS (Linear) Regression, pp. 509–514. Wiley, August 2021.
  • [2] W. Keith Nicholson. Linear Algebra with Applications. Lyryx Learning Inc., Calgary, Alberta, Canada, 2020. Book version 2021A.
  • [3] Arthur Stanley Goldberger et al. Econometric theory. New York: John Wiley & Sons., 1964.
  • [4] Julien Simon. Large language models: A new Moore’s law?, Oct 2021.
  • [5] Brian Swenson, Ryan Murray, Soummya Kar, and H Vincent Poor. Distributed stochastic gradient descent: Nonconvexity, nonsmoothness, and convergence to local minima. arXiv preprint arXiv:2003.02818, 2020.
  • [6] Ermin Wei and Asuman Ozdaglar. Distributed alternating direction method of multipliers. In 2012 IEEE 51st IEEE Conference on Decision and Control (CDC), pp. 5445–5450, 2012.
  • [7] Navid Azizan-Ruhi, Farshad Lahouti, Salman Avestimehr, and Babak Hassibi. Distributed solution of largescale linear systems via accelerated projection-based consensus, 2017.
  • [8] Eman Shaikh, Iman Mohiuddin, Yasmeen Alufaisan, and Irum Nahvi. Apache Spark: A big data processing engine. pp. 1–6, November 2019.
  • [9] Matthew Rocklin. Dask: Parallel computation with blocked algorithms and task scheduling. In Python in Science Conference, pp. 126–132, January 2015.
  • [10] Guido van Rossum. Python programming language. In USENIX Annual Technical Conference, 2007.
  • [11] V. Klema and A. Laub. The singular value decomposition: Its computation and some applications. IEEE Transactions on Automatic Control, 25(2):164–176, 1980.
  • [12] Stephen Andrilli and David Hecker. Chapter 1 – vectors and matrices. In Stephen Andrilli and David Hecker, editors, Elementary Linear Algebra (Fifth Edition), pp. 1–83. Academic Press, Boston, fifth edition, 2016.
  • [13] E.K.P. Chong and S.H. Zak. An Introduction to Optimization. Wiley-Interscience Series in Discrete Mathematics and Optimi. Wiley, 2004.
  • [14] Walter Gander. Algorithms for the QR-decomposition. Seminar für Angewandte Mathematik: Research report, 1980.
  • [15] Erik Vleck. On the error in the product QR decomposition. SIAM J. Matrix Analysis Applications, 31:1775–1791, January 2010.
  • [16] L.N. Trefethen and D. Bau. Numerical Linear Algebra. Other Titles in Applied Mathematics. Society for Industrial and Applied Mathematics (SIAM, 3600 Market Street, Floor 6, Philadelphia, PA 19104), 1997.
  • [17] Alberto Moreira. Cs557a: Solving linear systems of equations. 2000.
  • [18] Steven C. Althoen and Renate McLaughlin. Gauss-Jordan reduction: A brief history. The American Mathematical Monthly, 94(2):130–142, 1987.
  • [19] Zhikuan Zhao, Jack K Fitzsimons, Michael A Osborne, Stephen J Roberts, and Joseph F Fitzsimons. Quantum algorithms for training Gaussian processes. Physical Review A, 100(1):012304, 2019.
  • [20] Daniel Povey, Gaofeng Cheng, Yiming Wang, Ke Li, Hainan Xu, Mahsa Yarmohammadi, and Sanjeev Khudanpur. Semi-orthogonal low-rank matrix factorization for deep neural networks. In Interspeech, pp. 3743–3747, 2018.
  • [21] Pauli Virtanen, Ralf Gommers, Travis E. Oliphant, Matt Haberland, Tyler Reddy, David Cournapeau, Evgeni Burovski, Pearu Peterson, Warren Weckesser, Jonathan Bright, Stéfan J. van der Walt, Matthew Brett, Joshua Wilson, K. Jarrod Millman, Nikolay Mayorov, Andrew R.J. Nelson, Eric Jones, Robert Kern, Eric Larson, C J Carey, İlhan Polat, Yu Feng, Eric W. Moore, Jake VanderPlas, Denis Laxalde, Josef Perktold, Robert Cimrman, Ian Henriksen, E.A. Quintero, Charles R. Harris, Anne M. Archibald, Antônio H. Ribeiro, Fabian Pedregosa, Paul van Mulbregt, and SciPy 1.0 Contributors. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nature Methods, 17:261–272, 2020.
  • [22] Charles R. Harris, K. Jarrod Millman, Stéfan J. van der Walt, Ralf Gommers, Pauli Virtanen, David Cournapeau, Eric Wieser, Julian Taylor, Sebastian Berg, Nathaniel J. Smith, Robert Kern, Matti Picus, Stephan Hoyer, Marten H. van Kerkwijk, Matthew Brett, Allan Haldane, Jaime Fernández del Río, Mark Wiebe, Pearu Peterson, Pierre Gérard-Marchant, Kevin Sheppard, Tyler Reddy, Warren Weckesser, Hameer Abbasi, Christoph Gohlke, and Travis E. Oliphant. Array programming with NumPy. Nature, 585(7825):357–362, September 2020.
  • [23] Alexei Botchkarev. A new typology design of performance metrics to measure errors in machine learning regression algorithms. Interdisciplinary Journal of Information, Knowledge, and Management, 14:045–076, 2019.
  • [24] Timothy A. Davis and Yifan Hu. The university of Florida sparse matrix collection. ACM Trans. Math. Softw., 38(1), Dec 2011.
  • [25] Gary Brassington. Mean absolute error and root mean square error: which is the better metric for assessing model performance? In EGU General Assembly Conference Abstracts, EGU General Assembly Conference Abstracts, pp. 3574, April 2017.
Notes
Record created using funds from the Polish Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Social Responsibility of Science" programme, module: Popularization of Science and Promotion of Sport (2022-2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-4c851dd6-62ad-4441-82e6-af67c49820f1