Article title

Examining Supply Chain Risks in Autonomous Weapon Systems and Artificial Intelligence

Authors
Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The development of increasingly AI-enabled autonomous systems and other military applications of Artificial Intelligence (AI) have been recognised as emergent major military innovations. In the absence of an effective and enforceable ban on their development and/or use arising from the Group of Governmental Experts on Lethal Autonomous Weapon Systems (LAWS), it is likely that such systems will continue to be developed. Amongst the legal, ethical, practical, and strategic concerns raised by the emergence of such systems, it is important not to lose sight of the risks involved in relying on a highly manufactured system in place of a human. This places additional strain on, and importance upon, securing diverse, complex, and cross-jurisdictional supply chains. This article focuses on the vulnerability of, and the risks to, the integrity and security of the supply chains responsible for producing AI-enabled autonomous military systems.
Year
Pages
1 – 21
Physical description
Bibliography: 58 items
Bibliography
  • 1. A. Wyatt, The Disruptive Impact of Lethal Autonomous Weapons Systems Diffusion: Modern Melians and the Dawn of Robotic Warriors. Oxon and New York: Routledge, 2021.
  • 2. E. B. Kania. (2019). Chinese military innovation in artificial intelligence. Testimony to the US-China Economic and Security Review Commission. [Online]. Available: https://www.cnas.org/publicati.... [Accessed: Jan. 22, 2023].
  • 3. F. Sauer, “Stopping ‘killer robots’: Why now is the time to ban autonomous weapons systems,” Arms Control Today, vol. 46, no. 8, pp. 8 – 13, 2016.
  • 4. Office of the Under Secretary of Defense for Policy, Directive 3000.09, United States Department of Defense, 2012.
  • 5. Development, Concepts and Doctrine Centre, Joint Concept Note 1/18: Human Machine Teaming, United Kingdom Ministry of Defence, 2018.
  • 6. Robotic and Autonomous Systems Implementation & Coordination Office, Robotic & Autonomous Systems Strategy v2.0, Canberra: Australian Army, 2022.
  • 7. M. C. Horowitz, “Why Words Matter: The Real World Consequences of Defining Autonomous Weapons Systems,” Temple International and Comparative Law Journal, vol. 30, pp. 85 – 98, 2016.
  • 8. P. Scharre, Four Battlegrounds: Power in the Age of Artificial Intelligence. New York: WW Norton, 2023.
  • 9. H. M. Roff, “The strategic robot problem: Lethal autonomous weapons in war,” Journal of Military Ethics, vol. 13, no. 3, pp. 211 – 227, 2014, doi: 10.1080/15027570.2014.975010.
  • 10. I. Bode, H. Huelss, and A. Nadibaidze, “Written Evidence AIW 0015,” presented at the UK House of Lords AI in Weapon Systems Select Committee, 4 May 2023. [Online]. Available: https://committees.parliament..... [Accessed: Jun. 6, 2023].
  • 11. L. Righetti, N. Sharkey, R. Arkin, D. Ansell, M. Sassoli, et al., “Autonomous weapon systems: technical, military, legal and humanitarian aspects,” Proceedings of the International Committee of the Red Cross. Geneva, Switzerland, pp. 26 – 28, 2014.
  • 12. A. Wyatt, “Charting great power progress toward a lethal autonomous weapon system demonstration point,” Defence Studies, vol. 20, no. 1, pp. 1 – 20, 2020, doi:10.1080/14702436.2019.1698956.
  • 13. A. Ghadge, Maximilian Weiß, Nigel D. Caldwell, and R. Wilding, “Managing cyber risk in supply chains: A review and research agenda,” Supply Chain Management: An International Journal, vol. 25, no. 2, pp. 223 – 240, 2020, doi:10.1108/SCM-10-2018-0357.
  • 14. S. Abaimov and M. Martellini, “Artificial intelligence in autonomous weapon systems,” 21st Century Prometheus: Managing CBRN Safety and Security Affected by Cutting-Edge Technologies, pp. 141 – 177, 2020.
  • 15. M. C. Horowitz, The Diffusion of Military Power. Princeton, NJ: Princeton University Press, 2010.
  • 16. A. Wyatt and J. Galliott, “The revolution of autonomous systems and its implications for the arms trade,” in Research Handbook on the Arms Trade, A.T.H. Tan, Ed. Cheltenham and Northampton, MA: Edward Elgar Publishing, 2020, pp. 389 – 405.
  • 17. A. B. Silverstein, “Revolutions in military affairs: A theory on first-mover advantage,” B.A. thesis, University of Pennsylvania, Philadelphia, 2013.
  • 18. J. Kwik, “Mitigating the Risk of Autonomous-Weapon Misuse by Insurgent Groups,” Laws, vol. 12, no. 1, 2023, doi: 10.3390/laws12010005.
  • 19. K. Chávez and O. Swed, “The proliferation of drones to violent non-state actors,” Defence Studies, vol. 21, no. 1, pp. 1 – 24, 2021, doi: 10.1080/14702436.2020.1848426.
  • 20. M. I. B. Amirruddin, “How Threat Assessments Can Become Self-Fulfilling Prophecies,” Pointer, vol. May, 2023.
  • 21. R. Gilpin, “The theory of hegemonic war,” The Journal of Interdisciplinary History, vol. 18, no. 4, pp. 591 – 613, 1988, doi: 10.2307/204816.
  • 22. S. Shead, “UN talks to ban ‘slaughterbots’ collapsed — here’s why that matters,” in CNBC, 2021.
  • 23. B. Zhang, M. Anderljung, L. Kahn, N. Dreksler, M. C. Horowitz, and A. Dafoe, “Ethics and governance of artificial intelligence: Evidence from a survey of machine learning researchers,” Journal of Artificial Intelligence Research, vol. 71, pp. 591 – 666, 2021, doi: 10.1613/jair.1.12895.
  • 24. A. Wyatt and J. Galliott, “An Empirical Examination of the Impact of Cross-Cultural Perspectives on Value Sensitive Design for Autonomous Systems,” Information, vol. 12, no. 12, p. 527, 2021, doi: 10.3390/info12120527.
  • 25. J. Galliott and A. Wyatt, “A consideration of how emerging military leaders perceive themes in the autonomous weapon system discourse,” Defence Studies, vol. 22, no. 2, pp. 253 – 276, 2022, doi: 10.1080/14702436.2021.2012653.
  • 26. A. Blanchard and M. Taddeo, “Autonomous weapon systems and jus ad bellum,” AI & SOCIETY, pp. 1 – 7, 2022, doi: 10.1007/s00146-022-01425-y.
  • 27. E. Riesen, “The Moral Case for the Development and Use of Autonomous Weapon Systems,” Journal of Military Ethics, vol. 21, no. 2, pp. 132 – 150, 2022, doi: 10.1080/15027570.2022.2124022.
  • 28. R. Waters, “Falling costs of AI may leave its power in hands of a small group,” Financial Times, 10 March 2023. [Online]. Available: https://www.ft.com/content/4fe.... [Accessed: Jun. 6, 2023].
  • 29. D. Nikolaiev, Behind the Millions: Estimating the Scale of Large Language Models, 2023 [Online]. Available: https://towardsdatascience.com.... [Accessed: Jun. 6, 2023].
  • 30. M. DeGuerin, “‘Thirsty’ AI: Training ChatGPT Required Enough Water to Fill a Nuclear Reactor’s Cooling Tower, Study Finds,” in Gizmodo, 2023. [Online]. Available: https://gizmodo.com/chatgpt-ai.... [Accessed: Jun. 6, 2023].
  • 31. U. Gal, “ChatGPT is a data privacy nightmare. If you’ve ever posted online, you ought to be concerned,” in University of Sydney News, 2023. [Online]. Available: https://www.sydney.edu.au/news.... [Accessed: Jun. 6, 2023].
  • 32. B. Martin, L. H. Baldwin, P. DeLuca, S. Henriquez, N. Hvizda, et al., Supply Chain Interdependence and Geopolitical Vulnerability: The Case of Taiwan and High-End Semiconductors. Santa Monica: RAND Corporation.
  • 33. K. Devitt, M. Gan, J. Scholz, R. Bolia, “A Method for Ethical AI in Defence,” Defence Science and Technology Group, Contract No.: DSTG-TR-3786, 2021.
  • 34. T. Phillips-Levine, War on the Rocks, 2023. [Online]. Available: https://warontherocks.com/2023... [Accessed: Jun. 6, 2023].
  • 35. M. Brundage, S. Avin, J. Clark, H. Toner, P. Eckersley, B. Garfinkel, et al., “The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation,” 2018, doi: 10.48550/ARXIV.1802.07228.
  • 36. CrowdStrike, “Global Threat Report,” 2023.
  • 37. V. Boulanin, Mapping the development of autonomy in weapon systems: A primer on autonomy. Stockholm: Stockholm International Peace Research Institute, 2016.
  • 38. A. Mehta, Experiment over: Pentagon’s tech hub gets a vote of confidence [Online]. Available: https://www.defensenews.com/pe.... [Accessed: Jun. 6, 2023].
  • 39. L. Hudson, Pentagon to resume F-35 deliveries after Chinese materials discovered, Politico, 2022 [Online]. Available: https://www.politico.com/news/.... [Accessed: Jun. 6, 2023].
  • 40. R. Neuhard, Foreign Policy Research Institute, 2022. [Online]. Available: https://www.fpri.org/article/2.... [Accessed: Jun. 6, 2023].
  • 41. A. Holland Michel, “The black box, unlocked: predictability and understandability in military AI,” United Nations Institute for Disarmament Research, 2020, doi: 10.37559/SecTec/20/AI1.
  • 42. E. H. Christie, A. Ertan, L. Adomaitis, M. Klaus, “Regulating lethal autonomous weapon systems: exploring the challenges of explainability and traceability,” AI Ethics, 2023, doi: 10.1007/s43681-023-00261-0.
  • 43. J. Haner, D. Garcia, “The artificial intelligence arms race: Trends and world leaders in autonomous weapons development,” Global Policy, vol. 10, no. 3, pp. 331 – 337, 2019, doi: 10.1111/1758-5899.12713.
  • 44. E. Schmidt, R. Work, S. Catz, E. Horvitz, S. Chien, A. Jassy, et al., “Final Report: National Security Commission on Artificial Intelligence (AI),” National Security Commission on Artificial Intelligence, Contract No.: AD1124333, 2021.
  • 45. A. Wyatt, J. Galliott, “Toward a Trusted Autonomous systems Offset Strategy: Examining the Options for Australia as a Middle Power,” Australian Army Research Centre, Contract No.: 2, 2021.
  • 46. S. Korreck, “Exploring the Promises and Perils of Chinese Investments in Tech Startups: The Case of Germany,” Observer Research Foundation, 2021.
  • 47. K. M. Sayler, “Artificial Intelligence and National Security,” Congressional Research Service, Contract No.: R45178, 2020.
  • 48. M. Brown, P. Singh, “China’s Technology Transfer Strategy: How Chinese Investments in Emerging Technology Enable A Strategic Competitor to Access the Crown Jewels of U.S. Innovation,” Defense Innovation Unit – Experimental, 2018.
  • 49. E. H. Christie, C. Buts and C. Du Bois, “America, China, and the struggle for AI supremacy,” 24th Annual International Conference on Economics and Security, Volos, Greece, July 8 – 9, 2021.
  • 50. M. C. Horowitz, “Artificial intelligence, international competition, and the balance of power,” Texas National Security Review, vol. 22, 2018, doi: 10.15781/T2639KP49.
  • 51. M. Lamberth, P. Scharre, “Arms Control for Artificial Intelligence,” Texas National Security Review, vol. 6, no. 2, pp. 95 – 110, 2023, doi: 10.26153/tsw/46142.
  • 52. Reuters Staff, “Fact Check-Simulation of AI drone killing its human operator was hypothetical, Air Force says,” in Reuters, 2023. [Online]. Available: https://www.reuters.com/articl... [Accessed: Dec. 4, 2023].
  • 53. E. Jones, B. Easterday, “Artificial Intelligence’s Environmental Costs and Promise,” in Council on Foreign Relations, 2022. [Online]. Available: https://www.cfr.org/blog/artif... [Accessed: Dec. 4, 2023].
  • 54. L. Irwin, “How Much Does GDPR Compliance Cost in 2023?,” in IT Governance, 2023.[Online]. Available: https://www.itgovernance.eu/bl... [Accessed: Dec. 4, 2023].
  • 55. J.-Y. Lee, E. Han, and K. Zhu, “Decoupling from China: how US Asian allies responded to the Huawei ban,” Australian Journal of International Affairs, vol. 76, no. 5, pp. 486 – 506, 2022, doi: 10.1080/10357718.2021.2016611.
  • 56. G. Baryannis, S. Validi, S. Dani, G. Antoniou, “Supply chain risk management and artificial intelligence: state of the art and future research directions,” International Journal of Production Research, vol. 57, no. 7, pp. 2179 – 2202, 2019, doi: 10.1080/00207543.2018.1530476.
  • 57. R. Fedasiuk, J. Melot, B. Murphy, Harnessed lightning: How the Chinese military is adopting artificial intelligence. Washington DC: Center for Security and Emerging Technology, 2021.
  • 58. F. E. Morgan, M. Boudreaux, A. J. Lohn, M. Ashby, C. Curriden, et al., Military applications of artificial intelligence. Santa Monica: RAND Corporation, 2020.
Remarks
Record prepared with funding from the Ministry of Education and Science (MNiSW), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science", module: Popularisation of science and promotion of sport (2024).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-44ba1b12-b11a-410d-b4d3-b6670e4a550e