
Archbishop Jurkovič Warns of Autonomous Weapons

‘An autonomous weapons system could never be a morally responsible subject.’


Lethal Autonomous Weapons Systems raise serious ethical and legal issues, according to Archbishop Ivan Jurkovič, Permanent Observer of the Holy See to the United Nations and Other International Organizations. He spoke in Geneva on April 9, 2018, at the 2018 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS).
“The Holy See recognizes the difficulties and resistance in foreseeing and fully understanding the implications of autonomous weapon systems,” the archbishop said. “One thing is certainly clear: the development of LAWS will provide the capacity of altering irreversibly the nature of warfare, becoming even more inhumane, putting in question the humanity of our societies and, in any case, compelling all States to reassess their military capabilities.”
Archbishop Jurkovič outlined three grave concerns about LAWS:
1) An autonomous weapons system could never be a morally responsible subject.
2) An autonomous weapons system may deviate from the areas of evolution or the objectives prescribed by the responsible political or military authority.
3) The idea of a war waged by non-conscious and non-responsible autonomous weapons systems appears to hide a lure for dominance that conceals desperation and a dangerous lack of confidence in the human person.
“International security and peace are best achieved through the promotion of a culture of dialogue and cooperation, not through an arms race,” the archbishop stressed. “A world in which autonomous systems are left to manage, rigidly or randomly, fundamental questions related to the lives of human beings and nations, would lead us imperceptibly to dehumanization and to a weakening of the bonds of a true and lasting fraternity of the human family.”
The Archbishop’s Full Statement:
Mr. Chair,
The Holy See Delegation wishes to thank you for your leadership throughout the sessions of the Group of Governmental Experts (GGE), along with the distinguished members of the Delegation of India. The issue of Lethal Autonomous Weapons Systems (LAWS) has now been on our agenda for the fifth consecutive year, and thus we look forward to substantial outcomes and hope that a consensual legal and ethical framework can finally be established.
The Holy See recognizes the difficulties and resistance in foreseeing and fully understanding the implications of autonomous weapon systems. One thing is certainly clear: the development of LAWS will provide the capacity of altering irreversibly the nature of warfare, becoming even more inhumane, putting in question the humanity of our societies and, in any case, compelling all States to reassess their military capabilities.
Any armed intervention must be carefully weighed and must at all times verify its legitimacy, legality, and conformity with its purposes, which must also be both ethically and legally legitimate. Confronted with today’s challenges, these tasks are growing ever more complex and too nuanced to be entrusted to a machine, which, for example, would be ineffective when facing moral dilemmas or questions raised by the application of the so-called principle of “double effect”.
For this reason, my Delegation suggests that it is possible, and desirable, to consider the principle of “anthropological non-contradiction” as the common point of reference for our discussion. In other words, any technology, including autonomous weapons, in order to be acceptable, must be compatible and consistent with the proper conception of the human person, the very foundation of law and ethics.
Mr. Chair,
The robotization and dehumanization of warfare, in particular through artificial intelligence and machine learning in weaponry, bring about several ethical and legal questions and contradictions. Please allow me to present a few:
1) An autonomous weapons system could never be a morally responsible subject. The unique human capacity for moral judgment and ethical decision-making is more than a complex collection of algorithms, and such a capacity cannot be replaced by, or programmed in, a machine. The application of rules or principles requires an understanding of contexts and situations that goes well beyond the potentialities of algorithms. For example, to characterize a fact or to apply a general law to a particular case demands, on the part of a judge, something more than the application of simple consequential logic, more than the pure manipulation of formal and codified rules. In this regard, autonomous weapons systems may consider normal, in the statistical sense of the term, and thus acceptable, those behaviors that international law prohibits, or that, albeit not explicitly outlined, are still forbidden by dictates of morality and public conscience.
2) An autonomous weapons system may deviate from the areas of evolution or the objectives prescribed by the responsible political or military authority. Yet, it is necessary to always have a proper traceability of the use of force, with an accurate identification of those responsible. In fact, this unpredictability would be at odds with the principle of jus in bello, since it could translate into the behavior of targeting civilians to maximize military interest, in direct opposition to the principle of distinction. Additionally, such loss or dilution of responsibility induces a total lack of accountability for violations of both international humanitarian law and international human rights law and could progressively incite to war.
3) The idea of a war waged by non-conscious and non-responsible autonomous weapons systems appears to hide a lure for dominance that conceals desperation and a dangerous lack of confidence in the human person. This rationale ignores the fact that for a machine, a human person, just like everything else, is merely an interchangeable set of data among others. Now, this is precisely the “inhuman” contradiction: the delegation of powers to autonomous systems puts us on the path of negation, oblivion, and contempt for the essential characteristics unique to the human person and to soldierly virtues.
Mr. Chair,
International security and peace are best achieved through the promotion of a culture of dialogue and cooperation, not through an arms race. The Preamble of the CCW clearly expresses the desire to contribute to “the ending of the arms race and the building of confidence among States, and hence to the realization of the aspiration of all peoples to live in peace”. A world in which autonomous systems are left to manage, rigidly or randomly, fundamental questions related to the lives of human beings and nations, would lead us imperceptibly to dehumanization and to a weakening of the bonds of a true and lasting fraternity of the human family.
For this reason, relying on the principle of precaution and adopting a responsible attitude of prevention are of the utmost importance in our current endeavors. My Delegation wishes to echo the words of Pope Francis that: “We have the freedom needed to limit and direct technology; we can put it at the service of another type of progress, one which is healthier, more human, more social, more integral” (Laudato Si’, para 112). Let us translate these words into a sound and lasting outcome within the CCW!
Thank you, Mr. Chair.
Copyright © 2017 Permanent Observer Mission of the Holy See to the United Nations, All rights reserved.


Jim Fair

Jim Fair is a husband, father, grandfather, writer, and communications consultant. He also likes playing the piano and fishing. He writes from the Chicago area.
