Archbishop Ivan Jurkovič © RV


Archbishop Jurkovič: Technology Risks Accompany Opportunity

Dangers of Lethal Autonomous Weapons Systems


While noting the benefits of advances in science and technology, Archbishop Ivan Jurkovič, Permanent Observer of the Holy See to the United Nations and Other International Organizations in Geneva, warned on November 13, 2017, of the dangers of Lethal Autonomous Weapons Systems (LAWS).
His remarks came in Geneva at the 2017 meeting of the Group of Governmental Experts on LAWS.
“As Pope Francis recalls, advances in science and technology from useful domestic appliances to great transportation systems – “wonderful products of a God-given human creativity” – when directed to truly helping people live with more dignity and less suffering, have contributed greatly to improving the quality of human life,” the archbishop said. “Yet, such technological prowess inevitably also has an impact on weapons and on the nature of conflicts… A stark reality, particularly evident in weaponry, is that such rapid advances have often outpaced thoughtful developments in human responsibility, values, conscience and international humanitarian law.”
In light of this technology-human gap, Archbishop Jurkovič cited three major concerns about LAWS:
1) An autonomous system necessarily gives way to a certain unpredictability of actions.
2) The risk of concealing true responsibility and a lack of accountability.
3) The risk of proliferation and a new arms race.
The archbishop insisted that these concerns do not signal the opposition of the Holy See to advances in science and technology. But, he added, “we must then make sure to develop a great awareness of responsibility in those who are developing robotic technologies that are becoming increasingly autonomous.”
 
The Archbishop’s Statement
Mr. Chair,
The Holy See Delegation would like to express its satisfaction with the excellent preparatory work that you have undertaken to lay the groundwork for this important meeting. We look forward to substantial discussions under your guidance over the course of this week, and we hope that a common understanding of the underlying ethical and legal aspects involved can be established.
As Pope Francis recalls, advances in science and technology from useful domestic appliances to great transportation systems – “wonderful products of a God-given human creativity” – when directed to truly helping people live with more dignity and less suffering, have contributed greatly to improving the quality of human life. 1
Yet, such technological prowess inevitably also has an impact on weapons and on the nature of conflicts. A stark reality, particularly evident in weaponry, is that such rapid advances have often outpaced thoughtful developments in human responsibility, values, conscience and international humanitarian law (IHL). The historical experience of regulating or prohibiting certain weapons best illustrates this reality. Tragically, the international legal frameworks came too late and only after grave human tragedies had taken place.
Mr. Chair,
If a cautious preventive approach is not pursued with regard to emerging technologies in the area of lethal autonomous weapons systems (LAWS), we may not escape this very same irrational trend of lagging behind. The conduct of military operations is a serious matter. Any armed intervention must be carefully considered, and its legitimacy, legality and conformity with its purposes, which must also be both ethically and legally legitimate, must be verified at all times. The risks are far too many and the consequences too profound and far-reaching to be ignored. Please allow me to elaborate on some of them for the benefit of our discussion:
1) An autonomous system necessarily gives way to a certain unpredictability of actions. This is particularly grave when LAWS have self-learning or self-programmable capabilities, that is, when they may at some point perform unforeseen actions, escaping the predicted area of evolution and thus contradicting the objective set by a responsible human agent. Such unpredictability inherently entails the impossibility of fulfilling requirements under IHL at all times, as this unpredictability could, for instance, “deviate” into behavior targeting civilians to maximize military interest, in direct violation of the principle of distinction. Furthermore, the respect and application of laws or principles require timely understanding and judgment of particular situations, which entails going well beyond algorithms. Legal and ethical decisions often require an interpretation of the rules in order to preserve the spirit of the rules themselves. A machine will never be able to perform such tasks.
2) The risk of concealing true responsibility and a lack of accountability. In the use of LAWS, there is a certain degree of hypocrisy: one actually intends to cause lethal effects or damage without giving the impression of being personally engaged. The autonomous robot creates a true “technological screen” that may obscure the link between cause and effect in an act of war. In the case of collateral damage, it will be easy to blame a malfunction of the machine and to try to diminish the accountability of the responsible authority. The disappearance or concealment of the human agent is problematic not only from the point of view of the ethics of responsibility, but also from the point of view of the foundation of law. In this regard, it would be dangerous to consider an “electronic personality” for the robot, be it civilian or military, or to give it legal status as a human person. A machine is only a complex set of circuits, and this material system cannot in any case become a truly morally responsible agent. In fact, for a machine, a human person is only a datum, a set of numbers among others. It is very important never to lose sight of the difference between the system (an object) and the human person (the subject). “For a natural person, the source of his/her legal personality, which implies rights and duties, derives from his/her existence as a human person. Responsibility rooted in legal personality shall only be exercised in presence of a certain capacity for freedom. Freedom is more than autonomy.” 2
3) The risk of proliferation and a new arms race. In the absence of preventive action at the international level, the immediate military interest of LAWS risks stimulating their development. Indeed, as in the case of nuclear deterrence, a vicious arms-race spiral is likely to occur in the long term, which would cause dangerous instability at the global level. Information on robotic technology is readily available, and its implementation is increasingly within the reach of many. It is therefore likely to proliferate and evolve rapidly, with the risk of such weapons falling into the hands of irresponsible non-State actors. Furthermore, one can only imagine how much easier it would be to justify, at the domestic level, the loss of a robot rather than a soldier.
Mr. Chair,
My Delegation wishes to stress that being cautious about the research and development of LAWS does not mean being opposed to the promotion of research for science and technology applications in the civilian field. However, the argument that, given the intrinsic dual-use nature of artificial intelligence, a prohibition on LAWS would hinder the development and use of robotic technology for civilian purposes is contradicted by the successful examples of international cooperation and exchange of scientific and technical information, for instance within the framework of the Chemical Weapons Convention and the Biological Weapons Convention.
In conclusion, we must then make sure to develop a great awareness of responsibility in those who are developing robotic technologies that are becoming increasingly autonomous. “We must convince ourselves of the priority of ethics over technology, of the primacy of the person over things.”3 In this context, the discussions on LAWS cannot be detached from humanitarian and ethical foundations. Hence, the dialogue between theologians, ethicists, scientists, engineers, and technologists becomes not just an interesting endeavor, but also a common and shared mission in the interest of mankind.
It is my Delegation’s hope that this profound attention to the human person and his/her inherent dignity – the very foundation of every legal order – could lead to a common understanding of the multifaceted complexities related to LAWS and to the establishment of a legal and ethical framework underpinning our discussions concerning the evaluation of emerging military technologies.
International security and peace are best achieved through the promotion of a culture of dialogue and cooperation, not through an arms race. A world in which autonomous machines are left to manage, rigidly or randomly, fundamental questions related to the lives of human beings and nations would lead us imperceptibly to a weakening of the ties that underlie the possibility of a true and lasting fraternity. To limit the inhumanity of war, to the extent that this is possible, it is important to preserve, at the heart of the tragedies of armed conflicts, the crucial role of conscious and responsible human agents. In particular, in the dynamics of reconciliation and a return to peace, only the human person is and always will be the subject capable of forgiveness. The inventiveness and originality of forgiveness cannot be subject to any formalization, whether algorithmic or legal.
I thank you, Mr. Chair.
1 Cf. Pope Francis, Laudato Si’, paras 103 and 112.
2 Comment of the COMECE Secretariat on the recommendation of the European Parliament to study the possibility of creating a specific legal status for robots in the long run. Document available at: http://www.comece.eu/dl/KrtkJKJKomMKJqx4KJK/20170425_COMECE_contribution_on_robotics_COM.pdf
3 Saint John Paul II, Address to Scientists and Representatives of the United Nations University, Hiroshima (1981).
Copyright © 2017 Permanent Observer Mission of the Holy See to the United Nations. All rights reserved.


Jim Fair

Jim Fair is a husband, father, grandfather, writer, and communications consultant. He also likes playing the piano and fishing. He writes from the Chicago area.
