Vatican warns on new breed of lethal autonomous weapons
'Humans must not be taken out of the loop,' archbishop tells UN
Archbishop Tomasi (right). Picture: AFP Photo/Fabrice Coffrini
- Vatican Radio for NEWS.VA
- May 14, 2014
Archbishop Silvano Tomasi expressed his concern on Tuesday regarding the increasing trend of “dehumanization of warfare” and the use of lethal autonomous weapon systems.
Addressing the United Nations in Geneva at the meeting of experts on lethal autonomous weapons systems, Archbishop Tomasi, the Vatican’s Permanent Representative to the United Nations and Other International Organizations in Geneva, commended the organizers of the meeting for highlighting what he called “emerging concerns around new technologies”.
Text of the address follows:
Let me first commend you for the good preparation for this very important meeting, even if the mandate is simply to discuss in an informal setting emerging concerns around new technologies, which would not only impact the way of conducting war but, more importantly, would call into question the humanity of our societies in relying on machines to make decisions about life and death. In 2013, this Delegation expressed its deep concerns in relation to the use of drones and the troubling ethical consequences for users and victims alike.
While in many fields, autonomous technology may indeed prove beneficial to humanity, the application of autonomy to weapons technology is entirely distinct: it seeks to place a machine in the position of deciding over life and death. We are most troubled by emerging technologies of autonomous weapon systems which may move beyond surveillance or intelligence-gathering capabilities into actually engaging human targets. Good intentions could be the beginning of a slippery slope. When humanity is confronted with big and decisive challenges—from health to the environment, to war and peace—taking time to reflect, relying on the principle of precaution, and adopting a reasonable attitude of prevention are far more suitable than venturing into illusions and self-defeating endeavours.
Autonomous weapon systems, like any other weapon system, must be reviewed and pass examination under international humanitarian law (IHL). Respect for international law, for human rights law, and for IHL is not optional. The Holy See supports the view that autonomous weapon systems have, like drones, a huge deficit which cannot be addressed only by respecting the rules of IHL. To comply, these systems would require human qualities that they inherently lack. The ethical consequences of such systems, if deployed and used, cannot be overlooked or underestimated.
The increasing trend of dehumanisation of warfare compels all nations and societies to reassess their thinking. The prospect of developing armed robots designed to engage human targets has the potential of changing the fundamental equation of war. Taking humans “out of the loop” presents significant ethical questions, primarily because of the absence of meaningful human involvement in lethal decision-making.
Mr. President, for the Holy See the fundamental question is the following: Can machines—well-programmed with highly sophisticated algorithms to make decisions on the battlefield in compliance with IHL—truly replace humans in decisions over life and death?
The answer is no. Humans must not be taken out of the loop over decisions regarding life and death for other human beings. Meaningful human intervention over such decisions must always be present. Decisions over life and death inherently call for human qualities, such as compassion and insight, to be present. While imperfect human beings may not perfectly apply such qualities in the heat of war, these qualities are neither replaceable nor programmable. Studies of soldiers’ experiences suggest that human beings are innately averse to taking life, and this aversion can show itself in moments of compassion and humanity amidst the horrors of war.