The human cost of this war is unbearable
by Mark Lowcock, Lise Grande
UN Office for the Coordination of Humanitarian Affairs

Sep. 2019

Syria: People's homes, hospitals, schools, water systems and markets must be protected, underlines Mark Lowcock, UN Emergency Relief Coordinator.

Condemning the military escalation in north-west Syria in an update to the UN Security Council, the UN Emergency Relief Coordinator, Mark Lowcock, said that more than 500 civilians had been killed and many hundreds more injured in the last four months. Citing WHO and UNICEF, Mr. Lowcock noted that 43 health facilities, 87 educational facilities, 29 water stations, and seven markets had been impacted since April.

Appealing to the belligerents to respect international humanitarian law, the Emergency Relief Coordinator said that a UN-led probe announced by the Secretary-General was set to investigate incidents in north-west Syria which damaged or destroyed facilities that had either been deconflicted or received humanitarian support from the UN.

'I and other UN officials have repeatedly called on the parties and this Council to ensure respect for international humanitarian law', Mr. Lowcock insisted. 'People's homes, hospitals, schools, water systems and markets must be protected. There can be no reason, rationale, excuse or justification for the destruction of civilian areas on the scale seen in Idlib today', he said.

Seven health centres in Syria's north-west have reportedly been attacked in recent days and two of them have been destroyed, the World Health Organization (WHO) said on Monday.

Attacks verified on four facilities, three pending

In a statement, the UN health agency said that the facilities included four hospitals and two primary health care centres that were functional at the time they were hit. Attacks on four facilities have been verified according to WHO reporting standards, and three are in the process of verification, WHO said.
One primary health care centre had been evacuated in advance of military action. At least two injuries were reported, and two facilities were destroyed.

The attacks come amid increased hostilities in and around Syria's Idlib province, the last opposition stronghold in the country, which has been devastated by more than eight years of war. According to WHO, some 13.7 million people need health assistance in Syria, including an estimated four million in the north-west.

The UN agency says that the attacks happened between 28 and 30 August, the latest violence against civilians since fighting escalated in north-west Syria in late April. In August alone, it reported that more than 130,000 people were displaced from northern Hama and Idlib governorates.

http://reliefweb.int/report/syrian-arab-republic/escalating-violence-and-waves-displacement-continue-torment-civilians
http://reliefweb.int/country/syr
http://www.unocha.org/story/syria-crisis-its-9th-year-9-figures
http://www.unocha.org/syria

Sep. 2019

These are very dark times for Yemen

Lise Grande, the UN Humanitarian Coordinator in Yemen, described Sunday's deadly airstrikes in Dhamar City as a horrific incident, and the scale of the casualties as staggering.

'These are very dark times for Yemen', said Ms. Grande. 'There have been days of fighting and strikes in the south and hundreds of casualties'.

The strikes hit a former community college compound on the northern outskirts of Dhamar City. According to sources on the ground, as many as 170 prisoners were being held in a detention facility within the compound. The Yemen office of the UN High Commissioner for Human Rights has confirmed that 52 detainees are among the dead. At least 68 detainees are still missing.

Casualties are likely to increase as rescue efforts are still ongoing. First responders have been struggling to reach the scene due to repeated strikes on the site.
Survivors are believed to remain trapped under rubble and the search for further casualties continues.

'Today's event is a tragedy. The human cost of this war is unbearable', said Martin Griffiths, Special Envoy of the Secretary-General for Yemen. 'We need it to stop. Yemenis deserve a peaceful future. Today's tragedy reminds us that Yemen cannot wait. I hope the Coalition will launch an enquiry into this incident. Accountability needs to prevail'.

Humanitarian partners are rushing surgical and medical supplies, including trauma kits, to Dhamar General Hospital and Maaber Hospital. 'We are diverting critical medical supplies from the cholera response', said Ms. Grande. 'We have no choice'.

Humanitarian funding gap compounds crisis

The attack occurred just two days after Ms. Grande assessed the situation in Yemen as very fragile, with insecurity compounded by a humanitarian funding gap which has forced several health programmes to close.

Yemen is the world's worst humanitarian crisis, with nearly 80 per cent of the total population, some 24.1 million people, requiring some form of humanitarian assistance and protection. The 2019 Yemen Humanitarian Response Plan (YHRP) requires US$4.2 billion to assist more than 20 million Yemenis, but the plan is only 34 per cent funded. At a UN pledging conference in February, donor countries promised $2.6 billion to meet urgent needs but, to date, less than half of this amount has been received.

http://reliefweb.int/report/yemen/yemen-collective-failure-collective-responsibility-un-expert-report
http://www.ohchr.org/EN/HRBodies/HRC/YemenGEE/Pages/Index.aspx
The Need for and Elements of a New Treaty on Fully Autonomous Weapons
by Bonnie Docherty, Senior Researcher, Arms Division, Human Rights Watch
Campaign to Stop Killer Robots

June 2020

The rapid evolution of autonomous technology threatens to strip humans of their traditional role in the use of force. Fully autonomous weapons, in particular, would select and engage targets without meaningful human control. Due in large part to their lack of human control, these systems, also known as lethal autonomous weapons systems or 'killer robots', raise a host of legal and ethical concerns.

States parties to the Convention on Conventional Weapons (CCW) have held eight in-depth meetings on lethal autonomous weapons systems since 2014. They have examined the extensive challenges raised by the systems and recognized the importance of retaining human control over the use of force. Progress toward an appropriate multilateral solution, however, has been slow. If states do not shift soon from abstract talk to treaty negotiations, the development of technology will outpace international diplomacy.

Approaching the topic from a legal perspective, this chapter argues that fully autonomous weapons cross the threshold of acceptability and should be banned by a new international treaty. The chapter first examines the concerns raised by fully autonomous weapons, particularly under international humanitarian law. It then explains why a legally binding instrument best addresses those concerns. Finally, it proposes key elements of a new treaty to maintain meaningful human control over the use of force and prohibit weapons systems that operate without it.

The Problems Posed by Fully Autonomous Weapons

Fully autonomous weapons would present significant hurdles to compliance with international humanitarian law's fundamental rules of distinction and proportionality.
In today's armed conflicts, combatants often seek to blend in with the civilian population. They hide in civilian areas and wear civilian clothes. As a result, distinguishing combatants from civilians or those hors de combat often requires gauging an individual's intentions based on subtle behavioral cues, such as body language, gestures, and tone of voice. Humans, who can relate to other people, can interpret those cues better than inanimate machines.

Fully autonomous weapons would find it even more difficult to weigh the proportionality of an attack. The proportionality test requires determining whether expected civilian harm outweighs anticipated military advantage on a case-by-case basis in a rapidly changing environment. Evaluating the proportionality of an attack involves more than a quantitative calculation. Commanders apply human judgment, informed by legal and moral norms and personal experience, to the specific situation. Whether the human judgment necessary to assess proportionality could ever be replicated in a machine is doubtful. Furthermore, robots could not be programmed in advance to deal with the infinite number of unexpected situations they might encounter on the battlefield.

The use of fully autonomous weapons also risks creating a serious accountability gap. International humanitarian law requires that individuals be held legally responsible for war crimes and grave breaches of the Geneva Conventions. Military commanders or operators could be found guilty if they deployed a fully autonomous weapon with the intent to commit a crime. It would, however, be legally challenging and arguably unfair to hold an operator responsible for the unforeseeable actions of an autonomous robot.

Finally, fully autonomous weapons contravene the Martens Clause, a provision that appears in numerous international humanitarian law treaties.
The clause states that if there is no specific law on a topic, civilians are still protected by the principles of humanity and the dictates of public conscience. Fully autonomous weapons would undermine the principles of humanity because of their inability to show compassion or respect human dignity. Widespread opposition to fully autonomous weapons among faith leaders, scientists, tech workers, civil society organizations, the public, and others indicates that this emerging technology also runs counter to the dictates of public conscience.

Fully autonomous weapons pose numerous other threats that go far beyond concerns over compliance with international humanitarian law. For many, delegating life-and-death decisions to machines would cross a moral red line. The use of fully autonomous weapons, including in law enforcement operations, would undermine the rights to life, remedy, and dignity. Development and production of these machines could trigger an arms race, and the systems could proliferate to irresponsible states and non-state armed groups. Even if new technology could address some of the international humanitarian law problems discussed above, it would not resolve many of these other concerns.

The Need for a Legally Binding Instrument

The unacceptable risks posed by fully autonomous weapons necessitate the creation of a new legally binding instrument. It could take the form of a stand-alone treaty or a protocol to the Convention on Conventional Weapons. Existing international law, including international humanitarian law, is insufficient in this context because its fundamental rules were designed to be implemented by humans, not machines. At the time states negotiated the additional protocols to the Geneva Conventions, they could not have envisioned full autonomy in technology. Therefore, while CCW states parties have agreed that international humanitarian law applies to this new technology, there are debates about how it does.
A new treaty would clarify and strengthen existing international humanitarian law. It would establish clear international rules to address the specific problem of weapons systems that operate outside of meaningful human control. In so doing, the instrument would fill the legal gap highlighted by the Martens Clause, help eliminate disputes about interpretation, promote consistency of interpretation and implementation, and facilitate compliance and enforcement.

The treaty could also go beyond the scope of current international humanitarian law. While the relevant provisions of international humanitarian law focus on the use of weapons, a new treaty could address development, production, and use. In addition, it could apply to the use of fully autonomous weapons in law enforcement operations as well as in situations of armed conflict.

A legally binding instrument is preferable to the 'normative and operational framework' that the CCW states parties agreed to develop in 2020 and 2021. The phrase 'normative and operational framework' is intentionally vague, and has thus created uncertainty about what states should be working toward. While the term could encompass a legally binding CCW protocol, it could also refer to political commitments or voluntary best practices, which would not be enough to preempt what has been called the third revolution in warfare. Whether adopted under the auspices of the CCW or in another forum, a legally binding instrument would bind states parties to clear obligations. Past experience shows that the stigma it would create could also influence states not party and non-state armed groups.

Aug. 2019

Russia, the United States, and a handful of other nations investing in autonomous weapons are preventing efforts to start negotiations on a new treaty to retain meaningful human control over the use of force, Human Rights Watch said today.
More than 70 member countries of the Convention on Conventional Weapons will meet in Geneva on August 20 and 21, 2019 for their eighth meeting since 2014 to discuss concerns raised by lethal autonomous weapons systems, also known as fully autonomous weapons or killer robots. But the Convention on Conventional Weapons' all-talk, no-action approach indicates that it is incapable of dealing with this threat, Human Rights Watch said.

'Most governments want to negotiate a new treaty to retain meaningful human control over the use of force', said Steve Goose, arms director at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. 'But with a small number of countries blocking any progress, these diplomatic talks increasingly look like an attempt to buy time and distract public attention rather than to urgently address the serious challenges raised by killer robots'.

Human Rights Watch and the Campaign to Stop Killer Robots urge states party to the convention to agree to begin negotiations in November for a new treaty to require meaningful human control over the use of force, which would effectively prohibit fully autonomous weapons. Only new international law can effectively address the multiple moral, legal, accountability, security, and technological concerns raised by killer robots.

The Convention on Conventional Weapons talks began in 2014 and were formalized three years later, but still have not produced anything more than some non-binding principles. Russia and the United States, as well as Australia, Israel, and the United Kingdom, opposed calls to move to negotiate a new treaty at the last meeting on killer robots in March, calling such a move premature. At previous talks, almost all countries have called for retaining human control over the use of force, which is effectively equivalent to a ban on weapons that lack such control. To date, 28 countries have explicitly supported a prohibition on fully autonomous weapons.
There is increasing evidence that developing these weapons would run contrary to the dictates of public conscience, Human Rights Watch said. Thousands of scientists and artificial intelligence experts, more than 20 Nobel Peace Laureates, and more than 160 religious leaders and organizations of various denominations also support a ban on killer robots. In 2018, Google released a set of ethical principles that includes a pledge not to develop artificial intelligence for use in weapons.

Killer robots would be unable to apply either compassion or nuanced legal and ethical judgment to decisions to use lethal force. Without these human qualities, the weapons would face significant obstacles in ensuring the humane treatment of others and showing respect for human life and dignity. Under international humanitarian law, the dictates of public conscience and principles of humanity should be upheld when there is no specific relevant treaty, which is the case with killer robots.

The 28 countries that have called for the ban are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, El Salvador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Morocco, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.

The Campaign to Stop Killer Robots, which began in 2013, is a coalition of 112 nongovernmental organizations in 56 countries working to preemptively ban the development, production, and use of fully autonomous weapons.

'Both prohibitions and positive obligations are needed to ensure that systems that select and engage targets do not undermine ethical values and are always subject to meaningful human control', Goose said. 'The public expects greater efforts from governments to prevent the development of fully autonomous weapons before they proliferate widely; in fact, nothing less than a legally binding ban treaty'.
http://www.hrw.org/news/2020/06/01/need-and-elements-new-treaty-fully-autonomous-weapons
http://www.hrw.org/news/2019/09/26/killer-robots-ban-treaty-only-credible-solution
http://www.hrw.org/news/2019/08/19/killer-robots-russia-us-oppose-treaty-negotiations
http://www.stopkillerrobots.org/
http://www.theguardian.com/technology/2019/sep/15/ex-google-worker-fears-killer-robots-cause-mass-atrocities