Drone over Mount Tamalpais in the USA.

“Killer robots”: the future that has almost arrived

Peace and Security

A world in which algorithms decide the fate of soldiers and civilians alike no longer seems like fantasy. Unmanned aerial vehicles controlled by artificial intelligence are changing the way wars are fought, raising profound moral and ethical questions. While politicians try to establish basic restrictions in this area, a global race is under way to master this rapidly developing technology.

Every day we voluntarily hand over information about ourselves to machines. It happens when we accept cookies or use a search engine. Before pressing the “accept” or “agree” button to get to the page we need, we rarely stop to think where our personal data will end up. We only vaguely suspect that it will later be used to persuade us to buy something we do not need at all.

It is easy to imagine that, in the not-too-distant future, such vehicles could use people’s personal data to determine who is an enemy and who should be destroyed. The UN and a number of non-governmental organizations are concerned that this scenario is close to becoming reality. They are calling for international regulation of lethal autonomous weapons to avoid a future in which machines decide questions of human life and death.

The war of drones

For several months the Kherson region of Ukraine has been subjected to constant attacks by Russian drones, aimed mainly at civilians. According to official figures, more than 150 civilians have been killed and hundreds injured. An independent investigation initiated by the UN concluded that these attacks constitute crimes against humanity. The Ukrainian army also relies heavily on drones and is reportedly developing a “drone wall” – a defensive line of unmanned aerial vehicles – to protect vulnerable sections of the border.
Once only wealthy countries could afford to develop such technologies, but Ukraine has shown that, with enough ingenuity, inexpensive drones can be modified to do the job. Armed conflicts around the world reflect this shift, and the nature of modern warfare is being rapidly transformed.

“Digital dehumanization”

“The Secretary-General has always said that handing machines fully delegated powers to make life-and-death decisions is a morally repugnant prospect,” stresses Izumi Nakamitsu, the UN High Representative for Disarmament Affairs. “This cannot be allowed. Such technologies should be prohibited by international law. That is the position of the United Nations.”

The international non-governmental organization Human Rights Watch has said that the use of autonomous combat systems would be the next example of “digital dehumanization,” in which AI makes fateful decisions affecting people’s lives.

“Several countries with large resources are investing significant funds in artificial intelligence and related technologies to develop autonomous ground and naval weapons systems. This is a fact,” warns Mary Wareham of Human Rights Watch. “The United States is leading the way, but other countries, including Russia, China, Israel and South Korea, are also investing heavily in autonomous weapons systems.”

Proponents of using artificial intelligence in war point to human shortcomings: soldiers can misjudge a situation, act under the influence of emotion, and need rest and reward for their service, whereas machines are precise and impartial, and are getting better every day at identifying threats. The next step, it seems, will be giving them the authority to decide when to pull the trigger.
There are two main objections to allowing machines to act independently on the battlefield. First, these technologies are unreliable and imperfect. Second, the UN and many other organizations consider the use of autonomous drones unethical.

“Machines can make mistakes when selecting human targets,” says Mary Wareham of Human Rights Watch. “People with disabilities are at particular risk: AI could classify a wheelchair as a weapon. There are also fears that facial recognition technology cannot correctly identify people with certain skin tones. AI is still imperfect, and it carries the biases of the people who program it.”

As for the ethical and moral aspects, Nicole Van Royen, executive director of the organization Stop Killer Robots, believes that using AI on the battlefield will make it far harder to establish responsibility for war crimes: “Who will be guilty of such acts? The person who created the algorithm? A whole range of questions arises to which there is no answer.”

Discussion of a ban

The speed at which AI technologies are developing, along with evidence that AI-assisted targeting systems are already being used on the battlefield, is lending urgency to calls for international rules in this sphere. In May this year, an informal discussion of the problem was held at UN Headquarters in New York. António Guterres called on member states to conclude, by 2026, a legally binding agreement regulating and prohibiting the use of autonomous combat drones.

In 2014, the Palais des Nations in Geneva hosted the first meeting of diplomats in history devoted to autonomous combat drones. At the time, it was believed that proactive action was needed in case such technology ever became a reality.
Eleven years on, negotiations continue, and there is still no consensus even on a definition of autonomous weapons, let alone coordinated regulation of their use. Nevertheless, non-governmental organizations and the UN look to the future with optimism, since the international community, albeit slowly, is moving toward a shared understanding of the key issues.

Mary Wareham of Human Rights Watch sees an important step forward: “At least 120 countries have fully supported the call to begin a dialogue on developing international law on autonomous weapons systems. We see great interest and support, including from experts in artificial intelligence and from religious leaders.”

“A consensus is gradually emerging that fully autonomous weapons systems should be prohibited,” says Izumi Nakamitsu. “When the conversation is about questions of life and death, someone must bear responsibility.”