South Korea to Open AI Weapons Lab, Researchers Worldwide Boycott in Response

More than 50 scientists across 30 countries have announced their intention to boycott a South Korean university that is about to open an AI-run weapons research and development lab.

The Korea Advanced Institute of Science and Technology (KAIST) is preparing to open an AI weapons lab. It is joined in this effort by Hanwha Systems, a defense company with a focus on cluster munitions, a class of weapons banned in over 100 countries under an international treaty.

In a movement started by Toby Walsh, an artificial intelligence professor at the University of New South Wales, AI and robotics researchers from thirty countries have made it clear that they intend to sever all ties with KAIST as soon as the weapons laboratory opens. The more than fifty scientists have written an open letter to the university, stating that the boycott will be called off if KAIST pledges not to produce autonomous weapons that lack meaningful human control. The letter characterizes the development of such weapons as the third revolution in warfare and warns that it will allow wars to be fought faster and at a greater scale than ever before. Moreover, the letter calls this area of research a Pandora's box: once opened, it will be difficult to close.

A history of warnings

Unfortunately, this is not the first open letter of its kind. Prof. Walsh also released one in 2015, and another in 2017, warning nations of the perils of autonomous weaponry. The 2015 letter forewarned of an arms race in AI-controlled weapons, and, as Walsh points out, that race has already started: early models of autonomous weapons exist and are being tested today. The 2017 letter was signed by 116 industry leaders, including Elon Musk and Mustafa Suleyman of Google's DeepMind. Prof. Walsh adds that KAIST's opening of the lab would aggravate the situation and that such actions will not be tolerated.

A separate open letter in 2015 carried thousands of signatures, among them the late Stephen Hawking, Elon Musk, and Apple co-founder Steve Wozniak. That letter calls for a ban on autonomous weapons, fearing a future in which armed quadcopters are deployed to locate and eliminate human targets. A further concern raised by the letter is that autonomous weapons, unlike nuclear weapons, are built from components that are easy to acquire. As such, they would be inexpensive and easily mass-produced, ensuring them a place on black markets, where they could land in the possession of warlords and terrorists.

International Danger

The United Nations will meet on 9 April in Geneva to discuss lethal autonomous weapons, hoping to minimize the danger such weapons pose to society. Researchers fear that autonomous weapons will also become instruments of terror, as they would make it easy for terrorists to turn them against civilian populations.

Furthermore, a 2015 report by Harvard Law School and Human Rights Watch outlined how the absence of regulation of autonomous weapons could remove accountability for human deaths. The report argues that under today's laws, programmers, engineers, manufacturers, and military personnel alike might avoid liability for deaths caused by autonomous weapons. Without an appropriate legal framework, it is impossible to pinpoint where responsibility falls, and deaths would go unpunished, especially when errors surface.
