Senators Act to Prevent AI-Triggered Nuclear Catastrophe


US Senator Edward Markey (D-Mass.) and Representatives Don Beyer (D-Va.), Ted Lieu (D-Calif.), and Ken Buck (R-Colo.) announced bipartisan legislation on April 26 to prevent autonomous artificial intelligence (AI) systems from launching nuclear weapons.

The Block Nuclear Launch by Autonomous Artificial Intelligence Act would prohibit the use of federal funds for any automated system to launch a nuclear weapon without “meaningful human control.”

The bill seeks to maintain the current US Department of Defense policy that requires a human “in the loop” for all critical actions involving nuclear weapon employment.

It aims to codify this principle into law and follows the recommendation of the National Security Commission on Artificial Intelligence, which called for the US to affirm its policy that only human beings can authorize the use of nuclear weapons.

Aim to Block AI-Initiated Nuclear Attacks

The Block Nuclear Launch by Autonomous Artificial Intelligence Act aims to prevent AI from making nuclear launch decisions. It is a response to concerns over the future potential of generative AI technology, which some researchers believe could threaten human civilization.

The legislation builds on the existing policy that keeps human beings in control of all nuclear-related decisions and would give that policy the force of law.

The cosponsors of the Block Nuclear Launch by Autonomous Artificial Intelligence Act in the Senate include Bernie Sanders (I-Vt.) and Elizabeth Warren (D-Mass.).

The legislation is part of a larger plan from Markey and Lieu to avoid nuclear escalation. The pair recently reintroduced a bill prohibiting any US president from launching a nuclear strike without prior authorization from Congress.

According to the lawmakers, the overall goal is to reduce the risk of “nuclear Armageddon” and curb nuclear proliferation.

Protecting the World from Autonomous Nuclear Attacks

The Block Nuclear Launch by Autonomous Artificial Intelligence Act seeks to ensure that humans are the only ones who can make life-or-death decisions to use deadly force, especially for the most dangerous weapons.

Markey said about the bill, “As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons—not robots.”

The legislation has been welcomed by many, including Buck, who said, “While US military use of AI can be appropriate for enhancing national security purposes, use of AI for deploying nuclear weapons without a human chain of command and control is reckless, dangerous, and should be prohibited.”

The bill has been praised for taking a proactive approach to the potential risks associated with AI technology.
