In 2023, non-governmental organizations such as Human Rights Watch and Stop Killer Robots will continue their calls for a new international legal framework to regulate autonomous weapons systems. Some States and scholars are optimistic about the possibility. These optimists often analogize to nuclear weapons regulation to illustrate that States sometimes have been willing to limit their own flexibility in strategic and sensitive areas, such as the one posed by the AI "arms race." However, this analogy is flawed. There are good reasons to be skeptical about the prospects that States will achieve a new, robust multilateral agreement that implicates the development of lethal autonomous systems or other "national security AI." The threat posed by these systems seems less tangible and, for now, less existential than that of nuclear weapons. The leading AI States perceive themselves as differently situated from each other, unlike the United States and the Soviet Union in the nuclear setting. Further, unlike with nuclear weapons, the development and use of AI tools are closely guarded secrets.
Ashley S. Deeks, Year Ahead – The Hurdles to International Regulation of AI Tools, Articles of War (Jan. 5, 2023).