Today's news in major cities, regional and local areas which can include accident reports

Sunday, February 26, 2023

Should Algorithms Control Nuclear Launch Codes?

Christina Macpherson

Feb 27

https://www.wired.com/story/fast-forward-should-algorithms-control-nuclear-launch-codes-the-us-says-no/ (26 Feb 23)

A new State Department proposal asks other nations to agree to limits on the power of military AI.

LAST THURSDAY, THE US State Department outlined a new vision for developing, testing, and verifying military systems—including weapons—that make use of AI. 

The Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy represents an attempt by the US to guide the development of military AI at a crucial time for the technology. The document does not legally bind the US military, but the hope is that allied nations will agree to its principles, creating a kind of global standard for building AI systems responsibly. 

Among other things, the declaration states that military AI needs to be developed according to international laws, that nations should be transparent about the principles underlying their technology, and that high standards should be implemented for verifying the performance of AI systems. It also says that humans alone should make decisions around the use of nuclear weapons.

When it comes to autonomous weapons systems, US military leaders have often offered reassurance that a human will remain "in the loop" for decisions about use of deadly force. But the official policy, first issued by the DOD in 2012 and updated this year, does not require this to be the case.

Attempts to forge an international ban on autonomous weapons have so far come to naught. The International Red Cross and campaign groups like Stop Killer Robots have pushed for an agreement at the United Nations, but some major powers—the US, Russia, Israel, South Korea, and Australia—have proven unwilling to commit.

One reason is that many within the Pentagon see increased use of AI across the military, including outside of weapons systems, as vital—and inevitable. They argue that a ban would slow US progress and handicap its technology relative to adversaries such as China and Russia. The war in Ukraine has shown how rapidly autonomy, in the form of cheap, disposable drones made more capable by machine learning algorithms that help them perceive and act, can provide an edge in a conflict.

Earlier this month, I wrote about onetime Google CEO Eric Schmidt's personal mission to amp up Pentagon AI to ensure the US does not fall behind China. It was just one story to emerge from months spent reporting on efforts to adopt AI in critical military systems, and how that is becoming central to US military strategy—even if many of the technologies involved remain nascent and untested in any crisis.

Lauren Kahn, a research fellow at the Council on Foreign Relations, welcomed the new US declaration as a potential building block for more responsible use of military AI around the world.

A few nations already have weapons that operate without direct human control in limited circumstances, such as missile defenses that need to respond at superhuman speed to be effective. Greater use of AI might mean more scenarios where systems act autonomously, for example when drones are operating out of communications range or in swarms too complex for any human to manage. 

Some proclamations around the need for AI in weapons, especially from companies developing the technology, still seem a little far-fetched. There have been reports of fully autonomous weapons being used in recent conflicts and of AI assisting in targeted military strikes, but these have not been verified, and in truth many soldiers may be wary of systems that rely on algorithms that are far from infallible.

And yet if autonomous weapons cannot be banned, then their development will continue. That will make it vital to ensure that the AI involved behaves as expected—even if the engineering required to fully enact intentions like those in the new US declaration is yet to be perfected.

