While numerous AI experts have told the Jerusalem Post over the years that people worried about AI turning on humanity, as in the famous “Terminator” movies, simply misunderstand the technology, the likelihood of AI making a catastrophic mistake with nuclear weapons is no fairy tale.
A recent article in the Bulletin of the Atomic Scientists, a leading publication on nuclear risk, along with other recent publications by defense experts, has said that Russia may already be integrating AI into a new nuclear torpedo it is developing, known as the Poseidon, to make it autonomous.
According to the Bulletin report, the US and China are also considering injecting AI deeper into their nuclear weapons programs as they modernize and overhaul their nuclear inventories.
There have been no express reports about Israel integrating AI into what, according to foreign reports, is an arsenal of between 80 and 200 nuclear weapons. But there have been reports of the IDF integrating AI into conventional weapons, such as its Spice bomb carried by F-16s.
Part of the concern in the report was that integrating AI into nuclear weapons systems could become culturally inevitable once conventional weapons become more dominated by AI.
The nuclear holocaust risks that scientists and experts are writing about stem not from a hostile takeover by AI, but from AI getting hacked, slipping out of control through a technical error, or badly misjudging a situation.
Such risks could be magnified by unmanned vehicles carrying nuclear weapons, where there is no one on board responsible for making the final decision to deploy a nuclear weapon.
As a secondary but still serious risk, AI integration into early-warning systems could pressure human decision-makers to yield to the technology and be faster on the nuclear trigger, despite any doubts their own judgment might raise.
Some studies have shown that AI, and automated evidence in general, can reinforce bubble-style thinking and make it more difficult for analysts to entertain alternate narratives about what might be occurring in murky and high-stress situations.
An example that the article gives of human judgment’s importance was a 1983 incident when a Soviet officer named Stanislav Petrov disregarded automated audible and visual warnings that US nuclear missiles were inbound.
The systems were wrong, and had Petrov trusted the technology over his own instincts, the world might have gone to nuclear war over a technological malfunction.
The article also points out potentially valuable aspects of AI in the nuclear weapons arena, such as gathering more accurate and comprehensive data so that decision-makers are guessing in the dark less often.
In addition, AI can get such key information to decision-makers much faster, whereas in the past it might have been stuck in the collection process and failed to reach leaders before they had to make a decision.
Moreover, the article noted that AI has been integrated into aspects of countries’ nuclear programs for some time.
Even in earlier decades of the Cold War, both the US and Russia had certain capabilities programmed into some nuclear weapons to be able to quickly switch to targeting each other, as opposed to landing harmlessly at sea, should certain scenarios occur.
Overall, the greatest concern about AI in nuclear weapons is with the weaker side in a potential standoff.
A country like China, with much more limited nuclear or conventional weapons capabilities, might seek to integrate AI into its nuclear weapons program with the hope of accelerating deployment speed so that the US would be unable to knock it out of a war with a preemptive “first strike.”
Some analysts believe this might be a reason that Russia is entertaining AI in its nuclear program, while others view Moscow as wanting to speed up its nuclear weapons deployment in order to be more offensive-minded, and not merely in self-defense.
Although such abilities would seem to be far away from what Iran can achieve, Tehran has had sudden jumps in nuclear technology in the past when given assistance by Russia, China, North Korea or Pakistan.
"style" - Google News
December 26, 2019 at 01:23AM
https://ift.tt/2EUv8fo
Scientists warn AI control of nukes could lead to ‘terminator-style’ war - The Jerusalem Post
"style" - Google News
https://ift.tt/2Mvyfz3
Shoes Man Tutorial
Pos News Update
Meme Update
Korean Entertainment News
Japan News Update
No comments:
Post a Comment