04/04/2024 / By Ava Grace
Artificial intelligence research company OpenAI has quietly dropped its ban on using its AI-enhanced tools for military purposes and is now working with the Department of Defense on software projects, including ones related to cybersecurity.
The ChatGPT creator is also in discussions with the federal government about developing tools to reduce veteran suicides, according to Anna Makanju, the company's vice president of global affairs, who added that the company has retained its ban on using AI to develop weapons. (Related: OpenAI thinks white genocide is no big deal.)
OpenAI’s policies previously specified that the company did not allow the use of its models for “activity that has high risk of physical harm,” such as weapons development or military and warfare applications. OpenAI has removed the specific reference to the military, although its policy still states that users should not “use our service to harm yourself or others,” including to “develop or use weapons.”
“Because we previously had what was essentially a blanket prohibition on [the] military, many people thought that would prohibit many of these use cases, which people think are very much aligned with what we want to see in the world,” Makanju said.
“Our policy does not allow our tools to be used to harm people, develop weapons, for communications surveillance, or to injure others or destroy property,” an OpenAI spokesperson said. “There are, however, national security use cases that align with our mission.”
The news comes after years of controversy over tech companies developing technology for military use, highlighted by the public concerns of tech workers, especially those working on AI.
Silicon Valley has softened its stance on collaborating with the U.S. military in recent years. The Pentagon, for its part, has made a concerted effort to win over Silicon Valley startups to help develop new defensive and offensive technology and to integrate advanced tools into the department’s ongoing and future operations.
Rising U.S.-China tensions and Russia’s ongoing war in Ukraine have also helped dispel many of the qualms entrepreneurs once had about military collaboration by heightening fears of risks to national security.
Defense experts have been bullish about the impact AI will have on the military. Former Google CEO Eric Schmidt, now a prominent defense industry figure, has compared the arrival of AI to the advent of nuclear weapons.
“Einstein wrote a letter to Roosevelt in the 1930s saying that there is this new technology – nuclear weapons – that could change war, which it clearly did. I would argue that (AI-powered) autonomy and decentralized, distributed systems are that powerful,” said Schmidt.
Microsoft, OpenAI’s single largest investor, already works extensively with the U.S. Armed Forces and other government branches. Beyond OpenAI and Microsoft, tech companies such as Anthropic and Google are already participating in the Defense Advanced Research Projects Agency’s (DARPA) ongoing efforts to find software that can automatically fix vulnerabilities in U.S. cybersecurity and defend critical infrastructure from cyberattacks.
Learn more about AI integration into more and more technologies at FutureTech.news.
Watch this Jan. 13 episode of “The Daily Wrap Up” discussing OpenAI’s quiet removal of its self-imposed ban on military and warfare uses of its technology.
This video is from the What is Happening channel on Brighteon.com.
Conservative AI Chatbot ‘GIPPR’ shut down by ChatGPT-maker OpenAI.
New York Times sues Microsoft, OpenAI, claiming artificial intelligence copyright infringement.
OpenAI researchers warn board that rapidly advancing AI technology threatens humanity.
OpenAI CEO launches iris-scanning crypto plan to “verify” every human being: “IT’S TIME”.