
OpenAI stops 20 cyber operations abusing ChatGPT

  • October 14, 2024


This year, OpenAI has already disrupted twenty operations and fraudulent networks that used ChatGPT to generate malicious code.

Hackers are finding their way to AI. Several reports show that attackers are using AI to build malware. In its own threat intelligence report, OpenAI shares new findings showing that the company has disrupted around twenty operations and fraudulent networks that abused its ChatGPT chatbot. Hacker groups are said to be using AI to spread disinformation, circumvent security systems and carry out spear-phishing attacks.

Abuse of models

While AI models can help developers generate code, hackers have also discovered the power of AI. Recent reports revealed that certain malware samples contained AI-generated code.

OpenAI, the AI company behind the popular chatbot ChatGPT, recently released its Threat Intelligence report. In it, the company highlights that it has already disrupted twenty operations and fraudulent networks from around the world that tried to use the AI model to develop malicious code.

In the report, OpenAI also analyzes how threat actors attempt to use AI. Hacker groups are said to have used AI to spread disinformation, circumvent security systems and carry out spear-phishing attacks. OpenAI gives a few examples, including the Chinese group “SweetSpecter”, as well as CyberAv3ngers and STORM-0817.

The company emphasizes that it continues to work with its intelligence, investigations, security and policy teams to anticipate how malicious actors might use advanced models for dangerous purposes and plan appropriate enforcement actions.

Source: IT Daily
