
GPT-4 is incredibly good at hacking unpatched vulnerabilities

  • April 26, 2024

Researchers unleashed OpenAI’s GPT-4 on a list of vulnerabilities that have not yet been patched. GPT-4 was able to exploit 87 percent of them, as long as a description of the vulnerability was provided.

How good are LLMs at hacking? Four researchers from the University of Illinois put it to the test. They unleashed GPT-4, GPT-3.5 and several open-source LLMs from the Mistral and Meta stables on a number of “one-day” vulnerabilities: flaws that have already been registered with NIST, but for which no patch is yet available and which can therefore be actively exploited. The full results are described in a paper.
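The experimental setup can be sketched roughly as follows. This is a minimal illustration, not the study’s actual harness: `query_llm` and `exploit_succeeds` are hypothetical stubs standing in for a GPT-4 API call and a sandboxed exploit check.

```python
def query_llm(prompt: str) -> str:
    """Hypothetical stand-in for a call to an LLM such as GPT-4."""
    return "#!/usr/bin/env bash\necho 'exploit attempt'"

def exploit_succeeds(script: str, target: str) -> bool:
    """Hypothetical check: run `script` against a sandboxed `target`
    and report whether the vulnerability was exploited."""
    return False

def evaluate(vulns: list[dict]) -> float:
    """Give the model each registered vulnerability description, ask it
    for an exploit script, and return the fraction of successes."""
    successes = 0
    for v in vulns:
        prompt = (
            "Write a script that exploits this vulnerability:\n"
            + v["description"]
        )
        script = query_llm(prompt)
        if exploit_succeeds(script, v["target"]):
            successes += 1
    return successes / len(vulns)

# With stubs that never succeed, the score is 0.0 -- the result most
# of the tested models achieved.
print(evaluate([{"description": "example CVE text", "target": "sandbox"}]))
```

The real study additionally gave the agent tool access (a terminal, a browser), which this sketch omits.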

In the experiment, the researchers had each LLM read the NIST description of a vulnerability and develop a script to exploit it. Most of the LLMs were out of luck and went home with a score of zero percent. OpenAI’s GPT-4, however, performed frighteningly well, failing on only two of the fifteen vulnerabilities tested: good for a score of 87 percent.

The description does seem necessary: without a clear description of the vulnerability, GPT-4 apparently has nothing to work with. For the same reason, GPT-4 is completely useless against zero-days. An LLM therefore cannot find vulnerabilities in a system on its own.

Cheaper than a hacker

With their experiment, the researchers want to warn that the explosion of powerful and versatile LLMs is not without risks. Not only do top-of-the-line models like GPT-4 appear to be good at hacking, they would also be much cheaper for malicious actors than hiring a human hacker. According to the study, a successful exploit costs approximately $8.80 per vulnerability: 2.8 times cheaper than human labor.
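The cost comparison follows directly from the two figures quoted above, assuming “2.8 times cheaper” means human labor costs 2.8 times the LLM’s cost per vulnerability:

```python
# Rough arithmetic behind the article's cost claim.
llm_cost = 8.80               # dollars per successful GPT-4 exploit (per the study)
human_cost = llm_cost * 2.8   # implied cost of a human hacker per vulnerability
print(f"${human_cost:.2f}")   # roughly $24.64
```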

Source: IT Daily
