OpenAI is developing a tool to detect AI-generated images

May 9, 2024

The capabilities of AI image-generation tools have advanced to the point where people can mistake their output for real, non-AI images. This raises concerns about possible misuse.

OpenAI has already introduced watermarks for images produced by DALL-E 3 to maintain transparency about their origin. The company is now developing a tool that can distinguish real images from those generated by DALL-E 3.

OpenAI’s generative AI tools will include C2PA metadata

The official OpenAI blog announced that the company is working on new methods for detecting AI-generated content. According to the company, the goal is to help researchers examine content provenance, and OpenAI has joined the Steering Committee of the Coalition for Content Provenance and Authenticity (C2PA), a widely used standard for certifying digital content. C2PA allows creators to tag and certify their content to verify its true origin.
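In JPEG files, C2PA manifests are carried in JUMBF boxes inside APP11 marker segments. As a rough illustration (this is not OpenAI's tooling), here is a minimal sketch that scans a JPEG's segment list for an APP11 marker. Finding one only hints that provenance metadata may be present; actually validating a C2PA manifest requires a dedicated C2PA library.

```python
import struct

def has_app11_segment(jpeg_bytes: bytes) -> bool:
    """Return True if the JPEG contains an APP11 (0xFFEB) segment,
    the marker segment where C2PA embeds its JUMBF manifest.

    Presence of APP11 only *suggests* provenance metadata; verifying
    a real C2PA manifest and its signatures needs a C2PA library.
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):  # SOI: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker in (0xD9, 0xDA):  # EOI or SOS: no more metadata segments
            break
        (seg_len,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
        if marker == 0xEB:  # APP11: JUMBF / C2PA container
            return True
        i += 2 + seg_len  # skip marker (2 bytes) plus segment body
    return False
```

A C2PA-signed image would return True here, while a plain JPEG with no provenance segments returns False; in practice the next step would be parsing and cryptographically verifying the manifest itself.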

OpenAI says it will also integrate C2PA metadata into Sora, the company's text-to-video model, when it launches widely. Like DALL-E 3, Sora is likely to be a premium tool accessible only to paid subscribers. According to a previously published report, Sora will be available to the public in 2024.

OpenAI creates new tool to discover content created by DALL-E 3

As mentioned above, OpenAI is also working on a new tool that uses artificial intelligence to detect images created by DALL-E 3. More specifically, it estimates the probability that a given image was generated by the model. According to the company, detection still works after an image has been compressed, had its saturation changed, or been cropped, and the tool is designed to be resilient to attempts to strip content-provenance signals.

The detection tool achieves 98% accuracy on images created with DALL-E 3 and, more importantly, does not flag non-AI images as AI-generated.
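OpenAI has not published its evaluation code, but the two headline numbers (accuracy on AI images, and no false alarms on real ones) correspond to a detector's true-positive and false-positive rates. A minimal sketch with hypothetical scores, where each score is the tool's estimated probability that an image is AI-generated:

```python
def detection_rates(scores, labels, threshold=0.5):
    """Compute (true_positive_rate, false_positive_rate) for a detector.

    scores: estimated probability that each image is AI-generated.
    labels: 1 for AI-generated images, 0 for real (non-AI) images.
    """
    preds = [s >= threshold for s in scores]
    on_ai = [p for p, l in zip(preds, labels) if l == 1]
    on_real = [p for p, l in zip(preds, labels) if l == 0]
    tpr = sum(on_ai) / len(on_ai)      # share of AI images correctly flagged
    fpr = sum(on_real) / len(on_real)  # share of real images wrongly flagged
    return tpr, fpr

# Hypothetical scores for 4 AI images and 4 real ones (illustrative only)
scores = [0.97, 0.91, 0.88, 0.42, 0.05, 0.12, 0.03, 0.08]
labels = [1, 1, 1, 1, 0, 0, 0, 0]
tpr, fpr = detection_rates(scores, labels)  # → (0.75, 0.0)
```

In OpenAI's reported terms, a true-positive rate near 0.98 together with a false-positive rate near zero would match the claim of 98% accuracy without flagging non-AI images.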

The company has already opened applications for access to the new image-detection tool for a first round of testing. The initial group includes research laboratories and nonprofit investigative-journalism organizations, and OpenAI aims to collect feedback through its Investigative Access Program.

Source: Port Altele
