ChatGPT maker OpenAI is exploring options to develop its own AI chip, possibly through an acquisition or a partnership.
OpenAI is weighing the possibility of building its own AI chip, whether through an acquisition or through closer collaboration with existing chip manufacturers such as Nvidia. According to Reuters, the company has been evaluating various options since last year to address the AI chip shortage and the high cost of hardware. OpenAI has not yet responded officially to the report.
CEO Sam Altman has repeatedly complained in public that AI chips are extremely hard to obtain in a market dominated by Nvidia, which holds roughly 80 percent market share.
Building its own AI chips would address two problems: the high cost per chip and the heavy power consumption of current GPUs. Since 2020, OpenAI has trained its generative AI models on a giant Microsoft supercomputer that contains 10,000 Nvidia GPUs.
4 cents per query
ChatGPT is expensive for OpenAI to operate. According to analyst firm Bernstein, each query costs an average of four cents. If ChatGPT were to handle a tenth of the query volume of search giant Google, it would require an estimated $48.1 billion worth of GPUs and about $16 billion in chips per year to keep running.
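A rough sanity check of those figures can be sketched with simple arithmetic. Note that the Google search volume used here (about 8.5 billion queries per day, a commonly cited estimate) is an assumption, not a number from the article:

```python
# Back-of-the-envelope check of the Bernstein-style figures quoted above.
# Assumption (not from the article): Google handles ~8.5 billion searches/day.

GOOGLE_SEARCHES_PER_DAY = 8.5e9   # assumed estimate, not from the article
CHATGPT_SHARE = 0.10              # "a tenth of the size of ... Google"
COST_PER_QUERY_USD = 0.04         # Bernstein's average cost per query

queries_per_day = GOOGLE_SEARCHES_PER_DAY * CHATGPT_SHARE
daily_cost = queries_per_day * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Queries per day: {queries_per_day:,.0f}")       # 850,000,000
print(f"Daily cost:      ${daily_cost / 1e6:,.0f}M")    # $34M
print(f"Annual cost:     ${annual_cost / 1e9:,.1f}B")   # $12.4B
```

Under these assumptions the running cost lands at roughly $12.4 billion per year, the same order of magnitude as the $16 billion annual chip figure cited above.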
With numbers like that, it makes sense for OpenAI to look at alternatives. On the other hand, developing a custom AI chip can easily cost hundreds of millions of dollars per year, with no guarantee of success. An acquisition could speed things up, as it did for Amazon, which bought Annapurna Labs for its AWS business in 2015.
Whatever OpenAI decides, it won’t reap the benefits for years. Microsoft, OpenAI’s biggest backer, is also working on its own AI chip for Azure. It is not yet clear whether the two companies might cooperate on that effort.