
ChatGPT costs $700,000 a day to run

  • April 27, 2023


Each query costs about 0.36 cents. According to one analysis, running ChatGPT costs OpenAI $694,444 daily.

The world is under the spell of ChatGPT, but running such an AI model costs money. First, the model has to be trained on huge amounts of data through machine learning to arrive at a Large Language Model (LLM) such as GPT-4, the latest model that ChatGPT runs on. SemiAnalysis did the math for GPT-3 and arrived at a training cost of $841,346, assuming 175 billion parameters and 300 billion training tokens on Nvidia A100 GPUs.
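SemiAnalysis does not publish its full model in this article, but the order of magnitude can be roughly reproduced with standard back-of-the-envelope assumptions. All of the constants below except the parameter and token counts are my assumptions, not figures from the source: ~6 FLOPs per parameter per token, an A100 peak of 312 TFLOPS in BF16, roughly one-third hardware utilization, and about $1 per A100-hour of cloud compute.

```python
# Rough reproduction of the GPT-3 training-cost estimate.
# Assumptions (mine, not from the article): 6 FLOPs per parameter per
# token, 312 TFLOPS A100 peak (BF16), ~33% utilization, ~$1 per A100-hour.

params = 175e9          # 175 billion parameters (from the article)
tokens = 300e9          # 300 billion training tokens (from the article)

total_flops = 6 * params * tokens          # ~3.15e23 FLOPs in total
effective_flops = 312e12 * 0.33            # per GPU, after utilization

gpu_hours = total_flops / effective_flops / 3600   # ~850,000 A100-hours
cost = gpu_hours * 1.00                            # at ~$1 per A100-hour

print(f"~{gpu_hours:,.0f} A100-hours, roughly ${cost:,.0f}")
```

With these assumed inputs the sketch lands within a few percent of the quoted $841,346; SemiAnalysis's exact inputs will differ, but the order of magnitude is the point.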

Important note: these are pure compute costs, not the price of the hardware itself or the salaries of the data scientists who tinker with it. GPT-4 was trained on an even larger data set than GPT-3, which drives up costs. Sam Altman, CEO of OpenAI, nevertheless sees the returns from ever-larger models gradually diminishing. All in all, training GPT-4 cost about $100 million, including hardware and personnel costs, he said.

He claims that scaling up the model size has limited benefits. OpenAI must also consider the physical limits of how many data centers the company can build and how quickly they can be built.

Feeding the model daily

Training is one thing; keeping the AI model running to answer the countless questions it receives also costs money. According to the same SemiAnalysis report, we can expect ChatGPT to cost $694,444 daily. Divide that by the average number of queries entered and each query costs around 0.36 cents.
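Taking the daily figure at face value and a per-query cost of 0.36 cents ($0.0036, the value SemiAnalysis's numbers work out to), the implied daily query volume is easy to back out. The volume itself is not stated in the article, so this is a sketch, not a reported figure:

```python
# Back out the implied daily query volume from the two cost figures.
daily_cost = 694_444        # dollars per day (from the article)
cost_per_query = 0.0036     # 0.36 cents, expressed in dollars

queries_per_day = daily_cost / cost_per_query   # ~190 million queries

print(f"~{queries_per_day / 1e6:.0f} million queries per day")
```

Roughly 190 million queries a day is plausible for a service that had reached some 100 million users at the time, which is why the per-query figure is best read as 0.36 cents rather than 36 cents.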

This makes Microsoft’s investment in OpenAI, the company behind ChatGPT, all the more important. In January it poured another $10 billion into the company, on top of hundreds of millions already invested in hardware to train the neural network and to run it on its Bing platform and in other applications, such as Microsoft Copilot in Word and PowerPoint or Business Chat in Teams.

Bing wants to gain market share

As of last week, ChatGPT-style replies are also showing up in Bing search results, further driving up costs. SemiAnalysis presents a cost analysis for that scenario too, using Google as the example: the search giant processed an average of 320,000 searches per second in 2022. If it served answers from Bard, its own ChatGPT rival, with every search result, that would cost it $36 billion.
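The article does not say which period the $36 billion covers. Assuming it is an annual figure (my assumption, not the source's), the per-search cost implied by these two numbers can be checked:

```python
# Implied per-search cost for Google, assuming (my assumption) that the
# $36 billion is an annual cost.
searches_per_sec = 320_000              # 2022 average (from the article)
seconds_per_year = 365 * 24 * 3600      # 31,536,000

searches_per_year = searches_per_sec * seconds_per_year   # ~1.0e13
annual_cost = 36e9                      # $36 billion (from the article)

cost_per_search = annual_cost / searches_per_year

print(f"~{cost_per_search * 100:.2f} cents per search")
```

That lands on the same ~0.36 cents per query that SemiAnalysis quotes for ChatGPT, which suggests the $36 billion is indeed a yearly figure.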

Add to this an investment of more than $100 billion to buy 512,820 Nvidia A100 HGX servers with 4,102,568 A100 GPUs on board, and you understand why AI will not replace the search engine just yet.


It makes sense for Microsoft to make this investment in Bing search results, according to CEO Satya Nadella: “For every point of market share our search engine gains, there’s a $2 billion revenue opportunity from advertising.” With Bing at just 2.88% versus Google’s 93.17% (via Statcounter), you can understand why Microsoft is now pulling out all the stops to grab valuable share.

Nadella wants to challenge Google to the maximum during this time. “This new Bing is forcing Google to go out and dance. I want people to know that we made them dance.”

Developing its own AI chips

Buying chips from Nvidia is a quick fix today, but in the long term Microsoft would rather put its own AI chips in its Azure servers. It has reportedly been working on them since 2019, long before today’s ChatGPT hype. Codenamed “Athena,” the chip is said to be more efficient than those from third-party suppliers. Moreover, Microsoft would no longer have to buy the chips externally, which has a positive effect on the final bill.

Microsoft doesn’t plan to become the new Nvidia; the AI specialist’s chips remain relevant. Athena is a necessary extra to support future workloads. Microsoft is currently testing the first Athena chips itself, together with OpenAI. The goal is to launch the first generation of Athena in 2024, with new generations to follow.

Source: IT Daily
