Increasing computing power does not make AI smarter

November 28, 2024
Nvidia's management has long presented the ability of graphics processors to keep scaling computing performance, even as the classic "Moore's Law" slows down, as a lifeline for all of humanity. But as the explosive growth of AI systems begins to slow, new performance-scaling challenges are looming on the horizon.


As the Financial Times has noted, for many in Silicon Valley "Moore's Law" has been replaced by a new concept: the "scaling laws" of artificial intelligence. Until recently, it was widely believed that scaling up computing infrastructure and feeding it ever larger volumes of data would produce qualitative improvements in AI systems; in effect, this was expected to make AI "smarter". As a result, the major technology companies have spent several consecutive quarters aggressively expanding the computing power of their data centers.
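For context, the "scaling laws" referred to here are usually stated as empirical power-law fits relating a model's training loss to its size and data budget. A commonly cited form (shown here as an illustration, not a formula given in the article) is:

L(N, D) ≈ E + A / N^α + B / D^β

where N is the number of model parameters, D the number of training tokens, E an irreducible loss floor, and A, B, α, β empirically fitted constants. The expectation described above was that pushing N and D ever higher would keep driving the loss down and, with it, keep making models "smarter".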

Previously, it was assumed that data-center performance would keep growing at its current rate until a "superintelligence" emerged: a system able to surpass human intelligence while still being built on software algorithms. Only in recent weeks have experts begun to express concern that the latest large language models from OpenAI, Google and Anthropic are not delivering the progress that earlier trends suggested.

Ilya Sutskever, a co-founder of OpenAI who has since left the company, recently said: "The 2010s were an age of scale, but now we are back in an age of discovery and curiosity." Remarkably, only a year ago Sutskever was confident that the entire surface of the Earth would need to be covered with solar panels to power countless data centers.

Many market participants agree that the phase of simply training language models at ever greater scale is coming to an end, and that the industry must move on to a next phase to maintain the current pace of progress. Microsoft CEO Satya Nadella believes the slowdown in training large language models does not particularly limit that pace, since artificial intelligence systems are gaining the ability to reason over their answers. According to Nvidia founder Jensen Huang, even a reduced need for computing resources to train language models would not mean lower demand for the company's products: AI developers will race to shorten the time it takes their systems to respond to users' questions, and in Huang's view that race will require even more hardware, which is good for Nvidia's business. Microsoft president Brad Smith, for his part, believes market demand for accelerator chips will continue to grow for at least another year.

However, for artificial intelligence systems to move to a new stage of development, genuinely useful business applications will have to emerge. That remains a problem: any innovation ultimately has to deliver material benefits, and the economic impact of AI in its current form is still far from clear in many sectors. This has not stopped the tech giants from investing heavily in expanding their computing resources. According to Morgan Stanley, the combined capital expenditures of Microsoft, Amazon, Google and Meta should exceed $200 billion this year and will likely exceed $300 billion next year.

Source: Port Altele
