
https://www.xataka.com/robotica-e-ia/lanzamiento-nuevo-superchip-ia-nvidia-deja-pregunta-donde-estan-sus-chips-ia-para-pcs-moviles

March 19, 2024


Jensen Huang was in his element yesterday. Knowing that his competitors are still scrambling, tongues out, to gain ground, he offered, among other things, his new artificial intelligence chips: the B200, built on the Blackwell architecture.

These computing monsters represent a clear quantitative leap: from the 80 billion transistors of the celebrated Hopper-based H100 to 208 billion transistors. NVIDIA says this makes them up to 2.5 times more powerful than their predecessors, which will undoubtedly make them highly sought-after products in this age of artificial intelligence, where processing capacity makes all the difference.
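As a rough back-of-the-envelope check, the transistor figures quoted above line up loosely with the performance claim. A minimal sketch in Python, using only the numbers in this article (real performance depends on far more than raw transistor count):

```python
# Figures quoted in the article.
h100_transistors = 80e9    # Hopper H100
b200_transistors = 208e9   # Blackwell B200 (two dies working as a single GPU)

ratio = b200_transistors / h100_transistors
print(f"B200 has {ratio:.1f}x the transistors of the H100")  # ~2.6x
# NVIDIA's "2.5x more powerful" claim is in the same ballpark, although
# performance never maps one-to-one onto transistor count.
```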

NVIDIA’s GTC 2024 event also brought other important news, such as Earth-2, the digital twin of our planet for weather analysis; Project GR00T for humanoid robots; and a new quantum simulation platform.

All of it is promising, and clear confirmation that the company wants to capitalize on its current momentum and continue to dominate the data centers on which we depend for countless processes, both professional and everyday. But NVIDIA appears to be forgetting something.

Its users.

In fact, it is particularly striking that there was no news about chips for running AI applications natively on our devices, the other, slowly emerging pillar of artificial intelligence.

In the first months of 2024 we are starting to see where things could go. PC, laptop, and mobile phone manufacturers are leaning on AI, perhaps excessively, to sell us their new devices. The approaches so far are interesting but relatively modest (transcribing conversations, translating in real time, producing summaries), yet everything indicates that these features will go further.

To do that, however, it will be important to have dedicated chips. Just as we have graphics cards for games, it is reasonable to think that AI accelerator cards would be useful for such applications. It is true that current SoCs from Apple, Intel, AMD, or Qualcomm integrate dedicated cores (usually known as NPUs, Neural Processing Units), but these have to share die space with the rest of the components, and in some cases that may not be enough.
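To ground the idea: on-device AI applications already have to negotiate which of these accelerators they get to run on. A minimal, hedged sketch using ONNX Runtime (assuming the onnxruntime package is installed and that a hypothetical local model.onnx file exists) that prefers a dedicated NPU or GPU and falls back to the CPU:

```python
import onnxruntime as ort

# Execution providers this machine actually exposes (NPU, GPU, CPU...).
available = ort.get_available_providers()
print(available)

# Preference order: dedicated NPU first, then GPU, then plain CPU.
preferred = [
    "QNNExecutionProvider",     # Qualcomm NPUs
    "CoreMLExecutionProvider",  # Apple Neural Engine / GPU
    "DmlExecutionProvider",     # DirectML (Windows GPUs and NPUs)
    "CUDAExecutionProvider",    # NVIDIA RTX GPUs
    "CPUExecutionProvider",     # last resort, always available
]
providers = [p for p in preferred if p in available]

# "model.onnx" is a placeholder for whatever model the app actually ships.
session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```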


This Blackwell compute node features two Grace CPUs and four B200 GPUs with the new Blackwell architecture. Its AI performance can reach 80 petaFLOPS, according to the company. A staggering figure.
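As a quick sanity check on that figure (assuming, as is usual in NVIDIA's marketing, that it refers to low-precision AI throughput):

```python
node_pflops = 80       # AI performance claimed for the whole node
gpus_per_node = 4      # four B200 GPUs per node
print(node_pflops / gpus_per_node, "petaFLOPS per B200")  # -> 20.0
```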

And then there is NVIDIA, which has a magnificent opportunity to become indispensable in PCs, laptops, tablets, mobile phones, and almost every other device, and for which it does not seem to have any solution. It is true that its RTX graphics cards can also accelerate these workloads, but they are not specifically designed for them. And therein lies the crux of the problem.

This issue was raised many moons ago by Groq (not to be confused with Grok), a startup specializing in language processing that designs precisely this kind of AI chip, which it calls LPUs (Language Processing Units). Or, to put it another way: ChatGPT accelerators.

Its proposal is striking, especially because it opens the door to a new trend: our computers or smartphones would have not only a CPU and a GPU but also an LPU, making interaction with chatbots such as ChatGPT, Copilot, or Gemini particularly fast.
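What an LPU would really be optimizing is token throughput and latency. A hedged sketch of how you might measure that today against any OpenAI-compatible chatbot endpoint (the base URL, API key, and model name below are placeholders for whatever backend you actually run):

```python
import time
from openai import OpenAI

# Placeholder endpoint: point it at a local server or a hosted API you use.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

start = time.time()
chunks = 0
stream = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize today's meeting notes."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        chunks += 1  # roughly one small piece of text per streamed chunk

elapsed = time.time() - start
print(f"~{chunks / elapsed:.1f} chunks/s")  # the number an LPU tries to push up
```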

However, NVIDIA does not seem interested in this segment for now. That is partly understandable: thanks to chips like the H100, the company is taking off like a rocket, and it is safe to predict it will keep doing so with the new B200. Even so, no one is in a better position to offer this kind of solution.

Maybe the company has bad memories of its earlier adventures outside the world of graphics cards. NVIDIA's Tegra chips were SoCs much like those Qualcomm, Apple, or MediaTek make today: they combined an ARM CPU and a GPU and focused on efficient multimedia processing.

That experiment did not turn out too badly for them: Tegra chips ended up powering the Nintendo Switch.

But there have been no significant developments or improvements since then, beyond reports in 2021 that production of Tegra chips would end, although it seems to have been restarted with the launch of the Switch OLED. Thanks to the success of these consoles, NVIDIA has sold millions of chips, so... why not strengthen this product line?

Only NVIDIA knows. The truth is that its clear focus on dedicated graphics cards, and now on AI chips for data centers, seems to rule out any plan for AI accelerator chips for our mobile phones or computers. Only time will tell whether that is a mistake, but it already looks like an interesting area in which the company could launch its own offering.

In Xataka | Supermicro is the hidden giant of the artificial intelligence era, and it is growing faster than NVIDIA

Source: Xataka
