NVIDIA and its RTX formula for the AI PC
- May 21, 2024
If until last week the AI PC was understood, more or less, as a set of basic technical characteristics that a computer must meet in order to perform artificial intelligence tasks locally, the event Microsoft held yesterday to present the Copilot+ PC elevated it to what aims to be the new paradigm driving the future of the PC, at least in the short to medium term, built on the acceleration that CPU manufacturers provide by adding NPUs to their packages.
Now, although the initial definition of the AI PC focused on the NPU and its computing capacity, the truth is that the NPU is only one of the existing options for giving computers the compute needed to competently handle AI-based tasks. And this is where NVIDIA is focusing, as we saw today at Microsoft Build 2024, with its proposal for a version of the AI PC in which that computing capacity falls on the GPU.
To that end, NVIDIA announced that in the coming months the first Copilot+ computers equipped with RTX GPUs will reach the market. These systems will, of course, be able to use all the functions Microsoft announced yesterday, but also those of NVIDIA's software ecosystem, which have been available for some time to users of the company's graphics adapters.
This NVIDIA announcement, about which you can find more information at this link, revives a very interesting debate that this technology opened some time ago, centered on the fact that at the current level of hardware development the capabilities of the GPU are higher than those offered by the NPU. It is true, yes, that in the short and medium term the latter will develop a great deal, but as the “snapshot” from the company’s presentation earlier this month shows (you can see the image above this paragraph), today the difference between the two options is more than remarkable.
Now, raw performance is not everything; developers also need the right tools to exploit that capacity as optimally as possible in their applications. And let’s not forget, we are talking about performing, on a PC, tasks that until now have been almost exclusively the domain of cloud platforms. These are the main new features NVIDIA announced today in this regard:
This movement, as I have already indicated, is part of a really interesting discussion that will undoubtedly grow in the near future. Does it make sense to limit the AI PC exclusively to systems that integrate NPUs, or should it instead be based on the total computing capacity of the system?
Personally, and although I understand both philosophies, I am currently leaning towards the model NVIDIA has outlined: a Copilot+ PC equipped with an RTX GPU, both for its greater computing capacity (in integer operations and, especially, in floating point operations) and for its greater range of functions, since to those presented by Microsoft, which will be integrated into Windows, we must add all those that NVIDIA has developed over the years since the debut of the RTX 20 generation, ranging from features focused on the world of gaming (from DLSS to ACE) to general-purpose ones such as NVIDIA Broadcast and Chat with RTX.
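To make the idea of local, GPU-accelerated inference more concrete, here is a minimal sketch of how a developer might run a small language model on an RTX GPU instead of calling a cloud service. It uses PyTorch and Hugging Face Transformers purely as an illustration; the model name and library choice are my own assumptions, not the tooling NVIDIA announced at Build 2024.

```python
# Minimal sketch: running a small language model locally on the GPU
# rather than sending the request to a cloud API.
# Assumes PyTorch and the `transformers` package are installed;
# the model name below is an illustrative choice, not an NVIDIA tool.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Use the RTX GPU if one is present, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

model_name = "microsoft/phi-2"  # hypothetical small model that fits in consumer VRAM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16
).to(device)

prompt = "Summarize why local AI inference matters for the PC:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Whatever library a developer ends up using, the key point is the same: the whole task runs on the local GPU, with no round trip to a cloud platform.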
Source: Muy Computer