Google warns that we will need a lot of RAM to run AI on smartphones
April 1, 2024
Artificial intelligence consumes a huge amount of RAM when it runs locally, a reality that applies to most current AI models regardless of the device. This goes a long way toward explaining why Google decided to limit Gemini AI to the Pixel 8 Pro, since that handset has a larger amount of RAM.
We already had the chance to discuss this topic last year, when we told you that Android smartphones would need a lot of RAM to run AI models locally and that the minimum requirement for generative AI models would be around 12 GB of RAM. That figure is exactly what the Pixel 8 Pro ships with, while the standard model only has 8 GB of RAM.
In a recent video from the “Made by Google” series, the Mountain View giant explained that it decided to limit the launch of Gemini AI to the Pixel 8 Pro because, with 12 GB of RAM, the smartphone was the ideal platform for it. By this, Google means that this RAM configuration was the minimum needed to offer a good user experience with Gemini Nano, a version of Gemini AI adapted to the constraints of a smartphone.
According to Google, it could adapt Gemini Nano to run on the Pixel 8, but that version would offer a “degraded” user experience, clearly worse than the same model running on the Pixel 8 Pro, which is why it ultimately decided to keep it exclusive to the latter. Even so, Google has confirmed that it has released a Gemini Nano build for the Pixel 8 limited to developers, because it believes their feedback can help improve and optimize the model.
The video also makes clear that RAM will remain key to the future of AI, and that as these features improve incrementally they will need more and more RAM to run optimally. For example, according to Seang Chau, Google's vice president of devices and software services, the company is working on AI-powered features, such as smart reply, that are kept permanently loaded in RAM.
This means these functions will consume more RAM, but in exchange they will always be ready to use and deliver optimal performance and user experience. Keeping a large amount of RAM permanently occupied on a smartphone that does not have enough of this resource can cause significant slowdowns and seriously hurt performance.
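As a rough illustration of how an app might respect such a RAM floor, here is a minimal Kotlin sketch that checks the device's reported memory through Android's standard ActivityManager.MemoryInfo API before enabling a hypothetical always-resident AI feature. This is not Google's actual gating logic; the threshold simply mirrors the 12 GB figure quoted above, and the helper name is invented for the example.

```kotlin
import android.app.ActivityManager
import android.content.Context

// Assumed threshold echoing the 12 GB figure Google cites for Gemini Nano.
// The kernel reserves some memory, so totalMem reports slightly less than the
// marketed capacity; the check is relaxed to ~11 GiB to account for that.
private const val MIN_TOTAL_RAM_BYTES = 11L * 1024 * 1024 * 1024

/**
 * Returns true if the device reports enough total RAM to keep an
 * always-resident, on-device AI feature loaded without starving other apps.
 */
fun hasEnoughRamForResidentAi(context: Context): Boolean {
    val activityManager =
        context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val memoryInfo = ActivityManager.MemoryInfo()
    activityManager.getMemoryInfo(memoryInfo)

    // totalMem: total RAM exposed by the kernel, in bytes (API 16+).
    // lowMemory: true when the system is already under memory pressure.
    return memoryInfo.totalMem >= MIN_TOTAL_RAM_BYTES && !memoryInfo.lowMemory
}
```

An app following this pattern would fall back to a cloud-backed or on-demand version of the feature when the check fails, rather than pinning a large model in memory on an 8 GB device.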