Hugging Face will distribute ten million dollars' worth of GPU computing power to small AI developers. In this way, the company wants to help smaller players compete with the big companies.
Hugging Face is stepping in to help smaller AI developers build AI applications. Developing such applications can be expensive, as it requires scarce and costly GPU computing power. Big players like AWS and OpenAI have access to huge AI supercomputers, while smaller players have to rent that computing power. That is why Hugging Face is making ten million dollars' worth of GPU computing power available for free.
Start-ups and academics
The computing power is intended for start-ups, small developers and academics. Hugging Face, which was created as a kind of GitHub for AI and ML models, wants to counteract the centralization of AI. This is of course in the interests of Hugging Face itself, which plays a central role in a large and diverse AI ecosystem.
"We are fortunate to be able to invest in the community," Hugging Face CEO Clem Delangue told The Verge. Hugging Face is now more or less self-funded and is valued at $4.5 billion after a new $235 million investment round.
ZeroGPU
However, the company fears that the high cost of AI development will undermine diversity in the ecosystem. Major innovations would then come only from players with deep pockets, who keep their breakthroughs to themselves. In addition, it is not easy for smaller players to obtain the necessary computing power from cloud providers, as the entry costs are high.
The freely available computing power is meant to address this. Hugging Face calls the sponsorship program ZeroGPU, and the computing power is made available via Hugging Face Spaces. The pool is shared across users and applications so that ZeroGPU is used as efficiently as possible. On the server side, the program runs on Nvidia A100 GPUs.
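To give an idea of what this could look like in practice, below is a minimal sketch of a Space that might run on ZeroGPU. It assumes the spaces Python package and its spaces.GPU decorator, plus Gradio and a Diffusers model, none of which are described in this article; the idea is that a GPU from the shared pool is only attached for the duration of the decorated function and released afterwards.

import gradio as gr
import spaces
from diffusers import DiffusionPipeline

# Load the model once at startup; the GPU itself is only attached later.
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0")
pipe.to("cuda")

@spaces.GPU  # assumed decorator: borrows a shared A100 for this call only
def generate(prompt: str):
    # The GPU is allocated when the function starts and released when it returns.
    return pipe(prompt).images[0]

gr.Interface(fn=generate, inputs=gr.Text(), outputs=gr.Image()).launch()

Because the GPU is only held while a request is being processed, many Spaces can time-share the same pool of A100s, which is what makes the sponsored computing power stretch further than dedicating a card to each application.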