Create your own private AI cluster with this software
- July 19, 2024
No Nvidia GPUs at home to run LLMs? Don’t worry: this software tool turns the GPUs in your laptop and smartphone into a powerful AI cluster.
Running LLMs typically requires servers with powerful graphics chips, and demand for those GPUs far outstrips the available supply. But what if you could use the GPUs in your smartphone or laptop to run large AI models? It’s a lot less complex than it sounds.
The software tool exo-explore combines the GPUs in everyday devices into one powerful virtual GPU and is publicly available on GitHub. It works on Android, macOS and Linux devices, and a short demo video shows it in action.
Exo-Explore pools the GPUs of multiple devices into a single AI cluster. The software scans your network for connected devices with a suitable GPU and distributes the workload across them in proportion to each device’s available memory. You then talk to the LLM through an API.
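To give an idea of what that looks like in practice, the sketch below sends a prompt to a running cluster over HTTP. It assumes the software exposes an OpenAI-style chat endpoint on the local machine; the port, path and model name are placeholders for illustration, so check the exo-explore README for the exact values.

```python
# Minimal sketch: sending a prompt to a local AI cluster over HTTP.
# The endpoint, port and model name are assumptions for illustration,
# not guaranteed values -- consult the exo-explore documentation.
import json
import urllib.request

API_URL = "http://localhost:52415/v1/chat/completions"  # assumed OpenAI-style endpoint

payload = {
    "model": "llama-3.2-3b",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Explain what an AI cluster is in one sentence."}
    ],
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request and print the model's reply.
with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())
    print(reply["choices"][0]["message"]["content"])
```

Because the API follows the familiar chat-completions format, existing tools and scripts written for cloud LLM services should need little more than a changed base URL to point at the local cluster instead.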
To be clear: this is an experimental tool, so don’t expect it to run completely smoothly. The lack of Windows support is no small limitation, and a smartphone plus a laptop won’t be enough to run the most powerful LLMs in your living room. For hobbyists, though, it offers an accessible way to experiment with private AI.
Source: IT Daily