OpenELM is a family of small open-source language models that can run locally on a single device. The new models were released by Apple on Hugging Face.
Apple has launched OpenELM (Open-Source Efficient Language Models), a new family of small open-source AI models designed to run on a single device, in contrast to other LLMs that require a connection to cloud servers. OpenELM comprises eight language models: four pre-trained and four instruction-tuned. The release offers further insight into what Apple is currently doing in the AI space. Apple has made OpenELM publicly available in the Hugging Face AI community.
Local open-source models
Unlike most large language models, which connect to cloud servers, the new OpenELMs run locally on a single device. Apple has published a total of eight OpenELMs on Hugging Face, four of which are pre-trained and four of which are instruction-tuned. The models range in size from 270 million to 3 billion parameters.
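As an illustration, the checkpoints can be loaded locally with the Hugging Face transformers library. The repository names below match Apple's uploads on Hugging Face (270M, 450M, 1.1B, and 3B variants, each with an instruction-tuned counterpart); the `largest_model_for_budget` helper and the loading sketch are illustrative additions, not part of Apple's release:

```python
# The eight OpenELM checkpoints Apple published on Hugging Face:
# four pre-trained and four instruction-tuned, from 270M to 3B parameters.
OPENELM_MODELS = {
    "apple/OpenELM-270M": 270_000_000,
    "apple/OpenELM-450M": 450_000_000,
    "apple/OpenELM-1_1B": 1_100_000_000,
    "apple/OpenELM-3B": 3_000_000_000,
    "apple/OpenELM-270M-Instruct": 270_000_000,
    "apple/OpenELM-450M-Instruct": 450_000_000,
    "apple/OpenELM-1_1B-Instruct": 1_100_000_000,
    "apple/OpenELM-3B-Instruct": 3_000_000_000,
}

def largest_model_for_budget(max_params: int, instruct: bool = False) -> str:
    """Pick the largest checkpoint that fits a parameter budget.
    (Hypothetical helper for choosing a model per device, not Apple's API.)"""
    candidates = {
        name: size for name, size in OPENELM_MODELS.items()
        if name.endswith("-Instruct") == instruct and size <= max_params
    }
    return max(candidates, key=candidates.get)

def load_openelm(repo_id: str):
    """Sketch of a local load via transformers; the modeling code lives in the
    Hugging Face repo, so trust_remote_code=True is required."""
    from transformers import AutoModelForCausalLM  # assumes transformers is installed
    return AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# A phone-class device might only fit the smaller checkpoints:
print(largest_model_for_budget(500_000_000))  # → apple/OpenELM-450M
```

On a laptop with enough memory, `load_openelm("apple/OpenELM-3B-Instruct")` would fetch the largest instruction-tuned variant; the smaller checkpoints are the ones intended for phone-class hardware.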
Sample Code License
Apple offers the OpenELMs under a "Sample Code License," which does not prohibit commercial use or modification. Alongside the license, Apple provides various training checkpoints, statistics, and instructions for pre-training, evaluation, instruction tuning, and parameter-efficient fine-tuning.
The OpenELMs are designed to run entirely on a smartphone or laptop. WWDC 2024 should make clear what possibilities OpenELM opens up.