Experiments with large language models (LLMs) on home computers have become increasingly popular, especially since the rise of DeepSeek. Apple Silicon Macs, depending on their RAM and processor configuration, are well suited for this purpose. Apple has released MLX, a framework for accelerating machine learning (ML) on ARM-based Macs. This “array framework for Apple Silicon” aims to make machine learning on current Macs particularly efficient. A new free app on the Mac App Store specializes in models built on this technology.
Pico AI Homelab, from Starling Protocol Inc., runs on macOS 15 (Sequoia) and lets users experiment with different models easily. Currently, over 300 models are available. These include various versions of the distilled DeepSeek R1, as well as models such as Mistral, Meta Llama, Alibaba Qwen, Google Gemma, and Microsoft Phi, all adapted for MLX, which improves performance compared to GGUF variants. Pico AI Homelab is compatible with Ollama and uses its API, allowing integration with alternative chat apps such as Open WebUI, MindMac, or Ollamac. The application runs as a local HTTP server (localhost), so chats take place in the browser or in one of these clients. The entire system operates offline, with no data sent to the internet, and Pico AI Homelab does not collect user information.
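Because the app speaks the Ollama API, any Ollama-compatible client can talk to it. A minimal Python sketch of such a client, using only the standard library; the port (11434, Ollama's default) and the model name `deepseek-r1` are assumptions here, so check the app's settings and your installed models before running:

```python
import json
import urllib.request

# Assumption: the local server listens on Ollama's default port 11434.
BASE_URL = "http://localhost:11434"

def build_chat_request(model, prompt):
    """Build an Ollama-style /api/chat payload for a single user message."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete reply instead of a token stream
    }

def chat(model, prompt):
    """POST the payload to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["message"]["content"]

if __name__ == "__main__":
    # Requires Pico AI Homelab (or Ollama) running locally with the model pulled.
    print(chat("deepseek-r1", "Explain MLX in one sentence."))
```

Since everything goes through localhost, the request never leaves the machine, which is what makes the offline, no-data-collection setup possible.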
Pico AI Homelab runs on all Apple Silicon Macs from the M1 onward. A minimum of 16 GB of RAM is required, though 32 GB or more is highly beneficial for larger language models. No command-line skills are needed to use the app. “Thanks to the guided one-click installation, even beginners can quickly get started, while experienced users benefit from flexible customization options,” the developers state.
The app is currently completely free, though it is unclear whether this will change in the future. There are no API costs or subscription fees from AI providers. However, one should not expect too much from local LLMs: because of the significantly lower computing power compared to server-hosted models, the outputs are weaker and hallucinations are more frequent. Nonetheless, local models can be enjoyable to experiment with.
The app provides an opportunity to explore the capabilities of language models on personal devices. It allows users to engage with AI without the need for extensive technical knowledge or significant financial investment. With the growing interest in AI and machine learning, Pico AI Homelab offers a platform for both beginners and advanced users to explore and learn.
While the performance of local models may not match server-based solutions, the convenience and privacy of running models locally make them an attractive option for many users, and the ability to try a wide range of models and configurations offers useful insight into what current AI technology can do. Overall, Pico AI Homelab is a step toward making advanced AI tools accessible to a broader audience. By leveraging Apple Silicon and the MLX framework, it lets users explore machine learning in a user-friendly environment, whether for education, personal interest, or professional development.