Ollama | Manual Installation
Read documentation on manually installing Ollama outside of the Pieces Desktop App for use with PiecesOS.
Manual Installation Guide
Ollama is an optional dependency that enables local AI inference for Pieces Copilot and other AI-powered features in Pieces for Developers.
If you prefer to run LLMs on-device instead of using cloud-based AI, you will need to install Ollama manually.
This guide will walk you through the download, installation, and verification process for Ollama across Windows, macOS, and Linux.
Minimum Version Requirement
PiecesOS requires Ollama version 0.5.5 or later to ensure compatibility.
Installing the latest stable release is recommended.
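As a sketch of this compatibility check, the snippet below compares a version string against the 0.5.5 minimum using `sort -V`. The `meets_minimum` helper is hypothetical and only illustrative; PiecesOS performs its own version check internally.

```shell
# Hypothetical helper: succeeds if version $1 is at least minimum $2.
meets_minimum() {
  # sort -V orders version strings numerically; if the minimum sorts
  # first (or ties), the installed version is new enough.
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

meets_minimum "0.6.1" "0.5.5" && echo "0.6.1 is compatible"
meets_minimum "0.5.4" "0.5.5" || echo "0.5.4 is too old"
```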
Download Ollama
Visit the official Ollama website to download the latest version, then install using the platform-specific instructions below.
Ollama | macOS Installation
Follow the steps below to manually install Ollama for your macOS device.
- Download the macOS installer (`.pkg`) from the Ollama website.
- Double-click the `.pkg` file and follow the installation steps.
- After installation, verify the setup in Terminal by running `ollama --version`.
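The verification step above can also be scripted. This sketch first checks whether the `ollama` binary is on the PATH before asking for its version, so it degrades gracefully on a machine where the install did not complete:

```shell
# Verify the Ollama CLI is installed and report its version.
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "ollama not found on PATH; check the installation"
fi
```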
Verify Ollama Integration
Once installed, ensure PiecesOS can detect and use Ollama.
- Open the Pieces Quick Menu from your system tray or menu bar.
- Navigate to ML Processing.
- If Ollama is installed and recognized, it will appear under Local AI Models.
If it does not appear, restart PiecesOS and try again.
If PiecesOS still doesn’t detect Ollama, refer to the troubleshooting documentation.
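Beyond the Quick Menu, you can also confirm that the Ollama server itself is reachable. Ollama listens on localhost port 11434 by default; this sketch probes that endpoint (the port assumes a default, unmodified installation):

```shell
# Probe the default Ollama API endpoint; a short timeout keeps this quick.
if curl -fsS --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
  echo "Ollama server is reachable"
else
  echo "Ollama server is not reachable on port 11434"
fi
```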
Update Ollama
To update Ollama on macOS or Windows, you can download the latest version from the official Ollama website, update it through the Pieces Desktop App (if installed), or update it directly from the background Ollama process on your device.
On Linux, if Ollama was installed through your package manager, open your terminal and run `sudo apt update && sudo apt upgrade ollama`.
Uninstall Ollama
If you no longer need local AI models or wish to remove Ollama from your system, follow the instructions below specific to your platform.
Ollama | macOS Uninstallation
- Open your terminal and run `sudo rm -rf /usr/local/bin/ollama` to remove the Ollama binary from its default location.
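As a sketch, the removal can be guarded so it only runs when the binary actually exists at the default path. The path `/usr/local/bin/ollama` is an assumption based on a default install; adjust it if you installed Ollama elsewhere.

```shell
# Remove the Ollama binary only if it is present at the default location.
BIN="/usr/local/bin/ollama"
if [ -e "$BIN" ]; then
  sudo rm -f "$BIN" && echo "removed $BIN"
else
  echo "$BIN not found; nothing to remove"
fi
```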
Next Steps
You can read documentation about which local LLMs are currently available through Ollama and supported by PiecesOS, or see the troubleshooting documentation if you’re experiencing installation issues.