Version: v0.18.0
Revision: 112
Size: 2662.8 MB
License: MIT
Confinement: strict
Base: core24
Categories: Development, Science

Get up and running with large language models, locally.


Ollama – Local AI Model Runner

Run, manage, and switch between a wide range of open‑source LLMs directly on your local machine. Ollama provides fast, offline inference with a simple CLI and API, ensuring your data never leaves the device. Ideal for developers, researchers, and anyone who wants powerful AI without cloud dependencies.
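As a minimal sketch of the API mentioned above, the snippet below assembles a generation request for Ollama's local REST endpoint (default port 11434). The model name `llama3.2` and the prompt are example values; a local Ollama server must be running before the request is actually sent.

```python
import json

def build_generate_request(model: str, prompt: str, stream: bool = False):
    """Return the URL and JSON body for a call to Ollama's /api/generate endpoint."""
    url = "http://localhost:11434/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()
    return url, body

# Example payload; POST it with any HTTP client once `ollama serve` is running.
url, body = build_generate_request("llama3.2", "Why is the sky blue?")
print(url)
```

With `stream` set to `False`, the server returns a single JSON object instead of a stream of partial responses, which is simpler for quick scripts.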

Update History

v0.17.4 (109) → v0.18.0 (112)
18 Mar 2026, 13:33 UTC
v0.15.1 (105) → v0.17.4 (109)
10 Mar 2026, 20:33 UTC

Published: 9 Feb 2024, 19:35 UTC

Last updated: 17 Mar 2026, 20:23 UTC

First seen: 13 Dec 2025, 09:47 UTC