
haproxy-spoe-vibes

By Trung Thanh Phan

View on Snapcraft.io
Version: 1.0
Revision: 1
License: unset
Confinement: strict
Base: core22

Local LLM REST API using llama-server and Qwen 0.5B


A fully automated deployment of a lightweight, CPU-only Large Language Model.
This snap packages llama.cpp's server with Qwen2.5-0.5B to expose an
OpenAI-compatible REST API.
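Once the snap's service is running, it should answer OpenAI-style chat completion requests. The sketch below builds and sends such a request; the port (8080, llama-server's default), the endpoint path, and the model identifier are assumptions not stated on this page, so check the snap's actual configuration before use.

```python
import json
import urllib.request


def build_chat_request(prompt, base_url="http://localhost:8080"):
    """Build an OpenAI-compatible chat completion request.

    The base URL and model name are assumptions; the snap may
    configure llama-server differently.
    """
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": "qwen2.5-0.5b",  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return url, json.dumps(payload).encode("utf-8")


def chat(prompt):
    """POST the request and return the assistant's reply text."""
    url, body = build_chat_request(prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The same request can be made with any OpenAI-compatible client library by pointing its base URL at the local server.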

Update History

1.0 (1)
30 Apr 2026, 14:15 UTC

Published: 30 Apr 2026, 14:13 UTC

Last updated: 30 Apr 2026, 14:05 UTC

First seen: 30 Apr 2026, 14:15 UTC