llama.cpp (5882+dfsg-3) unstable; urgency=medium

  * Upload to unstable
  * Package description fixes

 -- Christian Kastner  Wed, 27 Aug 2025 07:01:15 +0200

llama.cpp (5882+dfsg-3~exp3) experimental; urgency=medium

  [ Christian Kastner ]
  * Switch over to SOVERSIONed, dynamic-backend-loading ggml
  * libllama0: Drop spurious python3 dependency

  [ Mathieu Baudier ]
  * llama.cpp-tools: Introduce bash completion

 -- Christian Kastner  Thu, 07 Aug 2025 12:43:22 +0200

llama.cpp (5882+dfsg-3~exp2) experimental; urgency=medium

  * Correct the Section field of a few packages

 -- Christian Kastner  Mon, 14 Jul 2025 18:54:14 +0200

llama.cpp (5882+dfsg-3~exp1) experimental; urgency=medium

  * Split llama.cpp package into subpackages
  * Build new package python3-gguf
  * d/rules: Pass in LLAMA_BUILD_{NUMBER,COMMIT}
  * Add gguf-py-depends-on-the-requests-library.patch
  * Add Add-soversion-to-libraries.patch
  * Rename private directories llama.cpp -> llama
  * Improve llama-server theme handling
  * Generate manpages using help2man

 -- Christian Kastner  Mon, 14 Jul 2025 17:17:43 +0200

llama.cpp (5882+dfsg-2) unstable; urgency=medium

  * Build-Depend on the exact version of ggml, for the same reason the
    binaries depend on the exact version: it avoids FTBFS caused by
    frequent API/ABI breakages

 -- Christian Kastner  Sun, 13 Jul 2025 11:16:13 +0200

llama.cpp (5882+dfsg-1) unstable; urgency=medium

  * New upstream version 5882+dfsg
  * Rebase patches
  * Fix broken path to llama-server theme
  * Bump ggml dependency
  * d/gbp.conf: Convert to DEP-14 layout
  * d/gbp.conf: Enforce non-numbered patches
  * Update d/copyright

 -- Christian Kastner  Sat, 12 Jul 2025 17:31:41 +0200

llama.cpp (5760+dfsg-4) unstable; urgency=medium

  * Fix installability yet again (ggml version still mis-specified)
    (Closes: #1108925)

 -- Christian Kastner  Tue, 08 Jul 2025 08:44:50 +0200

llama.cpp (5760+dfsg-3) unstable; urgency=medium

  * Fix installability (ggml version was mis-specified)
  * Improve lintian overrides

 -- Christian Kastner  Mon, 07 Jul 2025 18:27:22 +0200

llama.cpp (5760+dfsg-2) unstable; urgency=medium

  * Hard-code (relaxed) ggml dependency.
    We can't deduce the supported ggml version; the maintainers must
    explicitly specify it. In doing so, ignore the Debian revision number

 -- Christian Kastner  Fri, 27 Jun 2025 22:13:39 +0200

llama.cpp (5760+dfsg-1) unstable; urgency=medium

  * New upstream version 5760+dfsg (Closes: #1108368)
    - Includes a fix for CVE-2025-52566
  * Refactor/add missing copyrights for vendored code
  * Refresh patches

 -- Christian Kastner  Fri, 27 Jun 2025 07:55:00 +0200

llama.cpp (5713+dfsg-1) unstable; urgency=medium

  * New upstream release (Closes: #1108113)
    - Includes a fix for CVE-2025-49847
  * Refresh patches
  * Update d/copyright
  * Document ggml/llama.cpp/whisper.cpp update procedure
  * Install the new mtmd headers

 -- Christian Kastner  Fri, 20 Jun 2025 21:00:33 +0200

llama.cpp (5318+dfsg-2) unstable; urgency=medium

  [ Mathieu Baudier ]
  * Install public headers and build configurations to private directories
  * Fix private directories for pkg-config

  [ Christian Kastner ]
  * Depend on the exact build-time ggml version.
    Upstream ships llama.cpp with a specific version of ggml. We have no
    guarantee that any earlier or later version will work; in fact, it is
    common for newer versions to break something. So going forward, we ship
    llama.cpp and ggml in tandem, with ggml being updated first, and
    llama.cpp depending on the exact version used at build time.
  * Install all free server themes
  * Enable changing the server theme using update-alternatives
  * Simplify server frontend patches
  * Begin shipping the tests

 -- Christian Kastner  Thu, 19 Jun 2025 23:17:31 +0200

llama.cpp (5318+dfsg-1) unstable; urgency=medium

  * Upload to unstable
  * New upstream version 5318+dfsg
    - Refresh patches
  * Update d/copyright

 -- Christian Kastner  Fri, 09 May 2025 09:54:32 +0200

llama.cpp (5151+dfsg-1~exp3) experimental; urgency=medium

  * Initial release (Closes: #1063673)

 -- Christian Kastner  Sat, 19 Apr 2025 21:59:05 +0200