Commit Graph

17 Commits

63c3b81292 Upgrade ollama from 0.9.3 (IPEX-LLM) to 0.15.6 (official) with Vulkan Intel GPU
Replace the IPEX-LLM portable zip (bundling a patched ollama 0.9.3 with SYCL)
with the official ollama 0.15.6 release using the Vulkan backend for Intel GPU
acceleration. The official ollama project does not ship a SYCL backend; Vulkan
is their supported path for Intel GPUs.

- Use official ollama binary with Vulkan runner (OLLAMA_VULKAN=1)
- Strip CUDA/MLX runners from image to save space
- Add mesa-vulkan-drivers for Intel ANV Vulkan ICD
- Remove all IPEX-LLM env vars and wrapper scripts
- Simplify entrypoint to /usr/bin/ollama serve directly
- Clean up docker-compose.yml: remove IPEX build args and env vars

Tested: Intel Arc Graphics (MTL) detected, 17/17 layers offloaded to Vulkan0
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-12 15:34:03 +00:00
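The cleaned-up compose service this commit describes might look roughly like the following sketch. The service name, image tag, and device path are illustrative assumptions, not taken from the repository; only `OLLAMA_VULKAN=1` and the plain `ollama serve` entrypoint come from the commit message itself.

```yaml
services:
  ollama:
    image: ollama/ollama:0.15.6   # official release, replacing the IPEX-LLM bundle
    environment:
      - OLLAMA_VULKAN=1           # select the Vulkan runner for Intel GPU acceleration
    devices:
      - /dev/dri:/dev/dri         # expose the Intel GPU render nodes to the container
    ports:
      - "11434:11434"
    # entrypoint is simply `/usr/bin/ollama serve`; no IPEX-LLM wrapper scripts
```

Inside the image, `mesa-vulkan-drivers` supplies the Intel ANV Vulkan ICD, and the CUDA/MLX runners are stripped to keep the image small.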
8debf2010b Fix ollama not reachable from host due to hardcoded OLLAMA_HOST in entrypoint
The IPEX-LLM bundled start-ollama.sh hardcodes OLLAMA_HOST=127.0.0.1 and
OLLAMA_KEEP_ALIVE=10m, overriding docker-compose environment variables and
preventing external connections through Docker port mapping.

- Add custom start-ollama.sh that honours env vars with sensible defaults
- Mount it read-only into the container
- Fix LD_LIBRARY_PATH env var syntax (: -> =)
- Add .gitignore for IDE/swap/webui data files
- Update CHANGELOG and README with fix documentation

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-12 15:18:37 +00:00
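A replacement `start-ollama.sh` of the kind this commit describes can be sketched as below: it honours `OLLAMA_HOST` and `OLLAMA_KEEP_ALIVE` from the environment and only falls back to defaults when they are unset. The exact default values are assumptions for illustration; the key point is binding to `0.0.0.0` rather than the hardcoded `127.0.0.1`, so connections arriving through Docker port mapping are accepted.

```shell
#!/bin/sh
# Hypothetical start-ollama.sh: use env vars from docker-compose if set,
# otherwise fall back to defaults (assumed values, shown for illustration).
: "${OLLAMA_HOST:=0.0.0.0:11434}"   # listen on all interfaces, not 127.0.0.1
: "${OLLAMA_KEEP_ALIVE:=10m}"       # default model keep-alive
export OLLAMA_HOST OLLAMA_KEEP_ALIVE
echo "OLLAMA_HOST=${OLLAMA_HOST} OLLAMA_KEEP_ALIVE=${OLLAMA_KEEP_ALIVE}"
```

The real script would end with `exec /usr/bin/ollama serve`; mounting it read-only over the bundled IPEX-LLM script prevents the hardcoded values from taking effect.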
96913a2a18 Update Intel GPU stack and ipex-llm to latest available versions
- level-zero v1.22.4 -> v1.28.0
- IGC v2.11.7 -> v2.28.4
- compute-runtime 25.18.33578.6 -> 26.05.37020.3
- libigdgmm 22.7.0 -> 22.9.0
- ipex-llm ollama nightly 2.3.0b20250612 -> 2.3.0b20250725
- Docker compose: disable webui auth, stateless webui volume
- README formatting and GPU model update

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-12 15:00:53 +00:00
0a7f974c04 Update Docker configurations and Intel GPU runtimes for improved performance 2025-06-21 00:35:20 +01:00
17592946fa Update Docker configurations for deployment improvements
Revised the `IPEXLLM_RELEASE_REPO` value and adjusted file and path references for consistency. Updated `docker-compose.yml` with refined environment variables, device mappings, and restart policies, and added the necessary port bindings for better functionality and maintainability.
2025-04-22 17:56:04 +01:00
Adam Gibson
504a1d388f Update default to ipex-llm v2.2.0 (guide for v2.3.0-nightly in docs) 2025-04-16 21:25:38 +08:00
Matt Curfman
61288f5f6c Update to ipex-llm-2.2.0b20250313 2025-03-17 10:44:00 -07:00
Adam Gibson
451f91080c Revert compose to cached .tgz by default. 2025-03-17 19:29:53 +08:00
Adam Gibson
1e92fbe888 Updates to allow latest ollama in compose file, with fallback to cached in Dockerfile (if no build args provided) 2025-03-16 16:47:45 +08:00
Adam Gibson
2c82aed59c Update compose file with build args 2025-03-16 16:10:20 +08:00
Matt Curfman
fa579db492 Increase context window size 2025-02-19 15:26:01 -08:00
Matt Curfman
2fc526511f Update to use new ipex portable .zip packages 2025-02-19 14:56:56 -08:00
Matt Curfman
b74bab0b6a Update to latest open-webui releases 2025-01-23 17:28:33 -08:00
Pepijn de Vos
2e18d91cd7 Update webui 2024-11-07 21:11:57 +01:00
Matt Curfman
698d574fce Update to web ui v0.3.10 2024-08-01 22:08:02 -07:00
mattcurf
025b1b0fc9 Misc. cleanups 2024-04-30 14:38:23 -07:00
mattcurf
2daa02e8f4 Initial version 2024-04-29 17:19:07 -07:00