Upgrade ollama from 0.9.3 (IPEX-LLM) to 0.15.6 (official) with Vulkan on Intel GPU

Replace the IPEX-LLM portable zip (bundling a patched ollama 0.9.3 with SYCL)
with the official ollama 0.15.6 release using the Vulkan backend for Intel GPU
acceleration. The official ollama project does not ship a SYCL backend; Vulkan
is its supported path for Intel GPUs.

- Use official ollama binary with Vulkan runner (OLLAMA_VULKAN=1)
- Strip CUDA/MLX runners from image to save space
- Add mesa-vulkan-drivers for Intel ANV Vulkan ICD
- Remove all IPEX-LLM env vars and wrapper scripts
- Simplify entrypoint to /usr/bin/ollama serve directly (Dockerfile changes sketched below)
- Clean up docker-compose.yml: remove IPEX build args and env vars
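
The Dockerfile is the other changed file and is not shown in this view. A minimal
sketch of what the bullets above amount to might look like the following; the base
image, the release tarball name, and the exact runner directories being deleted are
assumptions, not the actual file:

    FROM ubuntu:24.04
    ARG OLLAMA_VERSION=0.15.6

    # Intel ANV Vulkan ICD + loader, plus tools for fetching the release
    RUN apt-get update && \
        apt-get install -y --no-install-recommends \
            mesa-vulkan-drivers libvulkan1 ca-certificates curl && \
        rm -rf /var/lib/apt/lists/*

    # The official release tarball unpacks to bin/ollama and lib/ollama (runners included);
    # drop the GPU runners never used on Intel (directory names vary by release)
    RUN curl -fsSL "https://github.com/ollama/ollama/releases/download/v${OLLAMA_VERSION}/ollama-linux-amd64.tgz" \
        | tar -xz -C /usr && \
        rm -rf /usr/lib/ollama/cuda_*

    # Runtime env (OLLAMA_VULKAN=1, OLLAMA_HOST, ...) comes from docker-compose.yml;
    # no wrapper script any more, just the official binary
    EXPOSE 11434
    ENTRYPOINT ["/usr/bin/ollama", "serve"]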

Tested: Intel Arc Graphics (MTL) detected, 17/17 layers offloaded to Vulkan0
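
For reference, that check can be reproduced roughly like this (container name taken
from docker-compose.yml; the exact log wording differs between ollama versions):

    docker compose up -d
    docker logs -f ollama-intel-gpu 2>&1 | grep -iE 'vulkan|offload'
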
Co-authored-by: Cursor <cursoragent@cursor.com>
commit 63c3b81292
parent 8debf2010b
date 2026-02-12 15:34:03 +00:00
2 changed files with 38 additions and 83 deletions
docker-compose.yml: +5 -15
@@ -4,9 +4,7 @@ services:
       context: .
       dockerfile: Dockerfile
       args:
-        IPEXLLM_RELEASE_REPO: ipex-llm/ipex-llm
-        IPEXLLM_RELEASE_VERSON: v2.2.0
-        IPEXLLM_PORTABLE_ZIP_FILENAME: ollama-ipex-llm-2.2.0-ubuntu.tgz
+        OLLAMA_VERSION: "0.15.6"
     container_name: ollama-intel-gpu
     restart: unless-stopped
     devices:
@@ -15,27 +13,19 @@ services:
     volumes:
       - /tmp/.X11-unix:/tmp/.X11-unix
       - ollama-intel-gpu:/root/.ollama
-      - ./start-ollama.sh:/start-ollama.sh:ro
     shm_size: "16G"
     environment:
-      - ONEAPI_DEVICE_SELECTOR=level_zero:0
-      #- SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
-      #- SYCL_CACHE_PERSISTENT=1
-      - IPEX_LLM_NUM_CTX=16384
-      - LD_LIBRARY_PATH=/opt/intel/oneapi/compiler/2024.2/lib
       - DISPLAY=${DISPLAY}
-      - OLLAMA_DEFAULT_KEEPALIVE="6h"
       - OLLAMA_HOST=0.0.0.0
+      - OLLAMA_VULKAN=1
+      - OLLAMA_DEFAULT_KEEPALIVE=6h
       - OLLAMA_KEEP_ALIVE=24h
       - OLLAMA_MAX_LOADED_MODELS=1
       - OLLAMA_MAX_QUEUE=512
       - OLLAMA_MAX_VRAM=0
-      - OLLAMA_NUM_PARALLEL=1
       #- OLLAMA_NOHISTORY=false
       #- OLLAMA_NOPRUNE=false
+      - OLLAMA_NUM_PARALLEL=1
-      #- IPEXLLM_RELEASE_REPO=ipex-llm/ipex-llm
-      #- IPEXLLM_RELEASE_VERSON=v2.2.0
-      #- IPEXLLM_PORTABLE_ZIP_FILENAME=ollama-ipex-llm-2.2.0-ubuntu.tgz
     ports:
       - 11434:11434
@@ -51,7 +41,7 @@ services:
       - ${OLLAMA_WEBUI_PORT-3000}:8080
     environment:
       - OLLAMA_BASE_URL=http://ollama-intel-gpu:11434
-      - OLLAMA_DEFAULT_KEEPALIVE="6h"
+      - OLLAMA_DEFAULT_KEEPALIVE=6h
       #- OPENAI_API_BASE_URL=
       #- OPENAI_API_KEY=
       #
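
Not part of the diff, but a quick way to confirm the official build is the one
answering on the published port (the /api/version endpoint is part of ollama's
standard HTTP API):

    curl http://localhost:11434/api/version
    # should report version 0.15.6 once the new container is up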