Fix Ollama being unreachable from host due to hardcoded OLLAMA_HOST in entrypoint

The start-ollama.sh bundled with IPEX-LLM hardcodes OLLAMA_HOST=127.0.0.1 and
OLLAMA_KEEP_ALIVE=10m, overriding the docker-compose environment variables and
preventing external connections through the Docker port mapping.

- Add custom start-ollama.sh that honours env vars with sensible defaults
- Mount it read-only into the container
- Fix LD_LIBRARY_PATH env var syntax (: -> =)
- Add .gitignore for IDE/swap/webui data files
- Update CHANGELOG and README with fix documentation

Co-authored-by: Cursor <cursoragent@cursor.com>
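A replacement entrypoint along these lines would let compose-provided values take precedence while still supplying defaults. This is a hedged sketch, not the exact script from the commit; the 6h fallback mirrors the OLLAMA_DEFAULT_KEEPALIVE value set in the compose file below:

```shell
#!/bin/sh
# Sketch of a start-ollama.sh that honours environment variables instead
# of hardcoding them. The ${VAR:=default} expansion assigns the default
# only when the variable is unset or empty, so values passed in via
# docker-compose win.
: "${OLLAMA_HOST:=0.0.0.0}"      # bind all interfaces so port mapping works
: "${OLLAMA_KEEP_ALIVE:=6h}"     # keep models loaded; mirrors the compose file
export OLLAMA_HOST OLLAMA_KEEP_ALIVE

echo "Starting ollama on ${OLLAMA_HOST} (keep-alive: ${OLLAMA_KEEP_ALIVE})"

# A real entrypoint would now exec the server, e.g.:
# exec ollama serve   # binary path inside the IPEX-LLM image may differ
```

Mounting this over the bundled script (as the compose change below does with `:ro`) means the image's own entrypoint logic never gets a chance to clobber the variables.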
Date:   2026-02-12 15:18:37 +00:00
Parent: 96913a2a18
Commit: 8debf2010b
5 changed files with 58 additions and 2 deletions
@@ -15,10 +15,14 @@ services:
volumes:
- /tmp/.X11-unix:/tmp/.X11-unix
- ollama-intel-gpu:/root/.ollama
- ./start-ollama.sh:/start-ollama.sh:ro
shm_size: "16G"
environment:
- ONEAPI_DEVICE_SELECTOR=level_zero:0
#- SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
#- SYCL_CACHE_PERSISTENT=1
- IPEX_LLM_NUM_CTX=16384
- LD_LIBRARY_PATH:/opt/intel/oneapi/compiler/2024.2/lib
- LD_LIBRARY_PATH=/opt/intel/oneapi/compiler/2024.2/lib
- DISPLAY=${DISPLAY}
- OLLAMA_DEFAULT_KEEPALIVE="6h"
- OLLAMA_HOST=0.0.0.0
@@ -38,7 +42,8 @@ services:
ollama-webui:
image: ghcr.io/open-webui/open-webui:latest
container_name: ollama-webui
#volumes:
volumes:
- ./webui/data:/app/backend/data
# - ollama-webui:/app/backend/data
depends_on:
- ollama-intel-gpu
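With OLLAMA_HOST=0.0.0.0 in effect, reachability from the host can be checked against the mapped port. A sketch; 11434 is Ollama's default port, and the host-side port this compose file actually maps is an assumption here:

```shell
#!/bin/sh
# Query the Ollama HTTP API from the host. 11434 is Ollama's default
# port; adjust to whatever the compose file maps it to.
STATUS=$(curl -fsS --max-time 5 http://localhost:11434/api/version 2>/dev/null \
  || echo "unreachable")
echo "ollama: ${STATUS}"
```

Before the fix, the same check succeeds from inside the container but fails from the host, because the server is bound to 127.0.0.1 inside the container's network namespace.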