Update to latest open-webui releases

Matt Curfman
2025-01-23 17:28:33 -08:00
parent 07e8a24b3a
commit b74bab0b6a
3 changed files with 3 additions and 3 deletions
README.md: +1 -1

```diff
@@ -1,6 +1,6 @@
 # ollama-intel-gpu
-This repo illlustrates the use of Ollama with support for Intel ARC GPU based via SYCL. Run the recently released [Meta llama3.1](https://llama.meta.com/) or [Microsoft phi3](https://news.microsoft.com/source/features/ai/the-phi-3-small-language-models-with-big-potential) models on your local Intel ARC GPU based PC using Linux or Windows WSL2.
+This repo illlustrates the use of Ollama with support for Intel ARC GPU based via ipex-llm. Run the recently released [deepseek-r1](https://github.com/deepseek-ai/DeepSeek-R1) model on your local Intel ARC GPU based PC using Linux or Windows WSL2.
 ## Screenshot
 ![screenshot](doc/screenshot.png)
```
+1 -1

```diff
@@ -17,7 +17,7 @@ services:
     environment:
       - DISPLAY=${DISPLAY}
   ollama-webui:
-    image: ghcr.io/open-webui/open-webui:v0.3.35
+    image: ghcr.io/open-webui/open-webui
    container_name: ollama-webui
     volumes:
       - ollama-webui:/app/backend/data
```
+1 -1

```diff
@@ -15,7 +15,7 @@ services:
     environment:
       - DISPLAY=${DISPLAY}
   ollama-webui:
-    image: ghcr.io/open-webui/open-webui:v0.3.35
+    image: ghcr.io/open-webui/open-webui
    container_name: ollama-webui
     volumes:
       - ollama-webui:/app/backend/data
```
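Note that dropping the `:v0.3.35` suffix makes Docker resolve the image to the implicit `latest` tag on each pull, so the deployed open-webui version now floats with upstream releases. If a reproducible deployment is preferred, the tag can be re-pinned; a minimal compose sketch (service and volume names taken from the diff, the tag value is a placeholder, not a recommendation):

```yaml
services:
  ollama-webui:
    # pin an explicit release instead of the implicit "latest" tag
    image: ghcr.io/open-webui/open-webui:v0.5.4   # example tag, substitute the desired release
    container_name: ollama-webui
    volumes:
      - ollama-webui:/app/backend/data
```

After changing the tag, `docker compose pull ollama-webui && docker compose up -d` picks up the new image.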