Merge pull request #24 from mattcurf/update-webui
```diff
@@ -1,6 +1,6 @@
 # ollama-intel-gpu
 
-This repo illustrates the use of Ollama with support for Intel ARC GPUs via SYCL. Run the recently released [Meta llama3.1](https://llama.meta.com/) or [Microsoft phi3](https://news.microsoft.com/source/features/ai/the-phi-3-small-language-models-with-big-potential) models on your local Intel ARC GPU based PC using Linux or Windows WSL2.
+This repo illustrates the use of Ollama with support for Intel ARC GPUs via ipex-llm. Run the recently released [deepseek-r1](https://github.com/deepseek-ai/DeepSeek-R1) model on your local Intel ARC GPU based PC using Linux or Windows WSL2.
 
 ## Screenshot
 
-![image](https://github.com/user-attachments/assets/15955f27-3abc-4a9d-8f45-48cd68204f01)
+![image](https://github.com/user-attachments/assets/f8a53b8e-e3ab-4851-8fd8-4090a3331485)
```
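The README change above swaps the featured model to deepseek-r1. Once the containers are running, the model can also be exercised directly against Ollama's HTTP API instead of through the web UI. The sketch below only builds such a request; the default port 11434 and an already-pulled `deepseek-r1` model are assumptions, not part of this diff.

```python
import json
import urllib.request

# Default Ollama endpoint (assumption: standard port, local install).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming /api/generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_generate_request("deepseek-r1", "Why is the sky blue?")
# Actually sending it requires a running server, e.g.:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```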
```diff
@@ -17,7 +17,7 @@ services:
     environment:
       - DISPLAY=${DISPLAY}
   ollama-webui:
-    image: ghcr.io/open-webui/open-webui:v0.3.35
+    image: ghcr.io/open-webui/open-webui
     container_name: ollama-webui
     volumes:
       - ollama-webui:/app/backend/data
```
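The compose change above drops the `v0.3.35` pin, so the service now tracks the image's default `latest` tag. If reproducible deployments matter more than automatic updates, the tag can be pinned again; a minimal sketch (the version number below is purely illustrative, not a recommendation from this PR):

```yaml
services:
  ollama-webui:
    # Pin a specific release for reproducibility, or omit the tag
    # to track :latest as this PR does.
    image: ghcr.io/open-webui/open-webui:v0.3.35
    container_name: ollama-webui
    volumes:
      - ollama-webui:/app/backend/data
```

With an untagged image, note that `docker compose pull ollama-webui` followed by `docker compose up -d` is what actually picks up a newer `latest`.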
```diff
@@ -15,7 +15,7 @@ services:
     environment:
       - DISPLAY=${DISPLAY}
   ollama-webui:
-    image: ghcr.io/open-webui/open-webui:v0.3.35
+    image: ghcr.io/open-webui/open-webui
     container_name: ollama-webui
    volumes:
       - ollama-webui:/app/backend/data
```