Use the docker image provided by Intel directly, removing the Dockerfile.
Dockerfile
@@ -1,8 +0,0 @@
-FROM intelanalytics/ipex-llm-inference-cpp-xpu:latest
-
-ENV DEBIAN_FRONTEND=noninteractive
-ENV OLLAMA_HOST=0.0.0.0:11434
-
-COPY ./scripts/serve.sh /usr/share/lib/serve.sh
-
-ENTRYPOINT ["/bin/bash", "/usr/share/lib/serve.sh"]
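Since nothing is built locally any more, the upstream image is simply pulled; a minimal sketch with podman (pulling ahead of time is optional, as compose should fetch a missing image on its first up):

```bash
# Fetch Intel's prebuilt IPEX-LLM inference image instead of building a local one.
$ podman pull intelanalytics/ipex-llm-inference-cpp-xpu:latest
```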
README.md
@@ -48,42 +48,25 @@ When using Open WebUI, you should see this partial output in your console, indic
 * Open your web browser to http://localhost:3000 to access the Open WebUI web page.
 * For more information on using Open WebUI, refer to the official documentation at https://docs.openwebui.com/ .
 
-## Updating the images
+## Updating the containers
 
+If there are new updates in the [ipex-llm-inference-cpp-xpu](https://hub.docker.com/r/intelanalytics/ipex-llm-inference-cpp-xpu) docker Image or in the Open WebUI docker Image, you may want to update your containers, to stay up to date.
+
 Before any updates, be sure to stop your containers
 ```bash
 $ podman compose down
 ```
 
-### ollama-intel-arc Image
-If there are new updates in the [ipex-llm docker image](https://hub.docker.com/r/intelanalytics/ipex-llm-inference-cpp-xpu), you may want to update the Ollama image and containers, to stay updated.
-
-First check any containers running the docker image, and remove them
-```bash
-$ podman ps -a
-CONTAINER ID   IMAGE
-111479fde20f   localhost/ollama-intel-arc:latest
-
-$ podman rm <CONTAINER ID>
-```
-
-The go ahead and remove the docker image:
-```bash
-$ podman image list
-REPOSITORY                   TAG
-localhost/ollama-intel-arc   latest
-
-$ podman rmi <IMAGE ID>
-```
-After that, you can run compose up, to rebuild the image from scratch
-```bash
-$ podman compose up
-```
-### open-webui Image
-If there are new updates in Open WebUI, just do a pull and the new changes will be retrieved automatically.
+Then just run a pull command to retrieve the `latest` images.
 ```bash
 $ podman compose pull
 ```
 
+
+After that, you can run compose up to start your services again.
+```bash
+$ podman compose up
+```
+
 ## Manually connecting to your Ollama container
 You can connect directly to your Ollama container by running these commands:
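With the README simplified this way, a full refresh is just three compose commands run in order; a consolidated sketch, assuming podman's compose support is installed and the compose file sits in the current directory:

```bash
# Stop the stack, fetch the latest image tags, then bring everything back up.
$ podman compose down
$ podman compose pull
$ podman compose up
```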
@@ -1,8 +1,7 @@
 version: '3'
 services:
   ollama-intel-arc:
-    build: .
-    image: ollama-intel-arc:latest
+    image: intelanalytics/ipex-llm-inference-cpp-xpu:latest
     container_name: ollama-intel-arc
     restart: unless-stopped
     devices:
@@ -11,6 +10,15 @@ services:
       - ollama-volume:/root/.ollama
     ports:
       - 11434:11434
+    environment:
+      - no_proxy=localhost,127.0.0.1
+      - OLLAMA_HOST=0.0.0.0
+      - DEVICE=Arc
+      - OLLAMA_INTEL_GPU=true
+      - OLLAMA_NUM_GPU=999
+      - ZES_ENABLE_SYSMAN=1
+    command: sh -c 'mkdir -p /llm/ollama && cd /llm/ollama && init-ollama && exec ./ollama serve'
+
   open-webui:
     image: ghcr.io/open-webui/open-webui:latest
     container_name: open-webui
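Given the 11434:11434 port mapping and the OLLAMA_HOST=0.0.0.0 setting added above, Ollama should be reachable from the host once the service is up; a small smoke-test sketch (the curl calls are an illustration, not part of the commit):

```bash
# After `podman compose up`, Ollama should answer on the published port.
$ curl http://localhost:11434/          # returns "Ollama is running" when healthy
$ curl http://localhost:11434/api/tags  # JSON list of locally pulled models
```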
scripts/serve.sh
@@ -1,8 +0,0 @@
-#!/bin/sh
-
-cd /llm/scripts/
-source ipex-llm-init --gpu --device Arc
-
-bash start-ollama.sh
-
-tail -f /dev/null
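The removed wrapper's job of initializing the Arc GPU and launching Ollama is now carried by the compose command: and environment: entries, so routine restarts go through compose instead; a hedged sketch of the day-to-day commands (the detached start and log-follow are assumptions, the container name comes from the compose file):

```bash
# Start only the Ollama service in the background and watch its startup output.
$ podman compose up -d ollama-intel-arc
$ podman logs -f ollama-intel-arc
```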