Use the start-ollama.sh script provided in the official docker image.
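For context, the quickstart edited below runs its commands inside the official ipex-llm Docker image, which bundles `/llm/scripts/start-ollama.sh`. A rough sketch of getting such a container up follows; the image name `intelanalytics/ipex-llm-inference-cpp-xpu:latest` and the GPU pass-through flags are assumptions based on Intel's docker_cpp_xpu quickstart referenced in the diff, so adjust them to your environment.

```bash
# Assumption: image name and flags mirror the docker_cpp_xpu quickstart guide.
docker run -itd \
    --net=host \
    --device=/dev/dri \
    --shm-size=16g \
    --name=ipex-llm-ollama \
    intelanalytics/ipex-llm-inference-cpp-xpu:latest

# Open a shell in the running container to execute the steps shown in the diff.
docker exec -it ipex-llm-ollama bash
```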
@@ -99,4 +99,4 @@ $ /llm/ollama/ollama -v
 
 ## References
 * [Open WebUI documentation](https://docs.openwebui.com/)
-* [Intel - ipex-llm](https://github.com/intel/ipex-llm/blob/main/docs/mddocs/DockerGuides/docker_cpp_xpu_quickstart.md)
+* [Intel ipex-llm releases](https://github.com/intel/ipex-llm/releases)
@@ -3,7 +3,6 @@
 cd /llm/scripts/
 source ipex-llm-init --gpu --device Arc
 
-mkdir -p /llm/ollama
-cd /llm/ollama
-init-ollama
-./ollama serve
+bash start-ollama.sh
+
+tail -f /dev/null
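With this change the documented startup collapses to the bundled script. A minimal sketch of the resulting in-container flow plus a verification step is shown below; it assumes `start-ollama.sh` leaves the Ollama server running in the background (which is what the trailing `tail -f /dev/null` implies), and the API call simply targets Ollama's default port 11434.

```bash
# Prepare the Intel GPU (Arc) runtime environment shipped with ipex-llm.
cd /llm/scripts/
source ipex-llm-init --gpu --device Arc

# Launch Ollama via the script bundled in the official image; this replaces the
# former manual mkdir / init-ollama / "./ollama serve" sequence.
bash start-ollama.sh

# Block forever so the container's entry process stays alive while Ollama serves.
tail -f /dev/null
```

From a second shell in the container, the server can then be checked (the binary path comes from the unchanged context in the first hunk):

```bash
# Print the Ollama version and list installed models over the standard HTTP API.
/llm/ollama/ollama -v
curl http://localhost:11434/api/tags
```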