diff --git a/README.md b/README.md
index 322ee59..5cdb798 100644
--- a/README.md
+++ b/README.md
@@ -29,6 +29,31 @@ $ docker compose up
 
 Then launch your web browser to http://localhost:3000 to launch the web ui. Create a local OpenWeb UI credential, then click the settings icon in the top right of the screen, then select 'Models', then click 'Show', then download a model like 'llama3.1:8b-instruct-q8_0' for Intel ARC A770 16GB VRAM
 
+## Update to the latest IPEX-LLM Portable Zip Version
+
+To update to the latest portable zip version of IPEX-LLM's Ollama, set the compose file's `IPEXLLM_PORTABLE_ZIP_FILENAME` build argument to the latest `ollama-*.tgz` release asset from https://github.com/intel/ipex-llm/releases/tag/v2.2.0-nightly, then rebuild the image. A sketch of the relevant compose configuration is shown at the end of this README.
+
 # References
 * https://dgpu-docs.intel.com/driver/client/overview.html
 * https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_portablze_zip_quickstart.md
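+
+## Example: Setting the Portable Zip Build Argument
+
+A minimal sketch of the update described above, assuming a compose service named `ollama` with its Dockerfile in the repo root; the service name and filename below are placeholders, so match them to your actual compose file and the latest release asset:
+
+```yaml
+services:
+  ollama:
+    build:
+      context: .
+      args:
+        # Placeholder; substitute the latest ollama-*.tgz asset name from the release page
+        IPEXLLM_PORTABLE_ZIP_FILENAME: "ollama-ipex-llm-<version>-ubuntu.tgz"
+```
+
+Then rebuild and restart:
+
+```sh
+$ docker compose build
+$ docker compose up
+```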