Revert to ipex-llm version of ollama for gpu acceleration

Matt Curfman 2024-08-16 22:41:15 -07:00
parent aec98f6f30
commit 40313a7364
4 changed files with 17 additions and 9 deletions


@@ -10,6 +10,8 @@ RUN apt update && \
gnupg \
wget \
curl \
python3 \
python3-pip \
ocl-icd-libopencl1
# Intel GPU compute user-space drivers
@@ -34,21 +36,21 @@ RUN wget -qO - https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-P
tee /etc/apt/sources.list.d/oneAPI.list && \
apt update && \
apt install --no-install-recommends -q -y \
intel-oneapi-runtime-libs
intel-basekit
# Required oneAPI environment variables
ENV USE_XETLA=OFF
ENV SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
ENV SYCL_CACHE_PERSISTENT=1
# Ollama
RUN curl -fsSL https://ollama.com/install.sh | sh
COPY _init.sh /usr/share/lib/init_workspace.sh
COPY _run.sh /usr/share/lib/run_workspace.sh
# Ollama via ipex-llm
RUN pip3 install --pre --upgrade ipex-llm[cpp]
ENV OLLAMA_NUM_GPU=999
ENV OLLAMA_HOST=0.0.0.0:11434
# https://github.com/ollama/ollama/issues/1590
ENV OLLAMA_INTEL_GPU=1
ENTRYPOINT ["/usr/local/bin/ollama", "serve"]
ENTRYPOINT ["/bin/bash", "/usr/share/lib/run_workspace.sh"]

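With the ENTRYPOINT above, the image serves the ipex-llm build of Ollama on port 11434 (per OLLAMA_HOST). As a minimal sketch of running it directly rather than through the docker-compose file used below, assuming the image is tagged ollama-intel-gpu (a hypothetical name) and the Intel GPU is exposed to containers via /dev/dri:

$ docker build -t ollama-intel-gpu .                                  # hypothetical tag
$ docker run --rm --device /dev/dri:/dev/dri -p 11434:11434 ollama-intel-gpu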

@@ -30,6 +30,9 @@ $ docker-compose -f docker-compose-wsl2.yml up
Then point your web browser to http://localhost:3000 to open the web UI. Create a local Open WebUI credential, click the settings icon in the top right of the screen, select 'Models', click 'Show', and download a model such as 'llama3.1:8b-instruct-q8_0', which fits in the 16GB of VRAM on an Intel Arc A770.
# Known issues
* Little effort has been made to prune the packages pulled into the Ollama Docker image for Intel GPUs, so the resulting image is larger than it needs to be
# References
* https://github.com/ollama/ollama/issues/1590
* https://github.com/ollama/ollama/pull/3278
* https://dgpu-docs.intel.com/driver/client/overview.html
* https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/ollama_quickstart.html

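The model download described in the README change above can also be driven from the command line against the Ollama API on port 11434; a minimal sketch, assuming the compose stack is running and that port is published to the host:

$ curl http://localhost:11434/api/pull -d '{"name": "llama3.1:8b-instruct-q8_0"}'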
_init.sh (new file)

@@ -0,0 +1 @@
source /opt/intel/oneapi/setvars.sh

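_init.sh only sources the oneAPI environment. To confirm from inside the container that the Intel GPU is actually visible to the SYCL runtime, one option is the sycl-ls tool (assuming the intel-basekit install above provides it):

$ source /opt/intel/oneapi/setvars.sh
$ sycl-ls        # the Intel GPU should appear as a Level Zero / OpenCL device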
_run.sh (new file)

@@ -0,0 +1,2 @@
source /usr/share/lib/init_workspace.sh
/usr/local/lib/python3.10/dist-packages/bigdl/cpp/libs/ollama serve
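_run.sh hardcodes the Python 3.10 dist-packages path to the ollama binary that ipex-llm bundles. A slightly more defensive sketch (not what this commit ships) would locate the binary at startup, so a Python minor-version bump does not break the entrypoint:

source /usr/share/lib/init_workspace.sh
# Find the ollama binary bundled by ipex-llm instead of hardcoding the Python 3.10 path
OLLAMA_BIN=$(find /usr/local/lib -path '*/bigdl/cpp/libs/ollama' -type f 2>/dev/null | head -n 1)
exec "${OLLAMA_BIN:?bundled ollama binary not found}" serve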