My employer has a few LLMs available internally via OpenAI-compatible endpoints. They also have an internal-only VSCode extension that’s configured to use those endpoints and provide code completion and a chat interface. Unfortunately, I don’t like using VSCode, plus I find the extension’s interface to be rather limiting. I haven’t fully embraced using an LLM as a coding assistant and instead prefer to use it as a rubber ducky for the most part, with the occasional round of “help me debug this thing that is hard to find on Google because Google sucks now”. So I figured that I’d run Open WebUI locally and configure it to the internal endpoint. Sadly, the endpoint uses a self-signed certificate, and Open WebUI doesn’t have an easy way to disable SSL verification. As such, it unceremoniously fails with

[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain

There’s one resolved issue on GitHub about it, but the recommended workarounds require source code changes. Plus, the issue talks about httpx, whereas the latest tag of their Docker image uses aiohttp, and neither library supports disabling SSL verification via environment variables. The change itself is rather simple: every ClientSession constructor needs to be given a connector with verify_ssl set to False, as follows:

aiohttp.ClientSession(connector=aiohttp.TCPConnector(verify_ssl=False))
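As an aside, newer aiohttp releases deprecate verify_ssl in favor of passing ssl=False (or an explicit SSLContext) to TCPConnector. Either way, what “verification disabled” boils down to is a context with hostname checking and certificate validation turned off, which can be sketched with the standard library alone:

```python
import ssl

# Build an SSLContext equivalent to "SSL verification disabled":
# no hostname check, no certificate chain validation.
ctx = ssl.create_default_context()
ctx.check_hostname = False       # must be disabled before changing verify_mode
ctx.verify_mode = ssl.CERT_NONE  # accept self-signed / untrusted certificates

# In aiohttp, this context (or simply ssl=False) would be passed along, e.g.:
# aiohttp.ClientSession(connector=aiohttp.TCPConnector(ssl=ctx))
```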

So I cloned their GitHub repo, made the change to backend/open_webui/routers/openai.py, and ran docker build ., which promptly failed because NodeJS ran out of heap memory. And then I thought: hey, this is Python. There’s no compilation needed, the source file must be in the Docker image in some form, so what if I just modify that?

Turns out, the file is just copied to /app/backend/open_webui/routers/openai.py. My first attempt was to add a volume mount to the container, mounting my local routers folder at that path. That failed with some error about the CACHE line in audio.py in the same folder, so there’s some pre-processing happening somewhere. Or I got my Docker tags mixed up; unclear. I didn’t dig any further because I had an even better idea: why not generate a patch file with my changes and write a new Dockerfile that FROMs the official image and applies the patch on top? And that’s exactly what I did. I ran git diff > patches/openai.patch to generate the patch file, and then wrote a new Dockerfile:

FROM ghcr.io/open-webui/open-webui:latest
COPY patches /opt/patches
RUN cd /app && git apply /opt/patches/openai.patch
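For completeness, the patch file is just ordinary git diff output. A purely hypothetical sketch of its shape follows; the surrounding lines here are illustrative, not the actual Open WebUI source, and the real hunk headers and context will differ:

```diff
--- a/backend/open_webui/routers/openai.py
+++ b/backend/open_webui/routers/openai.py
@@ ... @@
-    session = aiohttp.ClientSession()
+    session = aiohttp.ClientSession(
+        connector=aiohttp.TCPConnector(verify_ssl=False)
+    )
```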

After that, it was a matter of running docker build . -t open-webui:patched and writing a docker-compose file that points at the newly built local image. And I had a working Open WebUI without OpenAI endpoint SSL verification!
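The docker-compose file itself is nothing special. A minimal sketch, where the host port, volume name, and the internal endpoint URL are assumptions to adjust for your own setup:

```yaml
services:
  open-webui:
    image: open-webui:patched   # the locally built image, not the ghcr.io one
    ports:
      - "3000:8080"             # Open WebUI listens on 8080 inside the container
    volumes:
      - open-webui-data:/app/backend/data   # persist chats and settings
    environment:
      # hypothetical internal endpoint; replace with your employer's URL
      - OPENAI_API_BASE_URL=https://llm.internal.example/v1

volumes:
  open-webui-data:
```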