From 18370326c1f90740e2aa6fc48070ab3819ecd1f3 Mon Sep 17 00:00:00 2001
From: irfansofyana
Date: Mon, 20 Jan 2025 01:11:04 +0700
Subject: [PATCH] update post

---
 _til/2025-01-20-extra-hosts-on-docker.md | 16 +++++-----------
 1 file changed, 5 insertions(+), 11 deletions(-)

diff --git a/_til/2025-01-20-extra-hosts-on-docker.md b/_til/2025-01-20-extra-hosts-on-docker.md
index 41c5cfc..08908b2 100644
--- a/_til/2025-01-20-extra-hosts-on-docker.md
+++ b/_til/2025-01-20-extra-hosts-on-docker.md
@@ -7,26 +7,20 @@ description: extra_hosts parameter on docker-compose to connect to host machine
 tags: [containerization, docker-compose]
 ---
 
-## What I learned
+## What I Learned
 
-I'm currently exploring again how can I use LLM without internet, one of the way is using ollama.
+I've been exploring ways to run an LLM without internet access, and one approach is Ollama. My go-to LLM UI is [Open WebUI](https://github.com/open-webui/open-webui), which I manage using Docker Compose. You can see my setup in more detail [here](https://brain.irfansp.dev/#/page/openweb-ui).
 
-I have my favorite LLM UI which is [Open WebUI](https://github.com/open-webui/open-webui) and I currently use the docker-compose to manage Open WebUI, my setup can be seen [here](https://brain.irfansp.dev/#/page/openweb-ui).
+I'm looking for an alternative setup where Open WebUI keeps running in a container while Ollama runs locally on my host machine. The reason is that I have already downloaded some models on the host, and reusing them from a containerized Ollama would be much harder.
 
-Now, I want something different, which is keeping my Open WebUI setup on container but keep the Ollama on my host machine.
-
-There is the solution if we just use docker without docker-compose on this problem, and it can be seen [here](https://github.com/open-webui/open-webui#troubleshooting). But, in this case I run my openweb-ui using docker-compose so it's kinda different.
-
-After some searched, I found that we can just add parameter `extra_hosts` in the Open WebUI container so that it can connect to Ollama in my host machine, specifically adding following:
+After some research, I found that I can use the `extra_hosts` parameter in Docker Compose to map `host.docker.internal` to the host machine's gateway IP. I added the following configuration to the Open WebUI service in my `docker-compose.yml`:
 
 ```yaml
 extra_hosts:
   - "host.docker.internal:host-gateway"
 ```
 
-The extra_hosts directive adds an entry to the container’s /etc/hosts file, mapping host.docker.internal to the host’s gateway IP.
-
-and in this case, to connect my Open WebUI to Ollama on my host machine I can use
+This directive adds an entry to the container's `/etc/hosts` file, mapping `host.docker.internal` to the host machine's gateway IP. With this setup, Open WebUI can reach Ollama on my host machine via `host.docker.internal`.
 
 ## References
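
For context, here's a minimal sketch of what the full Open WebUI service could look like in `docker-compose.yml` with this change. This is not part of the patch above: the `ghcr.io/open-webui/open-webui:main` image, the `OLLAMA_BASE_URL` environment variable, the port mapping, and the volume are assumptions based on Open WebUI's standard Docker setup, and Ollama is assumed to listen on its default port 11434 on the host.

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main   # assumed standard image; adjust to your setup
    ports:
      - "3000:8080"                             # illustrative host:container port mapping
    environment:
      # Point Open WebUI at the Ollama instance running on the host machine.
      # 11434 is Ollama's default port.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      # Make host.docker.internal resolve to the host's gateway IP so the
      # container can reach services (like Ollama) running on the host.
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data            # illustrative persistent data volume

volumes:
  open-webui:
```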