description: extra_hosts parameter on docker-compose to connect to host machine
tags: [containerization, docker-compose]
---

## What I Learned

I've been exploring again how to use an LLM without an internet connection, and one way to do that is with Ollama. My go-to LLM UI is [Open WebUI](https://github.com/open-webui/open-webui), which I manage with Docker Compose; you can see my setup in more detail [here](https://brain.irfansp.dev/#/page/openweb-ui).

This time I want a slightly different setup: keep Open WebUI running in a container, but run Ollama directly on my host machine. The main reason is that I've already downloaded several models on my host, and reusing them would be much harder if Ollama itself ran as a container.

If you run Open WebUI with plain docker rather than docker-compose, there is already a documented solution for this [in the troubleshooting guide](https://github.com/open-webui/open-webui#troubleshooting). In my case, though, Open WebUI runs under docker-compose, so the setup is slightly different.
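
For reference, the plain-docker approach boils down to passing the same host mapping as a `docker run` flag. A minimal sketch, assuming the image name, port mapping, and volume from the Open WebUI README (check the upstream docs for the exact command):

```bash
# Run Open WebUI with plain docker (no docker-compose).
# --add-host gives the container a host.docker.internal entry that
# resolves to the host's gateway IP, so it can reach Ollama on the host.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```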

After some research, I found that I can simply add the `extra_hosts` parameter to the Open WebUI service in Docker Compose to map `host.docker.internal` to the host machine's gateway IP. I added the following to my `docker-compose.yml`:

```yaml
extra_hosts:
- "host.docker.internal:host-gateway"
```
The `extra_hosts` directive adds an entry to the container's `/etc/hosts` file, mapping `host.docker.internal` to the host machine's gateway IP. With this in place, Open WebUI can reach Ollama on my host machine at <http://host.docker.internal:11434>.
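
For context, here is roughly what the relevant part of the `docker-compose.yml` ends up looking like. Treat it as a sketch: the service name, image, port mapping, volume, and the `OLLAMA_BASE_URL` variable are assumptions based on the Open WebUI documentation rather than my exact file.

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                # UI available at http://localhost:3000
    volumes:
      - open-webui:/app/backend/data
    environment:
      # Point Open WebUI at the Ollama instance on the host machine.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      # Make host.docker.internal resolve to the host's gateway IP.
      - "host.docker.internal:host-gateway"

volumes:
  open-webui:
```

After `docker compose up -d`, the containerized Open WebUI talks to Ollama (and the models already downloaded) on the host.
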
## References
