diff --git a/README.md b/README.md
index f438800..fe5e136 100644
--- a/README.md
+++ b/README.md
@@ -140,7 +140,7 @@ NVIDIA offers free inference endpoints for powerful models like Moonshot AI's Ki
 - Copy the entire Python code snippet.
 - Paste the code into a chat with OpenClaw and use this prompt:
 ```
-Hey, here's the Python code from NVIDIA Build for the Kimi model (with my API key already included).
+Hey, here's the Python code from NVIDIA Build to call an API with a model (my API key already included).
 1. Test the code to make sure it works.
 2. Configure this model as a fallback for my primary model.
 ```
diff --git a/docker/Dockerfile b/docker/Dockerfile
index e47efe1..3413e5b 100644
--- a/docker/Dockerfile
+++ b/docker/Dockerfile
@@ -1,5 +1,5 @@
-ARG OPENCLAW_VERSION=2026.3.22
-ARG OLLAMA_VERSION=0.18.2
+ARG OPENCLAW_VERSION=2026.3.28
+ARG OLLAMA_VERSION=0.19.0
 ARG NODE_VERSION=24

 # Use the official Ollama image to get the binary
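For context on the README hunk above: NVIDIA Build exposes its models through an OpenAI-compatible chat-completions endpoint, and the "Python code snippet" the README tells users to copy is a call against that endpoint. The sketch below, using only the standard library, shows the general shape of such a request — the endpoint path, model id (`moonshotai/kimi-k2-instruct`), and field names are assumptions for illustration; the real snippet on build.nvidia.com embeds the user's actual key and the exact model id.

```python
# Hypothetical sketch of the kind of snippet NVIDIA Build generates for an
# OpenAI-compatible chat-completions call. Endpoint and model id are
# assumptions; copy the real values from the model's page on build.nvidia.com.
import json
from urllib import request

API_URL = "https://integrate.api.nvidia.com/v1/chat/completions"  # assumed endpoint
API_KEY = "nvapi-..."  # placeholder; the generated snippet includes your real key

payload = {
    "model": "moonshotai/kimi-k2-instruct",  # example model id
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "max_tokens": 64,
}

req = request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Actually sending the request requires a valid key, so it is left commented out:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(req.get_full_url())
```

Because the endpoint follows the OpenAI wire format, OpenClaw can treat it like any other OpenAI-compatible fallback provider, which is what the prompt in the README asks it to configure.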