
Commit ab8ec6a (parent: 67578a7)

docs: Add Docker usage instructions to README for running llmscript securely

1 file changed: +6 additions, −5 deletions

README.md (6 additions, 5 deletions)

````diff
@@ -52,6 +52,12 @@ go install github.com/statico/llmscript/cmd/llmscript@latest
 
 (Can't find it? Check `~/go/bin`.)
 
+Or, if you're spooked by running LLM-generated shell scripts (good for you!), consider running llmscript via Docker:
+
+```
+docker run --network host -it -v "$(pwd):/data" -w /data ghcr.io/statico/llmscript --verbose examples/hello-world
+```
+
 ## Usage
 
 Create a script file like the above example, or check out the [examples](examples) directory for more. You can use a shebang like:
@@ -68,11 +74,6 @@ $ llmscript hello-world
 
 By default, llmscript will use Ollama with the `llama3.2` model. You can configure this by creating a config file with the `llmscript --write-config` command to create a config file in `~/.config/llmscript/config.yaml` which you can edit. You can also use command-line args (see below).
 
-> [!NOTE]
-> Spooked by running LLM-generated shell scripts? Good for you! Consider running this via Docker:
->
-> `docker run --network host -it -v "$(pwd):/data" -w /data ghcr.io/statico/llmscript --verbose examples/hello-world`
-
 ## How it works
 
 Want to see it all in action? Run `llmscript --verbose examples/hello-world`
````
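The Docker one-liner this commit adds can be wrapped in a small shell function so it reads like the native binary. This is a hypothetical convenience sketch, not part of the project; the function name `llmscript_docker` is made up here:

```shell
# Hypothetical wrapper around the Docker invocation from the README.
# Mounts the current directory at /data so the containerized llmscript
# can see your script files, and passes all arguments straight through.
llmscript_docker() {
  docker run --network host -it \
    -v "$(pwd):/data" -w /data \
    ghcr.io/statico/llmscript "$@"
}
```

With this in your shell profile, `llmscript_docker --verbose examples/hello-world` behaves like the one-liner in the diff, while the generated script still runs inside the container rather than on the host.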
