Promptwright is a Python library from [Stacklok](https://stacklok.com) designed
for generating large synthetic datasets using either a local LLM or most hosted LLM service
providers (OpenAI, Anthropic, OpenRouter, etc.). The library offers
a flexible and easy-to-use set of interfaces, enabling users to
generate prompt-led synthetic datasets.

Promptwright was inspired by [redotvideo/pluto](https://github.com/redotvideo/pluto);
in fact it started as a fork, but ended up largely being a re-write.

The library interfaces with LiteLLM, making it easy to pull a model and run it
locally with something like Ollama, or to call an online LLM provider directly.

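Because routing goes through LiteLLM, switching between a local model and a hosted provider is largely a matter of changing a provider-prefixed model string. The sketch below illustrates that underlying LiteLLM convention only (it is not Promptwright's own API), and the model names are just examples:

```python
# Illustration of the LiteLLM convention Promptwright builds on; not Promptwright's own API.
import litellm

messages = [{"role": "user", "content": "Write one creative writing prompt."}]

# Local model served by Ollama (assumes `ollama serve` is running and the model has been pulled).
local = litellm.completion(model="ollama/llama3", messages=messages)

# Hosted provider (assumes OPENAI_API_KEY is set in the environment).
hosted = litellm.completion(model="openai/gpt-4o-mini", messages=messages)

print(local.choices[0].message.content)
print(hosted.choices[0].message.content)
```
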
## Features
- **Multiple Provider Support**: Works with most LLM service providers and local LLMs via Ollama, vLLM, etc.
- **Configurable Instructions and Prompts**: Define custom instructions and system prompts
- **YAML Configuration**: Define your generation tasks using YAML configuration files (see the sketch after this list)
- **Command Line Interface**: Run generation tasks directly from the command line

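For a rough sense of the YAML-driven workflow mentioned above, a task definition might look something like the sketch below. The field names and values here are illustrative assumptions, not a documented schema; the project's own example configuration files are the authoritative reference.

```yaml
# Illustrative sketch only; field names are assumptions, not the documented schema.
system_prompt: "You are a helpful assistant that writes clear, concise answers."

topic_tree:
  args:
    root_prompt: "Capital cities of the world"
    tree_degree: 3          # branching factor of the topic tree
    tree_depth: 2           # how many levels deep to expand topics
    provider: "ollama"      # any LiteLLM-supported provider
    model: "mistral:latest"
  save_as: "topic_tree.jsonl"

data_engine:
  args:
    instructions: "Generate question/answer training examples about capital cities."
    provider: "ollama"
    model: "mistral:latest"
    temperature: 0.9

dataset:
  creation:
    num_steps: 5            # number of generation steps
    batch_size: 1
  save_as: "dataset.jsonl"
```
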
### Prerequisites

- Python 3.11+
- Poetry (for dependency management)
- Ollama CLI installed and running, if generating with a local Ollama model (see [Ollama Installation](https://ollama.com/))
- A model pulled via Ollama (see [Model Compatibility](#model-compatibility))
- (Optional) Hugging Face account and API token for dataset upload
### Installation