# llama.cpp/example/infill

This example shows how to use infill mode with Code Llama models that support it.
Currently, the 7B and 13B models support infill mode.

Infill supports most of the options available in the main example.

For further information, have a look at the main example's README.md at llama.cpp/example/main/README.md.

## Common Options

In this section, we cover the most commonly used options for running the `infill` program with the LLaMA models:

-   `-m FNAME, --model FNAME`: Specify the path to the LLaMA model file (e.g., `models/7B/ggml-model.bin`).
-   `-i, --interactive`: Run the program in interactive mode, allowing you to provide input directly and receive real-time responses.
-   `-n N, --n-predict N`: Set the number of tokens to predict when generating text. Adjusting this value can influence the length of the generated text.
-   `-c N, --ctx-size N`: Set the size of the prompt context. The default is 512, but LLaMA models were built with a context of 2048, which will provide better results for longer input/inference.
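
For example, assuming the model has been downloaded to `models/7B/ggml-model.bin` (a placeholder path), a minimal interactive run combining these options might look like this:

```bash
# Minimal sketch: the model path is a placeholder, adjust it to your setup.
./infill -m models/7B/ggml-model.bin -c 2048 -n 128 -i
```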
## Input Prompts

The `infill` program provides several ways to interact with the LLaMA models using input prompts:

-   `--in-prefix PROMPT_BEFORE_CURSOR`: Provide the prefix directly as a command-line option.
-   `--in-suffix PROMPT_AFTER_CURSOR`: Provide the suffix directly as a command-line option.
-   `--interactive-first`: Run the program in interactive mode and wait for input right away. (More on this below.)
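
The prefix is the text before the cursor and the suffix is the text after it; the generated completion is placed between the two. As a rough sketch (the model path and code snippet are placeholders), this asks the model to fill in a function body whose signature and the code following it are already written:

```bash
# Rough sketch: model path and snippet are placeholders.
./infill -m models/7B/ggml-model.bin -n 32 --in-prefix "def fib(n):\n    " --in-suffix "\n\nprint(fib(10))\n"
```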
## Interaction

The `infill` program offers a seamless way to interact with LLaMA models, allowing users to receive real-time infill suggestions. Interactive mode can be triggered with `--interactive` or `--interactive-first`.

### Interaction Options

-   `-i, --interactive`: Run the program in interactive mode, allowing users to get real-time code suggestions from the model.
-   `--interactive-first`: Run the program in interactive mode and immediately wait for user input before starting the text generation.
-   `--color`: Enable colorized output to visually distinguish between prompts, user input, and generated text.
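
For instance (the model path is again a placeholder), the following starts an interactive session that waits for your input right away and colorizes prompts, input, and generated text:

```bash
# Minimal sketch: the model path is a placeholder.
./infill -m models/7B/ggml-model.bin --interactive-first --color
```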

### Example

```bash
./infill -t 10 -ngl 0 -m models/codellama-13b.Q5_K_S.gguf -c 4096 --temp 0.7 --repeat_penalty 1.1 -n 20 --in-prefix "def helloworld():\n    print(\"hell" --in-suffix "\n   print(\"goodbye world\")\n    "
```