Merged
39 changes: 39 additions & 0 deletions js/genkit/.guides/config.json
@@ -0,0 +1,39 @@
{
  "mcpServers": {
    "genkit": {
      "command": "genkit",
      "args": ["mcp"]
    }
  },
  "docs": [
    {
      "url": "https://genkit.dev/docs/models.md",
      "name": "generate-content",
      "title": "Generate Content",
      "description": "how to generate content (text, structured data, images, videos) with Genkit."
    },
    {
      "url": "https://genkit.dev/docs/flows.md",
      "name": "flows",
      "title": "Using Flows to build GenAI Workflows",
      "description": "how to construct strongly typed AI workflows with Genkit."
    },
    {
      "url": "https://genkit.dev/docs/tool-calling.md",
      "name": "tool-calling",
      "title": "Tool Calling",
      "description": "an in-depth guide to providing tools/functions to Genkit for GenAI."
    },
    {
      "url": "https://genkit.dev/docs/interrupts.md",
      "name": "tool-calling/interrupts",
      "title": "Interrupts (Tool Calling with Human-in-the-Loop)",
      "description": "instructions on how to use interrupts to provide human-in-the-loop capabilities to Genkit agents."
    },
    {
      "url": "https://genkit.dev/docs/context.md",
      "name": "context",
      "description": "how to pass context to tools and flows without exposing sensitive data to the LLM."
    }
  ]
}
30 changes: 30 additions & 0 deletions js/genkit/.guides/setup.md
@@ -0,0 +1,30 @@
Follow these instructions to set up Genkit in the current codebase. These instructions are general-purpose and have not been written with specific codebase knowledge, so use your best judgement when following them.

0. Tell the user "I'm going to check out your workspace and set you up to use Genkit for GenAI workflows."
1. If the current workspace is empty or is a starter template, your goal will be to create a simple image generation flow that lets someone generate an image from a prompt and a selectable style. If the current workspace is not empty, create a simple example flow to help get the user started.
2. Check to see if any Genkit provider plugin (such as `@genkit-ai/google-genai` or `@genkit-ai/oai-compat` or others, may start with `genkitx-*`) is installed. If not, install `@genkit-ai/google-genai`. If this is a Next.js app, install `@genkit-ai/next` as well.
3. Search the codebase for the exact string `genkit(` (remember to escape regexes properly), which would indicate that the user has already set up Genkit in the codebase. If found, there is no need to set it up again; tell the user "Genkit is already configured in this app." and exit this workflow.
4. Create an `ai` directory in the primary source directory of the project (this may be e.g. `src` but is project-dependent).
5. Create `{sourceDir}/ai/genkit.ts` and populate it using the example below. DO NOT add a `next` plugin to the file, ONLY add a model provider plugin to the plugins array:

```ts
import { genkit, z } from 'genkit';
import { googleAI } from '@genkit-ai/google-genai';

export const ai = genkit({
  plugins: [googleAI()],
});

export { z };
```

6. Create `{sourceDir}/ai/tools` and `{sourceDir}/ai/flows` directories, but leave them empty for now.
7. Create `{sourceDir}/ai/index.ts` and populate it with the following (change the import to match import aliases in `tsconfig.json` as needed):

```ts
import './genkit.js';
// import each created flow, tool, etc. here for use in the Genkit Dev UI
```

8. Add a `genkit:ui` script to `package.json` that runs `npx genkit-cli start -- npx tsx {sourceDir}/ai/index.ts` (or `pnpm dlx` or `yarn dlx` for those package managers). DO NOT try to run the script now.
9. Tell the user "Genkit is now configured and ready for use." as setup is now complete. Also remind them to set appropriate env variables (e.g. `GEMINI_API_KEY` for Google providers). Wait for the user to prompt further before creating any specific flows.
1 change: 1 addition & 0 deletions js/genkit/.guides/style.md
@@ -0,0 +1 @@
- Prefer destructuring generate calls e.g. `const {text} = await ai.generate(...)`
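The preference above can be sketched without any model call; `fakeGenerate` below is a hypothetical stand-in for `ai.generate`, used only to illustrate the destructuring style:

```typescript
// Hypothetical stand-in for ai.generate, for illustration only.
async function fakeGenerate(prompt: string): Promise<{ text: string; output: unknown }> {
  return { text: `echo: ${prompt}`, output: null };
}

async function main() {
  // Preferred: destructure just what you need from the response.
  const { text } = await fakeGenerate('hello');
  console.log(text); // "echo: hello"
}

main();
```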
49 changes: 49 additions & 0 deletions js/genkit/.guides/usage.md
@@ -0,0 +1,49 @@
## Basic Example

```ts
import { ai, z } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const myTool = ai.defineTool({name, description, inputSchema: z.object(...)}, (input) => {...});

const {text} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'), // optional if a default model is configured
  system: "the system instructions", // optional
  prompt: "the content of the prompt",
  // OR, for multi-modal content:
  // prompt: [{text: "what is this image?"}, {media: {url: "data:image/png;base64,..."}}],
  tools: [myTool],
});

// structured output
const CharacterSchema = z.object({...}); // make sure to use .describe() on fields
const {output} = await ai.generate({
  prompt: "generate an RPG character",
  output: {schema: CharacterSchema},
});
```

## Important API Clarifications

**IMPORTANT:** This app uses Genkit v1.19 which has changed significantly from pre-1.0 versions. Important changes include:

```ts
const response = await ai.generate(...);

response.text // CORRECT 1.x syntax
response.text() // INCORRECT pre-1.0 syntax

response.output // CORRECT 1.x syntax
response.output() // INCORRECT pre-1.0 syntax

const {stream, response} = ai.generateStream(...); // IMPORTANT: no `await` needed
for await (const chunk of stream) { } // CORRECT 1.x syntax
for await (const chunk of stream()) { } // INCORRECT pre-1.0 syntax
await response; // CORRECT 1.x syntax
await response(); // INCORRECT pre-1.0 syntax
await ai.generate({..., model: googleAI.model('gemini-2.5-flash')}); // CORRECT 1.x syntax
await ai.generate({..., model: gemini15Pro}); // INCORRECT pre-1.0 syntax
```

- Use `import {z} from "genkit"` when you need Zod; this gives you an implementation consistent with Genkit.
- When defining Zod schemas, ONLY use basic scalar, object, and array types. Use `.optional()` when needed and `.describe('...')` to add descriptions for output schemas.
- Genkit has many capabilities; make sure to read the docs when you need them.
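The 1.x streaming shape described above can be mimicked without a model; `mockGenerateStream` is a hypothetical stand-in that returns the same `{stream, response}` pair, showing the correct consumption pattern:

```typescript
// Hypothetical stand-in mirroring the 1.x generateStream return shape:
// an async-iterable `stream` plus a `response` promise (no `await` on the call itself).
function mockGenerateStream(chunks: string[]) {
  async function* gen() {
    for (const c of chunks) yield { text: c };
  }
  const response = Promise.resolve({ text: chunks.join('') });
  return { stream: gen(), response };
}

async function main() {
  const { stream, response } = mockGenerateStream(['Hel', 'lo']);
  let streamed = '';
  for await (const chunk of stream) {
    streamed += chunk.text; // chunk.text, not chunk.text()
  }
  const final = await response; // await the promise, don't call it
  console.log(streamed, final.text); // "Hello Hello"
}

main();
```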
110 changes: 110 additions & 0 deletions js/plugins/express/.guides/usage.md
@@ -0,0 +1,110 @@
Genkit's Express integration makes it easy to expose Genkit flows as Express API endpoints:

```ts
import express from 'express';
import { expressHandler } from '@genkit-ai/express';
import { simpleFlow } from './flows/simple-flow.js';

const app = express();
app.use(express.json());

app.post('/simpleFlow', expressHandler(simpleFlow));

app.listen(8080);
```

You can also handle auth using context providers:

```ts
import { UserFacingError } from 'genkit';
import { ContextProvider, RequestData } from 'genkit/context';

// Shape of the context returned by the provider
interface Context {
  auth: { user: string };
}

const context: ContextProvider<Context> = (req: RequestData) => {
  if (req.headers['authorization'] !== 'open sesame') {
    throw new UserFacingError('PERMISSION_DENIED', 'not authorized');
  }
  return {
    auth: {
      user: 'Ali Baba',
    },
  };
};

app.post(
  '/simpleFlow',
  authMiddleware, // any additional Express middleware you use
  expressHandler(simpleFlow, { context })
);
```
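The check performed by a context provider can be sketched in plain TypeScript (no genkit imports; `RequestLike` and the error message mirror the example above and are assumptions):

```typescript
// Plain-TypeScript sketch of the context-provider check: reject unless the
// authorization header matches, otherwise return an auth context object.
type RequestLike = { headers: Record<string, string | undefined> };

function provideContext(req: RequestLike) {
  if (req.headers['authorization'] !== 'open sesame') {
    throw new Error('PERMISSION_DENIED: not authorized');
  }
  return { auth: { user: 'Ali Baba' } };
}

const ctx = provideContext({ headers: { authorization: 'open sesame' } });
console.log(ctx.auth.user); // "Ali Baba"
```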

Flows and actions exposed using the `expressHandler` function can be accessed using the `genkit/beta/client` library:

```ts
import { runFlow, streamFlow } from 'genkit/beta/client';

const result = await runFlow({
  url: `http://localhost:${port}/simpleFlow`,
  input: 'say hello',
});

console.log(result); // hello
```

```ts
// set auth headers (when using auth policies)
const result = await runFlow({
  url: `http://localhost:${port}/simpleFlow`,
  headers: {
    Authorization: 'open sesame',
  },
  input: 'say hello',
});

console.log(result); // hello
```

```ts
// and streamed
const result = streamFlow({
  url: `http://localhost:${port}/simpleFlow`,
  input: 'say hello',
});
for await (const chunk of result.stream) {
  console.log(chunk);
}
console.log(await result.output);
```
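The `streamFlow` result pairs an async-iterable `stream` with an `output` promise for the final value; `mockStreamFlow` below is a hypothetical mock of that shape (not the real client), showing the consumption pattern without a server:

```typescript
// Hypothetical mock of the streamFlow result shape: an async-iterable `stream`
// plus an `output` promise that resolves with the final value.
function mockStreamFlow(chunks: string[]) {
  async function* gen() {
    for (const c of chunks) yield c;
  }
  return { stream: gen(), output: Promise.resolve(chunks.join('')) };
}

async function main() {
  const result = mockStreamFlow(['say ', 'hello']);
  for await (const chunk of result.stream) {
    console.log(chunk); // each streamed chunk
  }
  console.log(await result.output); // "say hello"
}

main();
```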

You can use `startFlowServer` to quickly expose multiple flows and actions:

```ts
import { startFlowServer } from '@genkit-ai/express';
import { genkit } from 'genkit';

const ai = genkit({});

export const menuSuggestionFlow = ai.defineFlow(
  {
    name: 'menuSuggestionFlow',
  },
  async (restaurantTheme) => {
    // ...
  }
);

startFlowServer({
  flows: [menuSuggestionFlow],
});
```

You can also configure the server:

```ts
startFlowServer({
  flows: [menuSuggestionFlow],
  port: 4567,
  cors: {
    origin: '*',
  },
});
```
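The option handling above can be sketched as a merge with defaults. This is an illustrative sketch, not `startFlowServer`'s implementation; the 3400 fallback port and the option shape are assumptions made for the example:

```typescript
// Sketch of merging flow-server options with assumed defaults.
type FlowServerOptions = {
  flows: unknown[];
  port?: number;
  cors?: { origin: string | boolean };
};

function resolveOptions(opts: FlowServerOptions) {
  return {
    flows: opts.flows,
    port: opts.port ?? 3400, // assumed default port
    cors: opts.cors ?? { origin: false }, // assumed default: CORS disabled
  };
}

console.log(resolveOptions({ flows: [], port: 4567 }).port); // 4567
```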
106 changes: 106 additions & 0 deletions js/plugins/google-genai/.guides/docs/editing-images.prompt
@@ -0,0 +1,106 @@
---
title: Edit images with `gemini-2.5-flash-image-preview` (aka "Nano Banana")
description: read this if you need to perform sophisticated image edits, such as background removal, pose matching, character replacement, or relighting, on an existing image
---

The `gemini-2.5-flash-image-preview` model (also known as "Nano Banana") can perform sophisticated image edits.

- You must ALWAYS add `{config: {responseModalities: ['TEXT', 'IMAGE']}}` to your `ai.generate` calls when using this model.

<example>
```ts
// generate an image from a prompt

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: "generate a picture of a unicorn wearing a space suit on the moon",
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// edit an image with a text prompt

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "change the person's outfit to a banana costume"},
    {media: {url: "https://..." /* or 'data:...' */}},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// combine multiple images together

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {personImageUri, animalImageUri, sceneryImageUri} = await loadImages(...);

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "[PERSON]:\n"},
    {media: {url: personImageUri}},
    {text: "\n[ANIMAL]:\n"},
    {media: {url: animalImageUri}},
    {text: "\n[SCENERY]:\n"},
    // IMPORTANT: the model tends to match the aspect ratio of the *last* image provided
    {media: {url: sceneryImageUri}},
    {text: "make an image of [PERSON] riding a giant version of [ANIMAL] with a background of [SCENERY]"},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// use an annotated image to guide generation

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const originalImageUri = "data:..."; // the original image
const annotatedImageUri = "data:..."; // the image with annotations on top of it

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "follow the instructions in the following annotated image:"},
    {media: {url: annotatedImageUri}},
    {text: "\n\napply the annotated instructions to the original image, making sure to follow the instructions of the annotations.\n\noriginal image:\n"},
    {media: {url: originalImageUri}},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

## Prompting tips for image editing

- For complex edits prefer a chain of small edits to a single complex edit. Feed the output of one generation as input to the next.
- Be specific and detailed about the edits you want to make.
- Be clear whether added images are meant as style or subject references.
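The "chain of small edits" tip above can be sketched as a loop that feeds each result back in. `editImage` here is a hypothetical stand-in for an `ai.generate` call with the image model; only the chaining pattern is the point:

```typescript
// Hypothetical stand-in for one image edit; a real implementation would call
// ai.generate with googleAI.model('gemini-2.5-flash-image-preview') and return media.url.
async function editImage(imageUrl: string, instruction: string): Promise<string> {
  return `${imageUrl};${instruction}`;
}

// Chain of small edits: the output of each generation is the input to the next.
async function chainEdits(startUrl: string, steps: string[]): Promise<string> {
  let url = startUrl;
  for (const step of steps) {
    url = await editImage(url, step);
  }
  return url;
}

chainEdits('data:image/png;base64,AAAA', ['remove background', 'relight from the left'])
  .then((finalUrl) => console.log(finalUrl));
```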