
Commit f6b1947

MQ37 and TC-MO authored
feat: mastra mcp agent (#1505)
Will be merged after the `timeout` MCP option is released for the `@mastra/mcp` npm package. --------- Co-authored-by: Michał Olender <[email protected]>
1 parent 17e5a42 commit f6b1947

File tree

5 files changed

+249
-2
lines changed

5 files changed

+249
-2
lines changed
Lines changed: 229 additions & 0 deletions
@@ -0,0 +1,229 @@
---
title: Mastra MCP integration
sidebar_label: Mastra
description: Learn how to build AI Agents with Mastra via Apify Actors MCP server
sidebar_position: 1
slug: /integrations/mastra
---

**Learn how to build AI agents with Mastra and Apify Actors MCP Server.**

---

## What is Mastra

[Mastra](https://mastra.ai) is an open-source TypeScript framework for building AI applications efficiently. It provides essential tools like agents, workflows, retrieval-augmented generation (RAG), integrations, and evaluations. It supports any LLM (e.g., GPT-4, Claude, Gemini), and you can run it locally or deploy it to a serverless cloud like [Apify](https://apify.com).

:::note Explore Mastra

Check out the [Mastra docs](https://mastra.ai/docs) for more information.

:::

## What is MCP server

A [Model Context Protocol](https://modelcontextprotocol.io) (MCP) server exposes specific data sources or tools to agents via a standardized protocol. It acts as a bridge, connecting large language models (LLMs) to external systems like databases, APIs, or local filesystems. Built on a client-server architecture, MCP servers enable secure, real-time interaction, allowing agents to fetch context or execute actions without custom integrations. Think of it as a modular plugin system for agents, simplifying how they access and process data. Apify provides [Actors MCP Server](https://apify.com/apify/actors-mcp-server) to expose [Apify Actors](https://docs.apify.com/platform/actors) from the [Apify Store](https://apify.com/store) as tools via the MCP protocol.

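To make the protocol concrete, here is a minimal sketch (not part of the tutorial code below) that connects to the Actors MCP Server with the official `@modelcontextprotocol/sdk` client and lists the exposed tools. It assumes the SDK package is installed and `APIFY_TOKEN` is set in the environment; the SSE connection itself may also need the header workaround shown later in the Mastra configuration.

```typescript
// Sketch only: list the tools exposed by the Actors MCP Server using the
// official MCP TypeScript SDK (assumes `npm install @modelcontextprotocol/sdk`).
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';

const transport = new SSEClientTransport(
    new URL('https://actors-mcp-server.apify.actor/sse'),
    {
        // Adds the Apify token to the requests sent over the transport.
        requestInit: {
            headers: { Authorization: `Bearer ${process.env.APIFY_TOKEN}` },
        },
    },
);

const client = new Client({ name: 'example-client', version: '1.0.0' });
await client.connect(transport);

// Each listed tool corresponds to an Apify Actor exposed by the server.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

await client.close();
```
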
## How to use Apify with Mastra via MCP

This guide demonstrates how to integrate Apify Actors with Mastra by building an agent that uses the [RAG Web Browser](https://apify.com/apify/rag-web-browser) Actor to search Google for TikTok profiles and the [TikTok Data Extractor](https://apify.com/clockworks/free-tiktok-scraper) Actor to extract and analyze data from the TikTok profiles via MCP.

### Prerequisites

- _Apify API token_: To use Apify Actors, you need an Apify API token. Learn how to obtain it in the [Apify documentation](https://docs.apify.com/platform/integrations/api).
- _LLM provider API key_: To power the agents, you need an LLM provider API key. For example, get one from [OpenAI](https://platform.openai.com/account/api-keys) or [Anthropic](https://console.anthropic.com/settings/keys).
- _Node.js_: Ensure you have Node.js installed.
- _Packages_: Install the following packages:

```bash
npm install @mastra/core @mastra/mcp @ai-sdk/openai
```

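The tutorial below sets credentials directly on `process.env` for brevity. As an alternative (an assumption on our part, not something the tutorial requires), you can keep them in a `.env` file and load it with the `dotenv` package, which you would need to install separately:

```typescript
// Assumes `npm install dotenv` and a .env file next to your script containing:
//   APIFY_TOKEN=your-apify-token
//   OPENAI_API_KEY=your-openai-api-key
import 'dotenv/config';

// After this import, process.env.APIFY_TOKEN and process.env.OPENAI_API_KEY are
// populated, and the hard-coded assignments shown below can be omitted.
```
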
### Building the TikTok profile search and analysis agent

First, import all required packages:

```typescript
import { Agent } from '@mastra/core/agent';
import { MastraMCPClient } from '@mastra/mcp';
import { openai } from '@ai-sdk/openai';
// For Anthropic use
// import { anthropic } from '@ai-sdk/anthropic';
```

Next, set the environment variables for the Apify API token and OpenAI API key:

```typescript
process.env.APIFY_TOKEN = "your-apify-token";
process.env.OPENAI_API_KEY = "your-openai-api-key";
// For Anthropic use
// process.env.ANTHROPIC_API_KEY = "your-anthropic-api-key";
```

Instantiate the Mastra MCP client:

```typescript
const mcpClient = new MastraMCPClient({
    name: 'apify-client',
    server: {
        url: new URL('https://actors-mcp-server.apify.actor/sse'),
        requestInit: {
            headers: { Authorization: `Bearer ${process.env.APIFY_TOKEN}` }
        },
        // The EventSource package augments EventSourceInit with a "fetch" parameter.
        // You can use this to set additional headers on the outgoing request.
        // Based on this example: https://github.com/modelcontextprotocol/typescript-sdk/issues/118
        eventSourceInit: {
            async fetch(input: Request | URL | string, init?: RequestInit) {
                const headers = new Headers(init?.headers || {});
                headers.set('authorization', `Bearer ${process.env.APIFY_TOKEN}`);
                return fetch(input, { ...init, headers });
            }
        }
    },
    timeout: 300_000, // 5 minutes tool call timeout
});
```

Connect to the MCP server and fetch the tools:

```typescript
console.log('Connecting to Mastra MCP server...');
await mcpClient.connect();
console.log('Fetching tools...');
const tools = await mcpClient.tools();
```

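If you want to verify which Actor tools were loaded before handing them to the agent, you can log their names. This assumes `tools()` resolves to a record keyed by tool name, which matches how it is passed to `toolsets` below:

```typescript
// Optional sanity check: print the names of the tools fetched from the MCP server.
console.log('Available Apify tools:', Object.keys(tools));
```
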
Instantiate the agent with the OpenAI model:

```typescript
const agent = new Agent({
    name: 'Social Media Agent',
    instructions: 'You’re a social media data extractor. Find TikTok URLs and analyze profiles with precision.',
    // You can swap to any other AI-SDK LLM provider
    model: openai('gpt-4o-mini')
});
```

Generate a response using the agent and the Apify tools:

```typescript
const prompt = 'Search the web for the OpenAI TikTok profile URL, then extract and summarize its data.';
console.log(`Generating response for prompt: ${prompt}`);
const response = await agent.generate(prompt, {
    toolsets: { apify: tools }
});
```

Print the response and disconnect from the MCP server:

```typescript
console.log(response.text);
await mcpClient.disconnect();
```

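In a longer script you may want to guarantee that the client disconnects even if the agent call throws. One possible way to structure the same generate-and-disconnect flow, shown here only as a sketch, is a `try`/`finally` block:

```typescript
// Same calls as above, wrapped so the MCP connection is always closed.
try {
    const response = await agent.generate(prompt, {
        toolsets: { apify: tools }
    });
    console.log(response.text);
} finally {
    await mcpClient.disconnect();
}
```
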
Before running the agent, we need to start the [Actors MCP Server](https://apify.com/apify/actors-mcp-server) by sending a request:

```bash
curl "https://actors-mcp-server.apify.actor/?token=YOUR_APIFY_TOKEN&actors=apify/rag-web-browser,clockworks/free-tiktok-scraper"
```

Replace `YOUR_APIFY_TOKEN` with your Apify API token. You can also open the URL in a browser to start the server.

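If you prefer to send the startup request from TypeScript instead of `curl`, a roughly equivalent sketch using Node's built-in `fetch` (same endpoint and query parameters as above, with the token read from the environment) looks like this:

```typescript
// Starts the Actors MCP Server with the two Actors used in this tutorial.
const startUrl = new URL('https://actors-mcp-server.apify.actor/');
startUrl.searchParams.set('token', process.env.APIFY_TOKEN ?? '');
startUrl.searchParams.set('actors', 'apify/rag-web-browser,clockworks/free-tiktok-scraper');

const startResponse = await fetch(startUrl);
console.log('Actors MCP Server startup response status:', startResponse.status);
```
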
:::note Use any Apify Actor

Since the agent connects through the [Actors MCP Server](https://apify.com/apify/actors-mcp-server), you can swap in any Apify Actor from the [Apify Store](https://apify.com/store) by updating the `actors` parameter in the startup request. No other changes are needed in the agent code.

:::

Run the agent:

```bash
npx tsx mastra-agent.ts
```

:::note Search and analysis may take some time

The agent's execution may take some time as it searches the web for the OpenAI TikTok profile and extracts data from it.

:::

You will see the agent’s output in the console, showing the results of the search and analysis.

```text
Connecting to Mastra MCP server...
Fetching tools...
Generating response for prompt: Search the web for the OpenAI TikTok profile URL, then extract and summarize its data.
### OpenAI TikTok Profile Summary
- **Profile URL**: [OpenAI on TikTok](https://www.tiktok.com/@openai?lang=en)
- **Followers**: 608,100
- **Likes**: 3.4 million
- **Videos Posted**: 156
- **Bio**: "low key research previews"
...
```

If you want to test the whole example, create a new file, `mastra-agent.ts`, and copy the full code into it:

```typescript
import { Agent } from '@mastra/core/agent';
import { MastraMCPClient } from '@mastra/mcp';
import { openai } from '@ai-sdk/openai';
// For Anthropic use
// import { anthropic } from '@ai-sdk/anthropic';

process.env.APIFY_TOKEN = "your-apify-token";
process.env.OPENAI_API_KEY = "your-openai-api-key";
// For Anthropic use
// process.env.ANTHROPIC_API_KEY = "your-anthropic-api-key";

const mcpClient = new MastraMCPClient({
    name: 'apify-client',
    server: {
        url: new URL('https://actors-mcp-server.apify.actor/sse'),
        requestInit: {
            headers: { Authorization: `Bearer ${process.env.APIFY_TOKEN}` }
        },
        // The EventSource package augments EventSourceInit with a "fetch" parameter.
        // You can use this to set additional headers on the outgoing request.
        // Based on this example: https://github.com/modelcontextprotocol/typescript-sdk/issues/118
        eventSourceInit: {
            async fetch(input: Request | URL | string, init?: RequestInit) {
                const headers = new Headers(init?.headers || {});
                headers.set('authorization', `Bearer ${process.env.APIFY_TOKEN}`);
                return fetch(input, { ...init, headers });
            }
        }
    },
    timeout: 300_000, // 5 minutes tool call timeout
});

console.log('Connecting to Mastra MCP server...');
await mcpClient.connect();
console.log('Fetching tools...');
const tools = await mcpClient.tools();

const agent = new Agent({
    name: 'Social Media Agent',
    instructions: 'You’re a social media data extractor. Find TikTok URLs and analyze profiles with precision.',
    // You can swap to any other AI-SDK LLM provider
    model: openai('gpt-4o-mini')
});

const prompt = 'Search the web for the OpenAI TikTok profile URL, then extract and summarize its data.';
console.log(`Generating response for prompt: ${prompt}`);
const response = await agent.generate(prompt, {
    toolsets: { apify: tools }
});

console.log(response.text);
await mcpClient.disconnect();
```

## Resources

- [Apify Actors](https://docs.apify.com/platform/actors)
- [Mastra Documentation](https://mastra.ai/docs)
- [Apify MCP Server](https://apify.com/apify/actors-mcp-server)
- [Apify Store](https://apify.com/store)
- [What are AI Agents?](https://blog.apify.com/what-are-ai-agents/)
- [How to Build an AI Agent](https://blog.apify.com/how-to-build-an-ai-agent/)

sources/platform/integrations/index.mdx

Lines changed: 20 additions & 2 deletions
@@ -150,13 +150,31 @@ If you are working on an AI/LLM-related project, we recommend you look into the

 <CardGrid>
   <Card
-    title="Langchain"
+    title="Mastra"
+    to="./integrations/mastra"
+    imageUrl="/img/platform/integrations/mastra.png"
+    smallImage
+  />
+  <Card
+    title="CrewAI"
+    to="./integrations/crewai"
+    imageUrl="/img/platform/integrations/crewai.png"
+    smallImage
+  />
+  <Card
+    title="LangGraph"
+    to="./integrations/langgraph"
+    imageUrl="/img/platform/integrations/langgraph.png"
+    smallImage
+  />
+  <Card
+    title="LangChain"
     to="./integrations/langchain"
     imageUrl="/img/platform/integrations/langchain.png"
     smallImage
   />
   <Card
-    title="Llamaindex"
+    title="LlamaIndex"
     to="./integrations/llama"
     imageUrl="/img/platform/integrations/llamaindex.jpeg"
     smallImage
3 binary image files added (14.3 KB, 5.57 KB, 19.5 KB).
