
Conversation

j-christl

Context

So far, only the text translation and text rephrasing functionality of DeepL is exposed via MCP tools.
This change adds a tool for translating documents via the DeepL API. This allows the user/LLM to translate documents such as *.pdf and *.docx files (see the "Document Formats" resource below).
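
A minimal sketch of what such a tool can look like, assuming the server uses the MCP TypeScript SDK and the official deepl-node client; the schema fields and output location below are illustrative, not the exact code in this PR:

```typescript
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';
import * as deepl from 'deepl-node';
import * as path from 'path';

const translator = new deepl.Translator(process.env.DEEPL_API_KEY!);
const server = new McpServer({ name: 'deepl', version: '0.1.0' });

// Path-based tool: the LLM passes the on-disk location of the document.
server.tool(
  'translate-document',
  {
    filePath: z.string().describe('Absolute path to the document to translate'),
    targetLang: z.string().describe('Target language code, e.g. "de"'),
  },
  async ({ filePath, targetLang }) => {
    const outputPath = path.join('/tmp', path.basename(filePath));
    // translateDocument uploads the file, waits until DeepL has finished
    // translating it, and writes the result to outputPath.
    await translator.translateDocument(
      filePath,
      outputPath,
      null, // let DeepL auto-detect the source language
      targetLang as deepl.TargetLanguageCode,
    );
    return {
      content: [{ type: 'text', text: `Translated document saved to ${outputPath}` }],
    };
  },
);
```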

Example usage

Request:

Please translate this file into German (de): /Users/myuser/Downloads/my_document.pdf

Response:

[...]
Perfect! ✅ Your PDF has been successfully translated from the original language to German. The translated document has been saved as /tmp/my_document de.pdf.

Limitations

Currently, Claude cannot transmit files that have been dragged and dropped into Claude Desktop via the MCP protocol. Hence, this implementation depends on the user specifying the file's path rather than dragging and dropping the file into Claude Desktop.

Resources

@j-christl changed the title from "feat: Implemented translate-document tool" to "Implement translate-document tool for translating documents via the DeepL API" on Jun 27, 2025
Collaborator

@morsssss left a comment


Sorry - I didn't realize that no one had looked at your PR!

So... I wonder if we could support files in a more natural way, via one of two mechanisms:

  • files brought into the flow through a tool like https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem . In such a case, we'd get the file from their read_text_file, read_media_file, or read_multiple_files methods (see the sketch after this list). Not sure whether this would be a common use case, but for users who'd installed a filesystem tool, we'd enable them to ask their AI client something like "Can you find that file in my Downloads folder that has Greek in it, and translate that into Spanish?"

  • This resource, at least, tells us that LLMs have various ways of working with files, though it talks about Claude's way in particular.
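
If we went the filesystem-server route, the sketch below shows what a content-based variant could look like. It assumes the client relays the file's bytes as base64 (as read_media_file returns them); the function name and output location are made up for illustration:

```typescript
import * as deepl from 'deepl-node';
import * as path from 'path';

const translator = new deepl.Translator(process.env.DEEPL_API_KEY!);

// Hypothetical content-based variant: the client reads the file itself
// (e.g. via a filesystem MCP server) and sends base64 data plus the
// original filename, which DeepL needs in order to detect the format.
async function translateDocumentContent(
  base64Data: string,
  filename: string,
  targetLang: deepl.TargetLanguageCode,
): Promise<string> {
  const buffer = Buffer.from(base64Data, 'base64');
  const outputPath = path.join('/tmp', filename);
  // deepl-node accepts a Buffer as input when `filename` is supplied
  // in the options, so the API can infer the document type.
  await translator.translateDocument(buffer, outputPath, null, targetLang, {
    filename,
  });
  return outputPath;
}
```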

Are you up for investigating either of these?

@MarouaneZhani

Hi, any news on this feature?
Translating files directly would be a really nice feature to have!

@morsssss
Collaborator

> Hi, any news on this feature? Translating files directly would be a really nice feature to have!

Hey @MarouaneZhani! Glad to hear you're interested in this, because I don't think any of us have been pursuing it further. I don't know whether there's a standard way to access files an AI client has downloaded.

Can you say more about your use case, and how precisely you'd like to use this? This could help us understand the best way to implement it.

@MarouaneZhani

Thanks @morsssss for the quick reply.
We can start by just making sure that the AI client provides a path to the downloaded file. For example, from the client side, we could set up a shared volume that both the AI client and the DeepL MCP server have access to.

@morsssss
Collaborator

morsssss commented Oct 1, 2025

> Thanks @morsssss for the quick reply. We can start by just making sure that the AI client provides a path to the downloaded file. For example, from the client side, we could set up a shared volume that both the AI client and the DeepL MCP server have access to.

Can you provide more details here? How would you set up this shared volume? Would you then need the MCP to have filesystem access as well? (In this case, it might be easier to compose our server with an existing filesystem server.)

@MarouaneZhani

My idea would be that the AI client and the DeepL MCP server share a PVC, and the LLM then transmits the path to this file.
But now I see the challenge: the LLM might have a hard time finding the correct path to the needed file, and the file might need to be copied first, and so on. I'll think about a concept and come back to you.
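
For what it's worth, here is a minimal sketch of the path remapping such a shared volume would require, assuming the AI client and the MCP server mount it at different locations (all paths and variable names are illustrative):

```typescript
import * as path from 'path';

// e.g. CLIENT_MOUNT=/Users/myuser/Downloads, SERVER_MOUNT=/shared
const CLIENT_MOUNT = process.env.CLIENT_MOUNT ?? '/client-files';
const SERVER_MOUNT = process.env.SERVER_MOUNT ?? '/shared';

// Translate a path the LLM saw on the client side into the path the
// MCP server can actually read inside its own container.
function toServerPath(clientPath: string): string {
  const relative = path.relative(CLIENT_MOUNT, clientPath);
  if (relative.startsWith('..') || path.isAbsolute(relative)) {
    throw new Error(`${clientPath} is outside the shared volume`);
  }
  return path.join(SERVER_MOUNT, relative);
}
```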
