Use Ollama-compatible models via cloud hosting — the convenience of hosted inference with Ollama's open model ecosystem.
🚧 This page is under construction. Content coming soon — contributions welcome!
Ollama Cloud provides hosted inference for Ollama-compatible models. GoClaw connects using the OpenAI-compatible API, giving you access to open-source models without managing local hardware.
Example configuration:

```json
{
  "providers": {
    "ollama-cloud": {
      "provider_type": "ollama-cloud",
      "api_key": "your-ollama-cloud-api-key",
      "api_base": "https://api.ollama.ai/v1"
    }
  }
}
```

- Provider Overview
- Ollama — run models locally instead
- Custom / OpenAI-Compatible