A local-first web application that converts natural language queries into SPARQL, executes them against a GraphDB instance, and returns structured results.
- Natural Language Querying: Ask questions in English (e.g., "Who are the authors from MIT?").
- Swappable LLM Providers: Support for Claude (Anthropic) and OpenAI (implementation ready), plus a local Mock provider for testing.
- SPARQL Transparency: View, edit, and re-execute the generated SPARQL queries.
- Interactive Results: Browse results in a clean table format.
- Ontology Visualization: View the underlying schema.
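The swappable-provider feature above can be sketched as a small interface. The class and function names below are illustrative assumptions, not the project's actual API:

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Hypothetical provider interface: turns a natural-language question into SPARQL."""

    @abstractmethod
    def to_sparql(self, question: str) -> str: ...


class MockProvider(LLMProvider):
    """Returns a canned query so the stack can be tested without API keys."""

    def to_sparql(self, question: str) -> str:
        return (
            "PREFIX ex: <http://example.org/> "
            "SELECT ?author WHERE { ?author ex:affiliation ?org . "
            "?org ex:name 'MIT' }"
        )


def get_provider(name: str) -> LLMProvider:
    # Mirrors the LLM_PROVIDER setting: 'claude', 'openai', or 'mock'.
    if name == "mock":
        return MockProvider()
    raise NotImplementedError(f"provider {name!r} requires an API key")
```

Swapping providers then becomes a one-line config change rather than a code change.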
- Frontend: React + TypeScript + Vite + Tailwind CSS
- Backend: FastAPI + Python 3.11
- Database: GraphDB (running via Docker)
- LLM: Anthropic Claude API (or Mock)
- Docker & Docker Compose
- Node.js 18+ (for local frontend dev)
- Python 3.11+ (for local backend dev)
- Anthropic API Key (optional, can use Mock mode)
Create a `.env` file in the `backend/` directory (optional, for LLM keys):

```
ANTHROPIC_API_KEY=your_key_here
LLM_PROVIDER=claude  # or 'mock'
```

Generate the synthetic knowledge graph data:

```bash
pip3 install rdflib
python3 data/generate_data.py
```
```bash
docker compose up --build
```

- Frontend: http://localhost:5173
- Backend: http://localhost:8000
- GraphDB: http://localhost:7200
- Open GraphDB at http://localhost:7200.
- Create a repository named `research-kg`.
- Import `data/ontology.ttl` and `data/seed.ttl` into the repository.
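To confirm the import worked, you can run a sanity query against the repository's SPARQL endpoint (GraphDB exposes RDF4J-style endpoints under `/repositories/<id>`). A stdlib-only sketch:

```python
import json
import urllib.parse
import urllib.request

GRAPHDB = "http://localhost:7200/repositories/research-kg"


def build_request(query: str) -> urllib.request.Request:
    """Build a SPARQL GET request against the repository endpoint."""
    url = f"{GRAPHDB}?{urllib.parse.urlencode({'query': query})}"
    return urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )


def count_triples() -> str:
    """Run a COUNT query; requires GraphDB to be up with the research-kg repo."""
    req = build_request("SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"][0]["n"]["value"]
```

A non-zero triple count means both `.ttl` files landed in the repository.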
Backend:

```bash
cd backend
pip install -r requirements.txt
uvicorn app.main:app --reload
```

Frontend:

```bash
cd frontend
npm install
npm run dev
```
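With both servers running, the round trip can be exercised from a small script. The `/api/query` route and payload shape below are illustrative assumptions about the FastAPI backend, not documented endpoints:

```python
import json
import urllib.request

API = "http://localhost:8000/api/query"  # hypothetical route


def build_payload(question: str) -> bytes:
    """Encode the question as a JSON body."""
    return json.dumps({"question": question}).encode()


def ask(question: str) -> dict:
    """POST a natural-language question; requires the backend to be running."""
    req = urllib.request.Request(
        API,
        data=build_payload(question),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In Mock mode this exercises the whole pipeline (backend, SPARQL generation, GraphDB) without spending API credits.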