vanshaj2023/SentinelAI

SentinelAI — Defence-Grade Surveillance with a Tool-Using AI Analyst

Real-time multi-object tracking, configurable restricted-zone breach alerts, and an agentic Groq-powered analyst that calls live backend tools to answer operational questions, generate after-action reports, and reconfigure surveillance zones from natural language.

Stack chosen to match what defence-tech teams actually ship: realtime CV pipeline, FastAPI/WebSockets, modern React, Dockerised, agentic LLM with function calling.

What's interesting in here

| Capability | How it works |
| --- | --- |
| Tool-using AI analyst | Browser → `/api/query` → Groq Llama 3.3 70B with a tools schema → loops calling FastAPI endpoints (`get_alerts_stats`, `get_alerts_in_window`, `get_alerts_timeline`, etc.) → streams an NDJSON trace of every tool call back to the UI so the user sees the agent's reasoning. |
| Auto-generated incident reports | One click → backend stats + alerts → Groq produces a structured Markdown after-action report (Exec Summary / Metrics / Notable Events / Pattern Analysis / Recommended Actions) → downloadable. |
| Natural-language zone config | "Cover the bottom half — vehicle approach lane" → Groq with structured JSON output → validated and pushed to the live FastAPI backend → restricted zone reconfigured at runtime, no restart. |
| Realtime CV pipeline | YOLOv8 + ByteTrack via `ultralytics`; WebSocket streams annotated frames + per-frame breach events; SQLite alert log with a cooldown to suppress duplicates. |
| One-command setup | `docker-compose up` brings up backend + frontend with a single `GROQ_API_KEY` env var. |
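The duplicate-suppression cooldown mentioned above can be sketched as follows. This is a minimal illustration; the class name and method signatures are hypothetical, not the repo's actual `database.py`/`tracker.py` code:

```python
import time
from typing import Optional


class AlertCooldown:
    """Suppress duplicate breach alerts for the same (track, class) pair
    within a cooldown window. Hypothetical sketch, not the repo's code."""

    def __init__(self, cooldown_seconds: float = 5.0):
        self.cooldown_seconds = cooldown_seconds
        self._last_alert: dict = {}  # (track_id, label) -> last alert time

    def should_alert(self, track_id: int, label: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        key = (track_id, label)
        last = self._last_alert.get(key)
        if last is not None and now - last < self.cooldown_seconds:
            return False  # still inside the cooldown window: suppress
        self._last_alert[key] = now  # record only alerts that fire
        return True
```

Keying on `(track_id, label)` means one noisy track can't flood the alert log, while a second object of the same class still alerts immediately.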

Stack

| Layer | Tech |
| --- | --- |
| Detection / tracking | YOLOv8 (`ultralytics`) with built-in ByteTrack |
| Frame processing | OpenCV |
| Backend | FastAPI · WebSockets · SQLite · Pydantic |
| Frontend | Next.js 16 (App Router) · React 19 · Tailwind v4 |
| LLM | Groq Llama 3.3 70B — OpenAI-compatible · tool calling + structured JSON output. Provider-agnostic via env vars (swap to xAI / OpenAI / Together by changing `LLM_BASE_URL`). |
| Deployment | Docker · docker-compose |
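Provider-agnosticism here amounts to resolving an OpenAI-compatible endpoint from env vars. A minimal sketch using the variable names this README mentions (`GROQ_API_KEY`, `LLM_BASE_URL`, `LLM_MODEL`); the defaults shown, including the Groq model id, are illustrative assumptions rather than the repo's exact values:

```python
import os


def llm_client_config() -> dict:
    """Resolve the OpenAI-compatible LLM endpoint from environment
    variables. Defaults are illustrative; any provider exposing the
    OpenAI API shape works by overriding LLM_BASE_URL / LLM_MODEL."""
    return {
        "base_url": os.environ.get("LLM_BASE_URL", "https://api.groq.com/openai/v1"),
        "api_key": os.environ.get("GROQ_API_KEY", ""),
        "model": os.environ.get("LLM_MODEL", "llama-3.3-70b-versatile"),
    }
```

Swapping providers is then a pure deployment change: no code edits, just different env values.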

Architecture

              ┌────────────────────────────────────────────────────────────────┐
              │                        Next.js 16 frontend                     │
              │   VideoFeed   StatsBar   AlertPanel   QueryBox   ZoneConfig    │
              │                       IncidentReport                           │
              └─────────┬────────────────────┬────────────────────┬────────────┘
                        │                    │                    │
                  /api/query           /api/configure-zone   /api/report
                (NDJSON stream)        (JSON-mode LLM)      (markdown report)
                        │                    │                    │
                        └─────────┬──────────┴────────────────────┘
                                  ▼
                          Groq Llama 3.3 70B  (tool calling)
                                  │
                                  ▼
                       FastAPI backend  (Python)
        ┌──────────────────────┬──────────────────────────────────────┐
        │  /alerts             │   /alerts/stats                      │
        │  /alerts/window      │   /alerts/timeline                   │
        │  /zone (GET / PUT)   │   /ws  (frames + tracking)           │
        └──────────────────────┴──────────────────────────────────────┘
                                  │
                YOLOv8 + ByteTrack ─► SQLite (alert log)

Project layout

sentinelai/
├── docker-compose.yml
├── .env.example
├── backend/
│   ├── Dockerfile
│   ├── main.py          FastAPI · REST + WebSocket + zone API
│   ├── tracker.py       YOLOv8 + ByteTrack
│   ├── zones.py         Runtime-configurable restricted zone (thread-safe)
│   ├── database.py      SQLite alert log + aggregate queries
│   ├── models.py        Pydantic models
│   └── requirements.txt
└── frontend/
    ├── Dockerfile
    └── src/
        ├── app/
        │   ├── page.js
        │   └── api/
        │       ├── query/route.js          ← agentic tool-calling loop
        │       ├── report/route.js         ← incident-report generator
        │       └── configure-zone/route.js ← NL → zone JSON → backend
        └── components/
            ├── VideoFeed.jsx
            ├── StatsBar.jsx
            ├── AlertPanel.jsx
            ├── QueryBox.jsx       ← shows agent's tool-call trace
            ├── IncidentReport.jsx ← Generate Report button + .md download
            └── ZoneConfig.jsx     ← natural-language zone control

Quick start (Docker)

```bash
cp .env.example .env
# edit .env and add your GROQ_API_KEY  (free at https://console.groq.com/keys)

# drop a demo video at backend/sample_video.mp4
docker-compose up --build
```

Open http://localhost:3000.

Quick start (local)

Backend

```bash
cd backend
python -m venv .venv
source .venv/bin/activate         # or .venv\Scripts\activate on Windows
pip install -r requirements.txt
uvicorn main:app --reload --port 8000
```

First run downloads yolov8n.pt (~6 MB).

Frontend

```bash
cd frontend
cp .env.local.example .env.local  # add your GROQ_API_KEY
npm install
npm run dev
```

Open http://localhost:3000.

Try the AI features

Agentic queries (QueryBox):

  • "How many vehicles have entered the zone?"
  • "Which class triggered the most breaches?"
  • "Summarise the last 5 minutes of activity."
  • Watch the trace — you'll see the agent call get_alerts_stats, get_alerts_timeline, etc. Each tool call is streamed live to the UI.

Natural-language zone config (ZoneConfig):

  • "Cover the entire bottom half — vehicle approach lane"
  • "A tight box in the upper-left corner — guard post"
  • The LLM returns validated JSON; the FastAPI backend updates the zone at runtime.
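Validating the LLM's zone JSON before pushing it to the backend might look like the sketch below. The field names and the normalized-coordinate convention are assumptions for illustration, not the repo's actual Pydantic schema in `models.py`:

```python
def validate_zone(zone: dict) -> dict:
    """Validate an LLM-produced zone dict before PUTting it to /zone.

    Assumes an axis-aligned rectangle with corners normalized to [0, 1].
    Hypothetical sketch; the real schema may differ."""
    required = ("x1", "y1", "x2", "y2")
    if any(k not in zone for k in required):
        raise ValueError(f"zone must contain {required}")
    x1, y1, x2, y2 = (float(zone[k]) for k in required)
    if not all(0.0 <= v <= 1.0 for v in (x1, y1, x2, y2)):
        raise ValueError("coordinates must be normalized to [0, 1]")
    if x1 >= x2 or y1 >= y2:
        raise ValueError("zone must have positive width and height")
    return {"x1": x1, "y1": y1, "x2": x2, "y2": y2}
```

Rejecting malformed output here is what makes "LLM writes config" safe: the model proposes, the backend only ever receives a schema-checked rectangle.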

Incident reports (IncidentReport):

  • Click Generate Report → Groq produces a structured Markdown after-action report from live data → downloadable as .md.

API

Surveillance

| Endpoint | Description |
| --- | --- |
| `GET /` | Health check |
| `GET /alerts?limit=N` | Recent breach events |
| `GET /alerts/stats` | Counts by class, unique tracks, totals |
| `GET /alerts/window?start=&end=&label=` | Filtered alerts |
| `GET /alerts/timeline?bucket_seconds=60` | Bucketed counts |
| `DELETE /alerts` | Clear log |
| `GET /zone` / `PUT /zone` | Read / update the restricted zone |
| `WS /ws` | Per-frame: `{ frame: base64, object_count, breach_count }` |
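For example, the bucketed counts behind `/alerts/timeline` could be computed as below. This is a pure-Python sketch of the aggregation, not the repo's actual SQLite query in `database.py`:

```python
from collections import Counter


def bucket_alerts(timestamps: list, bucket_seconds: int = 60) -> dict:
    """Group alert timestamps (epoch seconds) into fixed-width buckets,
    keyed by bucket start time -- one way /alerts/timeline could work."""
    counts = Counter(
        (int(ts) // bucket_seconds) * bucket_seconds for ts in timestamps
    )
    return dict(sorted(counts.items()))
```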

Frontend AI routes

| Route | Method | Behaviour |
| --- | --- | --- |
| `/api/query` | POST | Agentic tool-calling loop, NDJSON stream |
| `/api/report` | POST | Single-call structured Markdown report |
| `/api/configure-zone` | POST | NL → JSON-mode LLM → backend `PUT /zone` |
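A client consuming the `/api/query` NDJSON stream simply parses one JSON event per line. The event fields below (`type`, `tool`, `content`) are illustrative; check `route.js` for the actual shape emitted:

```python
import json


def parse_ndjson(stream_text: str) -> list:
    """Parse an NDJSON response body into a list of event dicts,
    skipping blank lines. Event shape is illustrative."""
    events = []
    for line in stream_text.splitlines():
        line = line.strip()
        if line:
            events.append(json.loads(line))
    return events
```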

How tool-calling works (the interesting bit)

frontend/src/app/api/query/route.js defines a TOOLS schema (OpenAI-compatible function calling). On every POST:

  1. Send messages + tools to Groq Llama 3.3 70B.
  2. If the response contains tool_calls, execute each one against the FastAPI backend, append the results as role: "tool" messages, loop.
  3. Otherwise stream the final text answer.

Each tool call is emitted to the client as an NDJSON event so the UI can render a live reasoning trace. The loop is bounded (max 6 turns) and tool execution is sandboxed to a fixed set of backend endpoints.
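The steps above can be sketched in Python (the repo implements this loop in `route.js`). Here `client.chat` and `execute_tool` are injected stand-ins for the Groq call and the FastAPI fetches, so the names and message shapes are illustrative, not the repo's exact code:

```python
def run_agent(client, execute_tool, messages, tools, max_turns: int = 6):
    """Bounded tool-calling loop: ask the LLM, execute any tool_calls
    against the backend, feed results back as role="tool" messages,
    and stop when the model replies with plain text.

    `client.chat(messages, tools)` and `execute_tool(name, args)` are
    injected stand-ins -- hypothetical interfaces for illustration."""
    for _ in range(max_turns):
        reply = client.chat(messages, tools)
        tool_calls = reply.get("tool_calls")
        if not tool_calls:
            return reply["content"]  # final text answer
        messages.append(reply)  # keep the assistant's tool-call turn
        for call in tool_calls:
            result = execute_tool(call["name"], call.get("arguments", {}))
            messages.append(
                {"role": "tool", "name": call["name"], "content": result}
            )
    return "Stopped after max_turns without a final answer."
```

Bounding the loop and restricting `execute_tool` to a fixed endpoint whitelist is what keeps the agent from looping forever or reaching arbitrary URLs.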

Limitations and future work

  • Tracking robustness: ByteTrack loses IDs across long occlusions. ReID embedding would help.
  • Zones are axis-aligned rectangles. Polygonal zones with cv2.pointPolygonTest would unlock real perimeter geometries.
  • No auth. Single-tenant demo; production deployment would need RBAC for operator vs analyst roles.
  • CPU inference. GPU sizing and yolov8s/m/l/x choice would be tuned per deployment.
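As a dependency-free illustration of the polygonal-zone idea (the README points at `cv2.pointPolygonTest`; this is a plain ray-casting stand-in, not the repo's code):

```python
def point_in_polygon(x: float, y: float, polygon: list) -> bool:
    """Ray-casting point-in-polygon test for a polygon given as a list
    of (x, y) vertices. A pure-Python stand-in for the
    cv2.pointPolygonTest upgrade mentioned above."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at y
            # x-coordinate where the edge crosses that ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```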

Deployment notes

  • Backend: the Procfile works on Railway/Render. CMD: uvicorn main:app --host 0.0.0.0 --port $PORT.
  • Frontend: deploy frontend/ to Vercel. Set GROQ_API_KEY, SENTINEL_API_URL (your backend URL), NEXT_PUBLIC_API_URL, NEXT_PUBLIC_WS_URL (wss://...). To use a different LLM provider, also set LLM_BASE_URL and LLM_MODEL.

Demo checklist

  • backend/sample_video.mp4 exists
  • Live stream loads within 3 seconds
  • Bounding boxes and tracking IDs visible
  • Restricted-zone rectangle drawn on the feed
  • AI Analyst answers "Which class breached most?" and shows tool-call trace
  • "Cover the upper-left" reconfigures the zone live
  • "Generate Report" produces a downloadable Markdown report
