labor-ai

Overview

Labor & AI is a labor market intelligence web app that helps people understand how AI is reshaping their specific occupation. Users search for their job, get an instant AI-generated analysis grounded in BLS and Pew research data, and can ask follow-up questions conversationally.

The core value prop: answers about automation risk, job growth, and wages in plain English with inline citations to real research.

Stack

| Layer | Technology | Host |
| --- | --- | --- |
| Frontend | Next.js 15, React 19, TypeScript, Tailwind CSS v4 | Vercel |
| Backend | FastAPI, Python 3.12 | Railway |
| Database | Postgres 16 + pgvector | Supabase |
| Auth | GitHub OAuth via Supabase Auth | Supabase |
| Rate limiting | Redis | Upstash |
| LLM | Claude Sonnet | Anthropic |
| Embeddings | Voyage AI (voyage-3) | Voyage AI |

Prerequisites

  1. Install Docker Desktop if not already installed (required locally for the Postgres database).

  2. Run the setup command, which will install CLI tools (uv, pnpm) and project dependencies:

    make setup

Configuration

Backend

  1. Copy the example config file:

    cp backend/.env.example backend/.env
  2. Fill in the values below from each external service and save them in the copied config file:

   | Service | Values needed | Where to find them | Notes |
   | --- | --- | --- | --- |
   | Supabase | `SUPABASE_URL`, `SUPABASE_ANON_KEY`, `SUPABASE_JWT_SECRET` | Project Settings → API → "Project URL", "anon public", and "JWT Secret" | Also set up the GitHub OAuth provider under Authentication → Providers. |
   | Anthropic | `ANTHROPIC_API_KEY` | Console → API Keys | Claude Sonnet |
   | Voyage AI | `VOYAGE_API_KEY` | Dashboard → API Keys | Used for generating embeddings at ingestion time and at query time for research search. |
   | Upstash | `UPSTASH_REDIS_REST_URL`, `UPSTASH_REDIS_REST_TOKEN` | Select the Redis database → REST API section | Create a Redis database. Used for per-user rate limiting. |
   | Sentry | `SENTRY_DSN` | Project Settings → Client Keys | Create a Python project for the backend. |
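Once filled in, `backend/.env` would look roughly like this (every value below is a placeholder, not a real credential; `backend/.env.example` is the authoritative list of variables):

```
# Supabase (Project Settings → API)
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_JWT_SECRET=your-jwt-secret

# Anthropic (Console → API Keys)
ANTHROPIC_API_KEY=your-anthropic-key

# Voyage AI (Dashboard → API Keys)
VOYAGE_API_KEY=your-voyage-key

# Upstash (Redis database → REST API)
UPSTASH_REDIS_REST_URL=https://your-db.upstash.io
UPSTASH_REDIS_REST_TOKEN=your-rest-token

# Sentry (Project Settings → Client Keys)
SENTRY_DSN=https://your-public-key@your-org.ingest.sentry.io/your-project-id
```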

Frontend

  1. Copy the example config file:

    cp frontend/.env.local.example frontend/.env.local
  2. Fill in the values below from each external service and save them in the copied config file:

   | Service | Values needed | Where to find them | Notes |
   | --- | --- | --- | --- |
   | Supabase | `NEXT_PUBLIC_SUPABASE_URL`, `NEXT_PUBLIC_SUPABASE_ANON_KEY` | Project Settings → API → "Project URL" and "anon public" | Same Supabase project as the backend. |
   | Sentry | `NEXT_PUBLIC_SENTRY_DSN` | Project Settings → Client Keys | Create a Next.js project for the frontend (separate from the backend Python project). |
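The resulting `frontend/.env.local` would look roughly like this (placeholder values; `frontend/.env.local.example` is the authoritative list):

```
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
NEXT_PUBLIC_SENTRY_DSN=https://your-public-key@your-org.ingest.sentry.io/your-project-id
```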

Databases

Postgres: initializing the database

  1. With Docker running, start the local Postgres db:

    make db-up
  2. Run the database migrations:

    make db-migrate

Postgres: creating a migration

make db-create-migration MSG="describe what changed"
# review the generated file in backend/migrations/versions/
make db-migrate

Running the apps

Backend

To start the API service:

make server-dev

The service listens at http://localhost:8000.

API docs are served at http://localhost:8000/docs.

Frontend

To start the dev server for the web app:

make client-dev

The app is served at http://localhost:3000.

Code quality checks

Backend

Commands for linting, formatting, type-checking, and testing:

make server-lint   # ruff check
make server-format # format check
make server-type   # mypy (strict)
make server-test   # pytest (requires Docker DB running)

To run them all:

make ci-server

Frontend

Commands for linting, formatting, type-checking, and testing:

make client-lint    # ESLint
make client-type    # TypeScript (tsc --noEmit)
make client-test    # Vitest
make client-format  # Prettier format check

To run them all:

make ci-client

CI/CD

Backend

The FastAPI service is automatically deployed to the production Railway project on push/merge to the main branch.

Frontend

The Next.js app is automatically deployed to the production Vercel project on push/merge to the main branch.

Observability

Backend

Errors and performance are tracked via Sentry (Python SDK). Structured JSON logs include request_id, app_user_id, and per-phase timing for each analysis request. Token usage (prompt + output) is persisted to the database per message for cost tracking.
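The structured-log shape described above can be sketched with the standard-library `logging` module (field names mirror this README; the app's actual formatter may differ, and `log_analysis` is a hypothetical helper):

```python
import json
import logging
import uuid


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {"level": record.levelname, "message": record.getMessage()}
        # Merge any structured fields attached via `extra=`.
        payload.update(getattr(record, "extra_fields", {}))
        return json.dumps(payload)


logger = logging.getLogger("labor_ai")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)


def log_analysis(app_user_id: str, phase_timings_ms: dict[str, int]) -> str:
    """Emit one JSON log line for a completed analysis request."""
    request_id = str(uuid.uuid4())
    logger.info(
        "analysis complete",
        extra={"extra_fields": {
            "request_id": request_id,
            "app_user_id": app_user_id,
            "phase_timings_ms": phase_timings_ms,
        }},
    )
    return request_id
```

Each emitted line is a self-contained JSON object, which keeps the logs machine-searchable by `request_id` or `app_user_id`.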

Frontend

--
