```bash
# Set Node version
nvm use

# Install yarn v1
npm i -g yarn

# Install dependencies
yarn

# Set up environment secrets
# Create api/.env.local with your OAuth credentials
cp api/.env api/.env.local
# Edit api/.env.local and add your GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET,
# and generate new ACCESS_TOKEN_SECRET and REFRESH_TOKEN_SECRET with:
# node -e "console.log(require('crypto').randomBytes(64).toString('hex'))"

# Start Docker, create & seed DBs
docker compose up -d

# Run database migrations
yarn migrate:oauth

# Build common types
yarn build

# Start API (terminal 1)
yarn run start:api

# Start web app (terminal 2)
yarn run start:web
```

Create `api/.env.local` (gitignored) with your actual secrets:
```bash
# Copy template
cp api/.env api/.env.local
```

Then edit `api/.env.local` and set:

- **Google OAuth credentials** (from the Google Cloud Console):
  - `GOOGLE_CLIENT_ID`
  - `GOOGLE_CLIENT_SECRET`
- **JWT secrets** (generate new random secrets):

  ```bash
  # Generate ACCESS_TOKEN_SECRET
  node -e "console.log(require('crypto').randomBytes(64).toString('hex'))"

  # Generate REFRESH_TOKEN_SECRET
  node -e "console.log(require('crypto').randomBytes(64).toString('hex'))"
  ```

- **OpenAI API key** (for the journal analysis feature):
  - `OPENAI_API_KEY`: your OpenAI API key from https://platform.openai.com/api-keys

The `.env` file contains default/placeholder values for version control; the gitignored `.env.local` file overrides it with your actual secrets.
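Taken together, the steps above can be scripted. This is a sketch only: every credential value below is a placeholder you must replace, and `openssl rand -hex 64` is used as a commonly available stand-in for the node one-liner.

```bash
# Sketch only -- every credential below is a placeholder, not a working value.
mkdir -p api   # no-op in the repo, where api/ already exists

cat > api/.env.local <<'EOF'
GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=your-client-secret
OPENAI_API_KEY=sk-your-openai-key
EOF

# Fresh 64-byte hex secrets; `openssl rand` stands in for the node one-liner above
echo "ACCESS_TOKEN_SECRET=$(openssl rand -hex 64)" >> api/.env.local
echo "REFRESH_TOKEN_SECRET=$(openssl rand -hex 64)" >> api/.env.local
```

After running it, paste your real Google and OpenAI credentials over the placeholders; only the two token secrets are usable as generated.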
```bash
# Follow logs for Postgres
docker-compose logs -f postgres
```

Run the API tests:

```bash
yarn test:api
```

The e2e tests require the API running on port 3000:

```bash
# Terminal 1: API server
yarn start:api

# Terminal 2: Run tests
yarn test:api:e2e
```

The web tests require both the API (port 3000) and the web app (port 4200) running:

```bash
# Terminal 1: Web dev server
yarn start:web

# Terminal 2: Run Cypress
yarn test:web
```

The prompts service is a daily cronjob that generates personalized reflection questions for users based on their journal analysis and recent entries. When users visit the "Create New Journal" page, they're presented with a thoughtful prompt in a modal to inspire their writing.
- **Daily Generation**: A Kubernetes CronJob runs every morning at 2:00 AM
- **User Scanning**: Iterates through all users in the system
- **Prompt Creation**: For each user with analysis data:
  - Fetches their latest user analysis (themes, patterns)
  - Retrieves their most recent journal entry
  - Calls OpenAI GPT-4o-mini to generate a single reflective question
  - Stores the prompt in the `user_prompts` database table
- **Modal Presentation**: When a user visits `/journals/new`, the modal displays their latest unacknowledged prompt
The `user_prompts` table stores generated prompts:

```sql
CREATE TABLE user_prompts (
  "promptId" VARCHAR(255) PRIMARY KEY,
  "userId" VARCHAR(255) NOT NULL,
  "prompt" TEXT NOT NULL,
  "acknowledgedAt" TIMESTAMP,
  "createdAt" TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY ("userId") REFERENCES "user"("userId") ON DELETE CASCADE
);
```

- **Dismiss** (click outside): Modal closes but will reappear on the next visit (the dismissal is tracked in localStorage)
- **Acknowledge** (click button): Modal closes permanently; calls the API to mark the prompt as acknowledged
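The modal and acknowledge behaviour maps onto two queries against this table. The following is a sketch, not the app's actual query code; the `$1` placeholders stand for bound parameters:

```sql
-- Modal query: latest unacknowledged prompt for a user
SELECT "promptId", "prompt"
FROM user_prompts
WHERE "userId" = $1 AND "acknowledgedAt" IS NULL
ORDER BY "createdAt" DESC
LIMIT 1;

-- Acknowledge: stamp the prompt so the modal stops showing it
UPDATE user_prompts
SET "acknowledgedAt" = CURRENT_TIMESTAMP
WHERE "promptId" = $1;
```

Because the modal filters on `"acknowledgedAt" IS NULL`, acknowledging is a one-way, per-prompt operation, while a localStorage dismissal never touches the database.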
Run the prompts generation script manually:

```bash
# Requires API and database to be running
yarn start:prompts
```

This will:

- Connect to the database
- Scan all users
- Generate prompts for users with analysis data
- Log success/skip/error counts
- Exit when complete
The CronJob is configured in `k8s/base/prompts/cronjob.yaml`:

- **Schedule**: `0 2 * * *` (2:00 AM daily)
- **Concurrency Policy**: `Forbid` (prevents overlapping runs)
- **Job History**: Keeps the last 3 successful and 1 failed job
- **Timeout**: 30 minutes
- **Resources**: 128Mi-512Mi memory, 100m-500m CPU
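Those settings correspond to a manifest roughly like the sketch below. Only the job name, namespace, image tag, and the settings listed above come from this document; the container name, restart policy, and request/limit split are assumptions:

```yaml
# Abridged sketch of k8s/base/prompts/cronjob.yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: prompts-generator
  namespace: journal
spec:
  schedule: "0 2 * * *"           # 2:00 AM daily
  concurrencyPolicy: Forbid       # prevents overlapping runs
  successfulJobsHistoryLimit: 3   # keep last 3 successful jobs
  failedJobsHistoryLimit: 1       # keep last 1 failed job
  jobTemplate:
    spec:
      activeDeadlineSeconds: 1800 # 30-minute timeout
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: prompts       # hypothetical container name
              image: journal-prompts:local
              resources:
                requests:
                  memory: "128Mi"
                  cpu: "100m"
                limits:
                  memory: "512Mi"
                  cpu: "500m"
```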
Build and deploy:

```bash
# Build Docker image
docker build -t journal-prompts:local -f prompts/Dockerfile .

# Deploy to Kubernetes (if using minikube)
kubectl apply -f k8s/base/prompts/cronjob.yaml

# Manually trigger a job for testing
kubectl create job --from=cronjob/prompts-generator manual-prompts-test -n journal

# Check job status
kubectl get jobs -n journal
kubectl logs job/manual-prompts-test -n journal
```