diff --git a/.gitignore b/.gitignore index 6339aeb..ef02837 100644 --- a/.gitignore +++ b/.gitignore @@ -25,6 +25,7 @@ share/python-wheels/ *.egg-info/ .installed.cfg *.egg +*.whl MANIFEST # PyInstaller diff --git a/integration_test/.gcloudignore b/integration_test/.gcloudignore new file mode 100644 index 0000000..34de70c --- /dev/null +++ b/integration_test/.gcloudignore @@ -0,0 +1,24 @@ +# .gcloudignore for integration test Cloud Build +# Include all files needed for the build + +# Ignore node_modules as we'll install them fresh +node_modules/ +generated/ + +# Ignore local auth files (we'll use secrets instead) +sa.json +sa-v2.json +test-config.json + +# Ignore local test artifacts +test_failures.txt +*.log + +# Keep git info for reference +.git +.gitignore + +# Ignore temp files +*.tmp +*~ +.DS_Store \ No newline at end of file diff --git a/integration_test/.gitignore b/integration_test/.gitignore new file mode 100644 index 0000000..2afe58b --- /dev/null +++ b/integration_test/.gitignore @@ -0,0 +1,8 @@ +node_modules/ +generated/ +.test-artifacts/ +*.log +.DS_Store +package-lock.json +firebase-debug.log +firebase-functions-python-local.whl \ No newline at end of file diff --git a/integration_test/README.md b/integration_test/README.md new file mode 100644 index 0000000..74326ab --- /dev/null +++ b/integration_test/README.md @@ -0,0 +1,342 @@ +# Firebase Functions Python SDK Integration Tests + +This directory contains integration tests for the Firebase Functions Python SDK. The framework allows testing Python Firebase Functions by deploying them to real Firebase projects and verifying their behavior through comprehensive test suites. + +## Overview + +The integration test framework: +- Generates Python functions from Handlebars templates with unique test IDs +- Deploys functions to Firebase projects for real-world testing +- Uses Jest tests to verify function behavior +- Supports all Firebase trigger types (Firestore, Database, Storage, Auth, etc.) 
+ +## Prerequisites + +1. **Build the Python SDK**: + ```bash + # From the root firebase-functions-python directory + ./scripts/pack-for-integration-tests.sh + ``` + This creates `integration_test/firebase-functions-python-local.whl` + +2. **Firebase Project**: Python SDK only supports 2nd gen Cloud Functions: + - **All tests**: `functions-integration-tests-v2` + - **Note**: V1/V2 in suite names refers to Firebase service API versions, not Cloud Functions generations + +3. **Dependencies**: + - Node.js 18+ (for test runner and generation scripts) + - Python 3.10+ (for Firebase Functions) + - Firebase CLI (`npm install -g firebase-tools`) + - uv (Python package manager) + +## Quick Start + +### 1. Generate Python Functions + +```bash +cd integration_test + +# Generate all suites +npm run generate + +# Generate specific test suite +node scripts/generate.js v2_firestore + +# List available suites +node scripts/generate.js --list +``` + +### 2. Deploy Functions + +```bash +npm run deploy +# OR manually: +cd generated/functions +firebase deploy --only functions --project functions-integration-tests-v2 +``` + +### 3. Run Tests + +```bash +# Run all tests sequentially +npm run test:all + +# Run specific test suite +npm run test:firestore + +# Run tests in parallel (faster but harder to debug) +npm run test:all:parallel +``` + +### 4. 
Cleanup + +```bash +# Clean up deployed functions +npm run cleanup + +# Remove generated files +npm run clean +``` + +## Project Structure + +``` +integration_test/ +├── config/ +│ └── suites.yaml # Unified test suite configuration +├── templates/ +│ └── functions/ # Python function templates +│ ├── firebase.json.hbs +│ ├── requirements.txt.hbs +│ └── src/ +│ ├── main.py.hbs +│ ├── utils.py.hbs +│ └── v1/v2/ # Trigger-specific templates +├── scripts/ +│ ├── generate.js # Function generator +│ ├── run-tests.js # Test runner +│ ├── cleanup-suite.sh # Cleanup script +│ └── pack-for-integration-tests.sh # SDK build script +├── tests/ # Jest test suites +│ ├── v1/ # V1 function tests +│ └── v2/ # V2 function tests +└── generated/ # Generated functions (gitignored) +``` + +## Configuration + +### Suite Configuration (`config/suites.yaml`) + +```yaml +defaults: + projectId: functions-integration-tests-v2 + region: us-central1 + timeout: 540 + dependencies: + firebase-admin: "^6.0.1" + firebase-functions: "{{sdkTarball}}" # Replaced with wheel path + devDependencies: {} + +suites: + - name: v2_firestore + description: "V2 Firestore trigger tests for Python" + version: v2 + service: firestore + functions: + - name: firestoreOnDocumentCreated + trigger: onDocumentCreated + document: "v2tests/{testId}" +``` + +## How It Works + +1. **Template Generation**: The `generate.js` script: + - Reads YAML configuration for test suites + - Generates Python functions from Handlebars templates + - Injects unique `TEST_RUN_ID` for function isolation + - Creates `requirements.txt` with local SDK wheel + +2. **Function Structure**: Generated Python functions: + ```python + @firestore_fn.on_document_created( + document="tests/{testId}", + region="us-central1", + timeout_sec=540 + ) + def firestoreDocumentOnCreateTests_t24vxpkcr(event): + # Store event context for verification + test_id = event.params.get("testId") + context_data = { + "eventId": event.id, + "timestamp": event.time, + # ... 
other context
+       }
+       firestore.client().collection("firestoreDocumentOnCreateTests").document(test_id).set(context_data)
+   ```
+
+3. **Test Execution**: Jest tests:
+   - Trigger functions by manipulating Firebase resources
+   - Verify function execution by checking stored context data
+   - Wait for function completion using retry logic
+
+## Adding New Trigger Types
+
+To add support for a new trigger type:
+
+1. **Create Template**: Add a template in `templates/functions/src/v2/[service]_tests.py.hbs`
+2. **Update Generator**: Add a mapping in `generate.js`:
+   ```javascript
+   const templateMap = {
+     // ... existing mappings
+     newservice: {
+       v2: "functions/src/v2/newservice_tests.py.hbs"
+     }
+   }
+   ```
+3. **Add Configuration**: Update `config/suites.yaml`
+4. **Add Tests**: Create a Jest test file in `tests/v2/[service].test.ts`
+
+## Environment Variables
+
+- `TEST_RUN_ID` - Override test run ID (default: auto-generated)
+- `PROJECT_ID` - Override project ID from config
+- `REGION` - Override region from config
+- `SDK_PACKAGE` - Path to SDK wheel file
+
+## Supported Trigger Types
+
+The Python SDK only supports V2 Firebase APIs (see `config/suites.yaml`):
+
+- ✅ Firestore (onDocumentCreated, onDocumentDeleted, onDocumentUpdated, onDocumentWritten)
+- 🚧 Realtime Database
+- 🚧 Storage
+- 🚧 Pub/Sub
+- 🚧 Scheduler
+- 🚧 Task Queues
+- 🚧 Identity (blocking)
+- 🚧 Eventarc
+- 🚧 Alerts
+- 🚧 Test Lab
+- 🚧 Remote Config
+
+## Troubleshooting
+
+### Common Issues
+
+1. **SDK not found**: Run `./scripts/pack-for-integration-tests.sh` first
+2. **Import errors**: Ensure the virtual environment is activated
+3. **Deployment fails**: Check Firebase project permissions and quotas
+4. **Tests fail**: Verify TEST_RUN_ID matches the deployed functions
+5. **Build failures**: Check Cloud Build logs in the GCP Console
+6. **Service account permissions**: Ensure the service account has the necessary permissions
+7. 
**Project access**: Verify the project exists and is accessible
+
+### Debug Commands
+
+```bash
+# Check generated functions
+ls -la generated/functions/src/
+
+# View function logs
+firebase functions:log --project functions-integration-tests-v2
+
+# Test locally (limited functionality)
+cd generated/functions
+functions-framework --target=main --debug
+
+# List artifacts in the GCS results bucket
+gsutil ls gs://functions-integration-tests-v2-test-results/
+```
+
+### Cleanup Issues
+
+If functions aren't cleaned up properly:
+```bash
+# Use the cleanup script
+./scripts/cleanup-suite.sh --project functions-integration-tests-v2
+
+# Or via npm
+npm run cleanup
+```
+
+## Cloud Build Integration
+
+The integration tests run via Cloud Build. The Python SDK only supports 2nd gen functions, so all tests deploy to the same project.
+
+### Configuration
+
+#### `cloudbuild.yaml`
+- **Project**: `functions-integration-tests-v2`
+- **What it does**:
+  - Builds the Python SDK wheel
+  - Generates Python functions for all v2 API suites
+  - Deploys to functions-integration-tests-v2
+  - Runs integration tests
+  - Cleans up deployed functions
+
+**Usage** (from repository root):
+```bash
+gcloud builds submit --config=integration_test/cloudbuild.yaml --project=functions-integration-tests-v2 .
+# Or use the npm script:
+npm run cloudbuild
+```
+
+### Configuration Details
+
+#### Build Environment
+- **Machine Type**: `E2_HIGHCPU_8` (for faster builds)
+- **Timeout**: 3600s (1 hour)
+- **Python Version**: 3.11
+- **Node Version**: 20
+
+#### Build Steps
+
+1. 
**SDK Build** (Python 3.11 container):
+   - Installs the `uv` package manager
+   - Builds the wheel from source
+   - Copies the wheel into `integration_test/` (original filename preserved)
+
+2. **Test Execution** (Node 20 container):
+   - Installs npm dependencies
+   - Installs the Firebase CLI
+   - Generates Python functions from templates
+   - Creates a Python venv and installs dependencies
+   - Deploys functions to Firebase
+   - Runs Jest integration tests
+   - Cleans up deployed functions
+
+#### Artifacts
+Each build stores:
+- Test logs in `integration_test/logs/*.log`
+
+Artifacts are uploaded to:
+- `gs://functions-integration-tests-v2-test-results/${BUILD_ID}`
+
+### Automated Triggers
+
+You can set up Cloud Build triggers to run on:
+- **Pull requests**: Use `cloudbuild.yaml` for quick feedback
+- **Merges to main**: Run the full test suite with `cloudbuild.yaml`
+- **Nightly builds**: Run comprehensive tests across all suites
+
+### Manual CI/CD Steps
+
+For custom CI/CD pipelines:
+
+```bash
+# Build the SDK
+./scripts/pack-for-integration-tests.sh
+
+# Generate functions
+cd integration_test
+node scripts/generate.js 'v2_*'  # Or specific suites
+
+# Deploy and test
+cd generated/functions
+python3.11 -m venv venv
+source venv/bin/activate
+pip install -r requirements.txt
+firebase deploy --project functions-integration-tests-v2 --token $FIREBASE_TOKEN
+cd ../..
+npm run test:all:sequential
+npm run cleanup
+```
+
+## Contributing
+
+When adding new features:
+1. Follow the existing template patterns
+2. Ensure Python code follows PEP 8
+3. Test with both local and deployed functions
+4. Update this documentation
+
+## License
+
+Apache 2.0 - See LICENSE file in the root directory. 
\ No newline at end of file diff --git a/integration_test/cloudbuild.yaml b/integration_test/cloudbuild.yaml new file mode 100644 index 0000000..92a5c67 --- /dev/null +++ b/integration_test/cloudbuild.yaml @@ -0,0 +1,142 @@ +# Cloud Build configuration for Firebase Functions Integration Tests (Python) +# Python SDK only supports 2nd gen functions +# Runs test suites on functions-integration-tests-v2 project + +options: + machineType: 'E2_HIGHCPU_8' + logging: CLOUD_LOGGING_ONLY + +timeout: '3600s' + +substitutions: + _PROJECT_ID: 'functions-integration-tests-v2' + +steps: + # Create storage bucket for test results if it doesn't exist + - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:stable' + id: 'create-bucket' + entrypoint: 'bash' + args: + - '-c' + - | + # Create bucket for test results if it doesn't exist + BUCKET_NAME="gs://${_PROJECT_ID}-test-results" + echo "Checking if bucket $$BUCKET_NAME exists..." + if ! gsutil ls "$$BUCKET_NAME" &>/dev/null; then + echo "Creating bucket $$BUCKET_NAME..." + gsutil mb -p "${_PROJECT_ID}" "$$BUCKET_NAME" + else + echo "Bucket $$BUCKET_NAME already exists" + fi + + # Step 1: Build Python SDK wheel + # NOTE: Build is now submitted from repo root, so /workspace = repo root + - name: 'python:3.11' + id: 'build-python-sdk' + entrypoint: 'bash' + args: + - '-c' + - | + # Install uv for Python package management + echo "Installing uv..." + pip install uv + + # Verify we're in the repo root + echo "Current directory: $(pwd)" + ls -la pyproject.toml + + # Build the firebase-functions Python SDK from source + echo "Building firebase-functions Python SDK from source..." + uv build + + # Copy the wheel to integration_test directory (preserve original filename) + echo "Copying wheel to integration_test directory..." + cp dist/*.whl integration_test/ + + # Verify the wheel was copied and show actual filename + echo "Checking if wheel exists..." 
+ WHEEL_FILE=$$(ls integration_test/*.whl 2>/dev/null | head -n 1) + if [ -n "$$WHEEL_FILE" ]; then + ls -lh "$$WHEEL_FILE" + echo "✅ Wheel exists at: $$WHEEL_FILE" + else + echo "❌ ERROR: No wheel file found in integration_test/" + ls -la integration_test/ || echo "integration_test directory doesn't exist" + exit 1 + fi + + echo "Python SDK built and packaged successfully" + + # Step 2: Run integration tests using unified test runner + - name: 'node:20' + id: 'run-tests' + entrypoint: 'bash' + args: + - '-c' + - | + set -e # Exit on error + + # Install Python 3.11, gcloud, and other dependencies in node:20 image + echo "Installing Python 3.11 and gcloud..." + apt-get update -qq + apt-get install -y -qq python3.11 python3.11-venv python3-pip curl apt-transport-https ca-certificates gnupg + + # Install gcloud SDK + echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" | tee -a /etc/apt/sources.list.d/google-cloud-sdk.list + curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key --keyring /usr/share/keyrings/cloud.google.gpg add - + apt-get update -qq && apt-get install -y -qq google-cloud-sdk + + # Verify installations + python3.11 --version + gcloud --version + + # Change to integration_test directory + # NOTE: /workspace is now repo root, so cd into integration_test + cd integration_test + + # Debug: Check current directory and contents + echo "Current directory: $(pwd)" + ls -la + + # Install Node.js test dependencies + echo "Installing Node.js dependencies..." + npm ci + + # Install firebase-tools globally + echo "Installing Firebase CLI..." 
+ npm install -g firebase-tools + + # Verify tools are installed + firebase --version + node --version + + # Create logs directory + mkdir -p logs + + # Set project ID for tests + export PROJECT_ID=${_PROJECT_ID} + echo "Running tests on project: ${PROJECT_ID}" + + # Use unified test runner (handles generate, deploy, test, cleanup automatically) + echo "Running integration tests..." + + # Find the actual wheel filename + WHEEL_FILE=$$(ls firebase*.whl 2>/dev/null | head -n 1) + if [ -z "$$WHEEL_FILE" ]; then + echo "❌ ERROR: No wheel file found" + exit 1 + fi + echo "Using wheel: $$WHEEL_FILE" + + node scripts/run-tests.js \ + --sequential \ + v2_firestore v2_database v2_storage v2_pubsub v2_eventarc v2_tasks v2_remoteconfig v2_scheduler v2_testlab v2_identity \ + --use-published-sdk=file:$$WHEEL_FILE + +# Artifacts to store +# NOTE: Paths are relative to /workspace which is now repo root +artifacts: + objects: + location: 'gs://${_PROJECT_ID}-test-results/${BUILD_ID}' + paths: + - 'integration_test/logs/*.log' \ No newline at end of file diff --git a/integration_test/config/suites.schema.json b/integration_test/config/suites.schema.json new file mode 100644 index 0000000..a20cccb --- /dev/null +++ b/integration_test/config/suites.schema.json @@ -0,0 +1,422 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "$id": "https://firebase.google.com/schemas/functions-integration-test-suites.json", + "title": "Firebase Functions Integration Test Suites Configuration", + "description": "Schema for the unified Firebase Functions integration test suite configuration", + "type": "object", + "required": ["defaults", "suites"], + "additionalProperties": false, + "properties": { + "defaults": { + "type": "object", + "description": "Default values applied to all suites unless overridden", + "required": ["projectId", "region", "timeout", "dependencies", "devDependencies"], + "additionalProperties": false, + "properties": { + "projectId": { + "type": "string", + 
"description": "Default Firebase project ID for deployments", + "pattern": "^[a-z0-9-]+$", + "minLength": 6, + "maxLength": 30, + "default": "functions-integration-tests-v2" + }, + "region": { + "type": "string", + "description": "Default deployment region", + "enum": [ + "us-central1", + "us-east1", + "us-east4", + "us-west1", + "us-west2", + "us-west3", + "us-west4", + "europe-west1", + "europe-west2", + "europe-west3", + "europe-west6", + "europe-central2", + "asia-east1", + "asia-east2", + "asia-northeast1", + "asia-northeast2", + "asia-northeast3", + "asia-south1", + "asia-southeast1", + "asia-southeast2", + "australia-southeast1", + "northamerica-northeast1", + "southamerica-east1" + ], + "default": "us-central1" + }, + "timeout": { + "type": "integer", + "description": "Default function timeout in seconds", + "minimum": 1, + "maximum": 540, + "default": 540 + }, + "dependencies": { + "type": "object", + "description": "Default npm dependencies for generated functions", + "properties": { + "firebase-admin": { + "type": "string", + "description": "Firebase Admin SDK version", + "pattern": "^(\\^|~)?\\d+\\.\\d+\\.\\d+$|^\\{\\{sdkTarball\\}\\}$" + }, + "firebase-functions": { + "type": "string", + "description": "Firebase Functions SDK version or template variable", + "pattern": "^(\\^|~)?\\d+\\.\\d+\\.\\d+$|^\\{\\{sdkTarball\\}\\}$|^file:" + } + }, + "additionalProperties": { + "type": "string", + "description": "Additional dependency with version specification" + } + }, + "devDependencies": { + "type": "object", + "description": "Default npm dev dependencies for generated functions", + "properties": { + "typescript": { + "type": "string", + "description": "TypeScript version", + "pattern": "^(\\^|~)?\\d+\\.\\d+\\.\\d+$" + } + }, + "additionalProperties": { + "type": "string", + "description": "Additional dev dependency with version specification" + } + } + } + }, + "suites": { + "type": "array", + "description": "Array of test suite configurations", + 
"minItems": 1, + "items": { + "type": "object", + "required": ["name", "description", "version", "service", "functions"], + "additionalProperties": false, + "properties": { + "name": { + "type": "string", + "description": "Unique identifier for the test suite", + "pattern": "^v2_[a-z0-9_]+$", + "examples": ["v2_firestore", "v2_database", "v2_identity"] + }, + "projectId": { + "type": "string", + "description": "Override default project ID for this suite", + "pattern": "^[a-z0-9-]+$", + "minLength": 6, + "maxLength": 30 + }, + "region": { + "type": "string", + "description": "Override default region for this suite", + "enum": [ + "us-central1", + "us-east1", + "us-east4", + "us-west1", + "us-west2", + "us-west3", + "us-west4", + "europe-west1", + "europe-west2", + "europe-west3", + "europe-west6", + "europe-central2", + "asia-east1", + "asia-east2", + "asia-northeast1", + "asia-northeast2", + "asia-northeast3", + "asia-south1", + "asia-southeast1", + "asia-southeast2", + "australia-southeast1", + "northamerica-northeast1", + "southamerica-east1" + ] + }, + "description": { + "type": "string", + "description": "Human-readable description of the test suite", + "minLength": 1 + }, + "version": { + "type": "string", + "description": "Firebase Functions API version (Python SDK only supports v2)", + "enum": ["v2"] + }, + "service": { + "type": "string", + "description": "Firebase service being tested", + "enum": [ + "firestore", + "database", + "pubsub", + "storage", + "auth", + "tasks", + "remoteconfig", + "testlab", + "scheduler", + "identity", + "alerts", + "eventarc" + ] + }, + "dependencies": { + "type": "object", + "description": "Override default dependencies for this suite", + "additionalProperties": { + "type": "string", + "description": "Dependency with version specification" + } + }, + "devDependencies": { + "type": "object", + "description": "Override default dev dependencies for this suite", + "additionalProperties": { + "type": "string", + "description": 
"Dev dependency with version specification" + } + }, + "functions": { + "type": "array", + "description": "Array of function configurations for this suite", + "minItems": 1, + "items": { + "type": "object", + "required": ["name"], + "additionalProperties": true, + "properties": { + "name": { + "type": "string", + "description": "Function name (TEST_RUN_ID will be appended)", + "pattern": "^[a-zA-Z][a-zA-Z0-9]*$", + "minLength": 1, + "maxLength": 62 + }, + "trigger": { + "type": "string", + "description": "Trigger type for the function", + "minLength": 1 + }, + "decorator": { + "type": "string", + "description": "Python decorator name (e.g., on_document_created, on_value_deleted)", + "pattern": "^[a-z_]+$", + "examples": ["on_document_created", "on_value_created", "on_object_archived"] + }, + "eventType": { + "type": "string", + "description": "Event type suffix for CloudEvent type field (e.g., created, deleted, metadataUpdated)", + "minLength": 1, + "examples": ["created", "deleted", "updated", "written", "metadataUpdated", "archived", "finalized"] + }, + "type": { + "type": "string", + "description": "Type field for identity platform functions", + "enum": ["beforeUserCreated", "beforeUserSignedIn"] + }, + "timeout": { + "type": "integer", + "description": "Override default timeout for this function", + "minimum": 1, + "maximum": 540 + }, + "collection": { + "type": "string", + "description": "Firestore collection name (defaults to function name)", + "pattern": "^[a-zA-Z][a-zA-Z0-9]*$" + }, + "document": { + "type": "string", + "description": "Firestore document path pattern", + "examples": ["tests/{testId}", "users/{userId}/posts/{postId}"] + }, + "topic": { + "type": "string", + "description": "Pub/Sub topic name", + "pattern": "^[a-zA-Z][a-zA-Z0-9-_]*$" + }, + "schedule": { + "type": "string", + "description": "Cron schedule for scheduled functions", + "examples": ["every 10 hours", "every 5 minutes", "0 */12 * * *"] + }, + "bucket": { + "type": "string", + 
"description": "Storage bucket name" + }, + "queue": { + "type": "string", + "description": "Cloud Tasks queue name" + }, + "alertType": { + "type": "string", + "description": "Type of alert for alert triggers" + }, + "database": { + "type": "string", + "description": "Realtime Database instance URL" + }, + "path": { + "type": "string", + "description": "Database or storage path pattern" + }, + "blocking": { + "type": "boolean", + "description": "Whether this is a blocking auth function", + "default": false + } + }, + "allOf": [ + { + "if": { + "properties": { + "trigger": { + "enum": [ + "onDocumentCreated", + "onDocumentDeleted", + "onDocumentUpdated", + "onDocumentWritten" + ] + } + }, + "required": ["trigger"] + }, + "then": { + "required": ["document"] + } + }, + { + "if": { + "properties": { + "trigger": { + "enum": ["onCreate", "onDelete", "onUpdate", "onWrite"] + }, + "document": { + "type": "string" + } + }, + "required": ["trigger", "document"] + }, + "then": { + "required": ["document"] + } + }, + { + "if": { + "properties": { + "trigger": { + "enum": ["onCreate", "onDelete", "onUpdate", "onWrite"] + }, + "path": { + "type": "string" + } + }, + "required": ["trigger", "path"] + }, + "then": { + "required": ["path"] + } + }, + { + "if": { + "properties": { + "trigger": { + "enum": [ + "onValueCreated", + "onValueDeleted", + "onValueUpdated", + "onValueWritten" + ] + } + }, + "required": ["trigger"] + }, + "then": { + "required": ["path"] + } + }, + { + "if": { + "properties": { + "trigger": { + "enum": ["onPublish", "onMessagePublished"] + } + }, + "required": ["trigger"] + }, + "then": { + "required": ["topic"] + } + }, + { + "if": { + "properties": { + "trigger": { + "enum": ["onRun", "onSchedule"] + } + }, + "required": ["trigger"] + }, + "then": { + "required": ["schedule"] + } + } + ] + } + } + } + }, + "uniqueItems": true + } + }, + "definitions": { + "versionPattern": { + "type": "string", + "pattern": "^(\\^|~)?\\d+\\.\\d+\\.\\d+$", + 
"description": "Semantic version with optional range specifier" + }, + "firebaseRegion": { + "type": "string", + "enum": [ + "us-central1", + "us-east1", + "us-east4", + "us-west1", + "us-west2", + "us-west3", + "us-west4", + "europe-west1", + "europe-west2", + "europe-west3", + "europe-west6", + "europe-central2", + "asia-east1", + "asia-east2", + "asia-northeast1", + "asia-northeast2", + "asia-northeast3", + "asia-south1", + "asia-southeast1", + "asia-southeast2", + "australia-southeast1", + "northamerica-northeast1", + "southamerica-east1" + ], + "description": "Valid Firebase Functions deployment regions" + } + } +} diff --git a/integration_test/config/suites.yaml b/integration_test/config/suites.yaml new file mode 100644 index 0000000..b06999a --- /dev/null +++ b/integration_test/config/suites.yaml @@ -0,0 +1,205 @@ +# Firebase Functions Python Integration Test Suites Configuration +# Python SDK only supports V2 APIs - all tests deploy to functions-integration-tests-v2 + +defaults: + projectId: functions-integration-tests-v2 + region: us-central1 + timeout: 540 + dependencies: + firebase-admin: "^6.0.1" + firebase-functions: "{{sdkTarball}}" + devDependencies: {} + +suites: + # ============================================================================ + # V2 API Suites (Python SDK only supports V2 Firebase APIs) + # ============================================================================ + + # V2 Firestore triggers + - name: v2_firestore + description: "V2 Firestore trigger tests for Python" + version: v2 + service: firestore + functions: + - name: firestoreOnDocumentCreated + trigger: onDocumentCreated + decorator: on_document_created + eventType: created + document: "tests/{testId}" + collection: firestoreOnDocumentCreatedTests + - name: firestoreOnDocumentDeleted + trigger: onDocumentDeleted + decorator: on_document_deleted + eventType: deleted + document: "tests/{testId}" + collection: firestoreOnDocumentDeletedTests + - name: 
firestoreOnDocumentUpdated + trigger: onDocumentUpdated + decorator: on_document_updated + eventType: updated + document: "tests/{testId}" + collection: firestoreOnDocumentUpdatedTests + - name: firestoreOnDocumentWritten + trigger: onDocumentWritten + decorator: on_document_written + eventType: written + document: "tests/{testId}" + collection: firestoreOnDocumentWrittenTests + + # V2 Realtime Database triggers + - name: v2_database + description: "V2 Realtime Database trigger tests for Python" + version: v2 + service: database + functions: + - name: databaseOnValueCreated + trigger: onValueCreated + decorator: on_value_created + eventType: created + path: "databaseCreatedTests/{testId}/start" + collection: databaseCreatedTests + - name: databaseOnValueDeleted + trigger: onValueDeleted + decorator: on_value_deleted + eventType: deleted + path: "databaseDeletedTests/{testId}/start" + collection: databaseDeletedTests + - name: databaseOnValueUpdated + trigger: onValueUpdated + decorator: on_value_updated + eventType: updated + path: "databaseUpdatedTests/{testId}/start" + collection: databaseUpdatedTests + - name: databaseOnValueWritten + trigger: onValueWritten + decorator: on_value_written + eventType: written + path: "databaseWrittenTests/{testId}/start" + collection: databaseWrittenTests + + # V2 Pub/Sub triggers + - name: v2_pubsub + description: "V2 Pub/Sub trigger tests for Python" + version: v2 + service: pubsub + functions: + - name: pubsubOnMessagePublished + trigger: onMessagePublished + decorator: on_message_published + eventType: messagePublished + topic: custom_message_tests + collection: pubsubOnMessagePublishedTests + + # V2 Storage triggers + - name: v2_storage + description: "V2 Storage trigger tests for Python" + version: v2 + service: storage + functions: + - name: storageOnObjectArchived + trigger: onObjectArchived + decorator: on_object_archived + eventType: archived + bucket: functions-integration-tests-v2.firebasestorage.app + collection: 
storageOnObjectArchivedTests + - name: storageOnObjectDeleted + trigger: onObjectDeleted + decorator: on_object_deleted + eventType: deleted + bucket: functions-integration-tests-v2.firebasestorage.app + collection: storageOnObjectDeletedTests + - name: storageOnObjectFinalized + trigger: onObjectFinalized + decorator: on_object_finalized + eventType: finalized + bucket: functions-integration-tests-v2.firebasestorage.app + collection: storageOnObjectFinalizedTests + - name: storageOnObjectMetadataUpdated + trigger: onObjectMetadataUpdated + decorator: on_object_metadata_updated + eventType: metadataUpdated + bucket: functions-integration-tests-v2.firebasestorage.app + collection: storageOnObjectMetadataUpdatedTests + + # V2 Scheduler triggers + - name: v2_scheduler + description: "V2 Scheduled function tests for Python" + version: v2 + service: scheduler + functions: + - name: schedulerOnSchedule + trigger: onSchedule + decorator: on_schedule + schedule: "every 1 minutes" + collection: schedulerOnScheduleV2Tests + + # V2 Task Queue functions + - name: v2_tasks + description: "V2 Task Queue function tests for Python" + version: v2 + service: tasks + functions: + - name: taskQueueOnEnqueue + trigger: onTaskDispatched + decorator: on_task_dispatched + queueName: "v2-test-queue" + collection: tasksOnTaskDispatchedTests + + # V2 Identity triggers + - name: v2_identity + description: "V2 Identity trigger tests for Python" + version: v2 + service: identity + functions: + - name: identityBeforeUserCreated + trigger: beforeUserCreated + decorator: before_user_created + collection: identityBeforeUserCreatedTests + - name: identityBeforeUserSignedIn + trigger: beforeUserSignedIn + decorator: before_user_signed_in + collection: identityBeforeUserSignedInTests + + # V2 Eventarc triggers + - name: v2_eventarc + description: "V2 Eventarc trigger tests for Python" + version: v2 + service: eventarc + functions: + - name: eventarcOnCustomEventPublished + trigger: 
onCustomEventPublished + decorator: on_custom_event_published + eventType: achieved-leaderboard + collection: eventarcOnCustomEventPublishedTests + + # V2 Alerts triggers + - name: v2_alerts + description: "V2 Alerts trigger tests for Python" + version: v2 + service: alerts + functions: + - name: alertsOnAlertPublished + trigger: onAlertPublished + alertType: "billing.planAutomatedUpdatePublished" + + # V2 Test Lab triggers + - name: v2_testlab + description: "V2 Test Lab trigger tests for Python" + version: v2 + service: testlab + functions: + - name: testLabOnTestMatrixCompleted + trigger: onTestMatrixCompleted + decorator: on_test_matrix_completed + collection: testLabOnTestMatrixCompletedTests + + # V2 Remote Config triggers + - name: v2_remoteconfig + description: "V2 Remote Config trigger tests for Python" + version: v2 + service: remoteconfig + functions: + - name: remoteConfigOnConfigUpdated + trigger: onConfigUpdated + decorator: on_config_updated + collection: remoteConfigOnConfigUpdatedTests diff --git a/integration_test/jest.config.js b/integration_test/jest.config.js new file mode 100644 index 0000000..a49270b --- /dev/null +++ b/integration_test/jest.config.js @@ -0,0 +1,12 @@ +/** @type {import('jest').Config} */ +const config = { + preset: "ts-jest", + testEnvironment: "node", + testMatch: ["**/tests/**/*.test.ts"], + testTimeout: 120_000, + transform: { + "^.+\\.(t|j)s$": ["ts-jest", { tsconfig: "tsconfig.test.json" }], + }, +}; + +export default config; \ No newline at end of file diff --git a/integration_test/package.json b/integration_test/package.json new file mode 100644 index 0000000..b81529f --- /dev/null +++ b/integration_test/package.json @@ -0,0 +1,39 @@ +{ + "name": "integration-test-declarative", + "version": "1.0.0", + "type": "module", + "description": "Declarative Firebase Functions integration tests", + "scripts": { + "generate": "node scripts/generate.js", + "test": "jest --forceExit", + "run-tests": "node scripts/run-tests.js", + 
"deploy": "cd generated/functions && firebase deploy --only functions --project functions-integration-tests-v2", + "cloudbuild": "cd .. && gcloud builds submit --config=integration_test/cloudbuild.yaml --project=functions-integration-tests-v2 .", + "test:firestore": "PROJECT_ID=functions-integration-tests-v2 node scripts/run-tests.js v2_firestore", + "test:database": "PROJECT_ID=functions-integration-tests-v2 node scripts/run-tests.js v2_database", + "test:all": "PROJECT_ID=functions-integration-tests-v2 node scripts/run-tests.js --sequential 'v2_*'", + "test:all:sequential": "PROJECT_ID=functions-integration-tests-v2 node scripts/run-tests.js --sequential 'v2_*'", + "test:all:parallel": "PROJECT_ID=functions-integration-tests-v2 node scripts/run-tests.js 'v2_*'", + "cleanup": "./scripts/cleanup-suite.sh --project functions-integration-tests-v2", + "cleanup:list": "./scripts/cleanup-suite.sh --list-artifacts", + "clean": "rm -rf generated/*", + "hard-reset": "./scripts/hard-reset.sh" + }, + "dependencies": { + "@google-cloud/pubsub": "^4.0.0", + "ajv": "^8.17.1", + "chalk": "^4.1.2", + "firebase-admin": "^12.0.0" + }, + "devDependencies": { + "@google-cloud/tasks": "^6.2.0", + "@types/jest": "^29.5.11", + "@types/node": "^20.10.5", + "firebase": "^12.2.1", + "handlebars": "^4.7.8", + "jest": "^29.7.0", + "ts-jest": "^29.1.1", + "typescript": "^5.3.3", + "yaml": "^2.3.4" + } +} diff --git a/integration_test/scripts/cleanup-auth-users.cjs b/integration_test/scripts/cleanup-auth-users.cjs new file mode 100644 index 0000000..efa83e4 --- /dev/null +++ b/integration_test/scripts/cleanup-auth-users.cjs @@ -0,0 +1,58 @@ +#!/usr/bin/env node + +/** + * Cleanup script for auth users created during tests + * Usage: node cleanup-auth-users.js + */ + +const admin = require("firebase-admin"); + +const testRunId = process.argv[2]; +const projectId = process.env.PROJECT_ID || "functions-integration-tests-v2"; + +if (!testRunId) { + console.error("Usage: node cleanup-auth-users.js 
<testRunId>"); + process.exit(1); +} + +// Initialize admin SDK +if (!admin.apps.length) { + admin.initializeApp({ + projectId, + }); +} + +async function cleanupAuthUsers() { + try { + console.log(`Cleaning up auth users with TEST_RUN_ID: ${testRunId}`); + + // List all users and find ones created by this test run + let pageToken; + let deletedCount = 0; + + do { + const listUsersResult = await admin.auth().listUsers(1000, pageToken); + + for (const user of listUsersResult.users) { + // Check if user email contains the test run ID + if (user.email && user.email.includes(testRunId)) { + try { + await admin.auth().deleteUser(user.uid); + console.log(` Deleted user: ${user.email}`); + deletedCount++; + } catch (error) { + console.error(` Failed to delete user ${user.email}: ${error.message}`); + } + } + } + + pageToken = listUsersResult.pageToken; + } while (pageToken); + + console.log(` Deleted ${deletedCount} test users`); + } catch (error) { + console.error("Error cleaning up auth users:", error); + } +} + +cleanupAuthUsers().then(() => process.exit(0)); \ No newline at end of file diff --git a/integration_test/scripts/cleanup-suite.sh b/integration_test/scripts/cleanup-suite.sh new file mode 100755 index 0000000..3749204 --- /dev/null +++ b/integration_test/scripts/cleanup-suite.sh @@ -0,0 +1,223 @@ +#!/bin/bash + +# Cleanup script for deployed functions +# Usage: +# ./scripts/cleanup-suite.sh # Uses saved metadata +# ./scripts/cleanup-suite.sh <TEST_RUN_ID> # Cleanup specific run +# ./scripts/cleanup-suite.sh --pattern <pattern> [project_id] # Cleanup by pattern + +set -e + +# Colors for output +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +BLUE='\033[0;34m' +NC='\033[0m' # No Color + +# Get directories +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +ROOT_DIR="$(dirname "$SCRIPT_DIR")" +METADATA_FILE="$ROOT_DIR/generated/.metadata.json" +ARTIFACTS_DIR="$ROOT_DIR/.test-artifacts" + +echo -e "${BLUE}═══════════════════════════════════════════════════════════${NC}" +echo -e "${GREEN}🧹 
Firebase Functions Cleanup Tool${NC}" +echo -e "${BLUE}═══════════════════════════════════════════════════════════${NC}" + +# Function to cleanup by TEST_RUN_ID +cleanup_by_id() { + local TEST_RUN_ID="$1" + local PROJECT_ID="$2" + local METADATA_SOURCE="$3" # Optional metadata file + + echo -e "${YELLOW}🗑️ Cleaning up TEST_RUN_ID: $TEST_RUN_ID${NC}" + echo -e "${YELLOW} Project: $PROJECT_ID${NC}" + + # Delete functions + echo -e "${YELLOW} Deleting functions...${NC}" + + # Try to get function names from metadata if available + if [ -n "$METADATA_SOURCE" ] && [ -f "$METADATA_SOURCE" ]; then + # Extract function names from metadata + FUNCTIONS=$(grep -o '"[^"]*_'${TEST_RUN_ID}'"' "$METADATA_SOURCE" | tr -d '"') + + if [ -n "$FUNCTIONS" ]; then + for FUNCTION in $FUNCTIONS; do + echo " Deleting function: $FUNCTION" + firebase functions:delete "$FUNCTION" --project "$PROJECT_ID" --force 2>/dev/null || true + done + fi + else + # Fallback: try common patterns + echo " No metadata found, trying common function patterns..." 
+ FUNCTION_PATTERNS=( + "firestoreDocumentOnCreateTests_${TEST_RUN_ID}" + "firestoreDocumentOnDeleteTests_${TEST_RUN_ID}" + "firestoreDocumentOnUpdateTests_${TEST_RUN_ID}" + "firestoreDocumentOnWriteTests_${TEST_RUN_ID}" + "databaseRefOnCreateTests_${TEST_RUN_ID}" + "databaseRefOnDeleteTests_${TEST_RUN_ID}" + "databaseRefOnUpdateTests_${TEST_RUN_ID}" + "databaseRefOnWriteTests_${TEST_RUN_ID}" + ) + + for FUNCTION in "${FUNCTION_PATTERNS[@]}"; do + firebase functions:delete "$FUNCTION" --project "$PROJECT_ID" --force 2>/dev/null || true + done + fi + + # Clean up Firestore collections + echo -e "${YELLOW} Cleaning up Firestore test data...${NC}" + for COLLECTION in firestoreDocumentOnCreateTests firestoreDocumentOnDeleteTests firestoreDocumentOnUpdateTests firestoreDocumentOnWriteTests; do + firebase firestore:delete "$COLLECTION/$TEST_RUN_ID" --project "$PROJECT_ID" --yes 2>/dev/null || true + done + + # Clean up Realtime Database paths + # Use DB_PATH, not PATH: assigning to PATH would clobber the shell's + # command search path and break the firebase CLI calls below + echo -e "${YELLOW} Cleaning up Realtime Database test data...${NC}" + for DB_PATH in databaseRefOnCreateTests databaseRefOnDeleteTests databaseRefOnUpdateTests databaseRefOnWriteTests; do + firebase database:remove "/$DB_PATH/$TEST_RUN_ID" --project "$PROJECT_ID" --force 2>/dev/null || true + done +} + +# Function to save artifact for future cleanup +save_artifact() { + if [ -f "$METADATA_FILE" ]; then + mkdir -p "$ARTIFACTS_DIR" + + # Extract metadata + TEST_RUN_ID=$(grep '"testRunId"' "$METADATA_FILE" | cut -d'"' -f4) + PROJECT_ID=$(grep '"projectId"' "$METADATA_FILE" | cut -d'"' -f4) + SUITE=$(grep '"suite"' "$METADATA_FILE" | cut -d'"' -f4) + + # Save artifact with timestamp + ARTIFACT_FILE="$ARTIFACTS_DIR/${TEST_RUN_ID}.json" + cp "$METADATA_FILE" "$ARTIFACT_FILE" + + echo -e "${GREEN}✓ Saved cleanup artifact: $ARTIFACT_FILE${NC}" + fi +} + +# Parse arguments +if [ "$1" == "--save-artifact" ]; then + # Save current metadata as artifact for future cleanup + save_artifact + exit 0 + +elif [ "$1" == "--pattern" ]; then + # Cleanup by 
pattern + PATTERN="$2" + PROJECT_ID="${3:-$PROJECT_ID}" + + if [ -z "$PATTERN" ] || [ -z "$PROJECT_ID" ]; then + echo -e "${RED}❌ Usage: $0 --pattern <pattern> [project_id]${NC}" + exit 1 + fi + + echo -e "${YELLOW}🗑️ Cleaning up functions matching pattern: $PATTERN${NC}" + echo -e "${YELLOW} Project: $PROJECT_ID${NC}" + + FUNCTIONS=$(firebase functions:list --project "$PROJECT_ID" 2>/dev/null | grep "$PATTERN" | awk '{print $1}' || true) + + if [ -z "$FUNCTIONS" ]; then + echo -e "${YELLOW} No functions found matching pattern: $PATTERN${NC}" + else + for FUNCTION in $FUNCTIONS; do + echo " Deleting function: $FUNCTION" + firebase functions:delete "$FUNCTION" --project "$PROJECT_ID" --force 2>/dev/null || true + done + fi + +elif [ "$1" == "--list-artifacts" ]; then + # List saved artifacts + echo -e "${BLUE}📋 Saved test artifacts:${NC}" + if [ -d "$ARTIFACTS_DIR" ]; then + for artifact in "$ARTIFACTS_DIR"/*.json; do + if [ -f "$artifact" ]; then + TEST_RUN_ID=$(grep '"testRunId"' "$artifact" | cut -d'"' -f4) + PROJECT_ID=$(grep '"projectId"' "$artifact" | cut -d'"' -f4) + SUITE=$(grep '"suite"' "$artifact" | cut -d'"' -f4) + TIMESTAMP=$(grep '"generatedAt"' "$artifact" | cut -d'"' -f4) + echo -e "${GREEN} • $TEST_RUN_ID${NC} ($SUITE) - $PROJECT_ID - $TIMESTAMP" + fi + done + else + echo -e "${YELLOW} No artifacts found${NC}" + fi + +elif [ "$1" == "--clean-artifacts" ]; then + # Clean all artifacts and their deployed functions + if [ ! -d "$ARTIFACTS_DIR" ]; then + echo -e "${YELLOW}No artifacts to clean${NC}" + exit 0 + fi + + echo -e "${YELLOW}⚠️ This will clean up ALL saved test runs!${NC}" + read -p "Are you sure? (yes/no): " -r + if [[ ! 
$REPLY == "yes" ]]; then + echo -e "${GREEN}Cancelled${NC}" + exit 0 + fi + + for artifact in "$ARTIFACTS_DIR"/*.json; do + if [ -f "$artifact" ]; then + TEST_RUN_ID=$(grep '"testRunId"' "$artifact" | cut -d'"' -f4) + PROJECT_ID=$(grep '"projectId"' "$artifact" | cut -d'"' -f4) + cleanup_by_id "$TEST_RUN_ID" "$PROJECT_ID" + rm "$artifact" + fi + done + + echo -e "${GREEN}✅ All artifacts cleaned${NC}" + +elif [ -n "$1" ]; then + # Cleanup specific TEST_RUN_ID + TEST_RUN_ID="$1" + + # Try to find project ID from artifact + if [ -f "$ARTIFACTS_DIR/${TEST_RUN_ID}.json" ]; then + PROJECT_ID=$(grep '"projectId"' "$ARTIFACTS_DIR/${TEST_RUN_ID}.json" | cut -d'"' -f4) + echo -e "${GREEN}Found artifact for $TEST_RUN_ID${NC}" + else + # Fall back to environment or prompt + if [ -z "$PROJECT_ID" ]; then + echo -e "${YELLOW}No artifact found for $TEST_RUN_ID${NC}" + read -p "Enter PROJECT_ID: " PROJECT_ID + fi + fi + + cleanup_by_id "$TEST_RUN_ID" "$PROJECT_ID" + + # Remove artifact if exists + if [ -f "$ARTIFACTS_DIR/${TEST_RUN_ID}.json" ]; then + rm "$ARTIFACTS_DIR/${TEST_RUN_ID}.json" + echo -e "${GREEN}✓ Removed artifact${NC}" + fi + +else + # Default: use current metadata + if [ ! 
-f "$METADATA_FILE" ]; then + echo -e "${YELLOW}No current deployment found in generated/.metadata.json${NC}" + echo "" + echo "Usage:" + echo " $0 # Clean current deployment" + echo " $0 <TEST_RUN_ID> # Clean specific test run" + echo " $0 --pattern <pattern> [project_id] # Clean by pattern" + echo " $0 --list-artifacts # List saved test runs" + echo " $0 --clean-artifacts # Clean all saved test runs" + echo " $0 --save-artifact # Save current deployment for later cleanup" + exit 0 + fi + + # Extract from current metadata + TEST_RUN_ID=$(grep '"testRunId"' "$METADATA_FILE" | cut -d'"' -f4) + PROJECT_ID=$(grep '"projectId"' "$METADATA_FILE" | cut -d'"' -f4) + + cleanup_by_id "$TEST_RUN_ID" "$PROJECT_ID" "$METADATA_FILE" + + # Clean generated files + echo -e "${YELLOW} Cleaning up generated files...${NC}" + /bin/rm -rf "$ROOT_DIR/generated"/* +fi + +echo -e "${GREEN}✅ Cleanup complete!${NC}" \ No newline at end of file diff --git a/integration_test/scripts/config-loader.js b/integration_test/scripts/config-loader.js new file mode 100644 index 0000000..60bde73 --- /dev/null +++ b/integration_test/scripts/config-loader.js @@ -0,0 +1,316 @@ +#!/usr/bin/env node + +/** + * Configuration Loader Module + * Loads and parses the unified YAML configuration for Firebase Functions integration tests + */ + +import { readFileSync, existsSync } from "fs"; +import { parse } from "yaml"; +import { join, dirname } from "path"; +import { fileURLToPath } from "url"; +import Ajv from "ajv"; + +// Get directory paths +const __filename = fileURLToPath(import.meta.url); +const __dirname = dirname(__filename); +const ROOT_DIR = dirname(__dirname); + +// Default configuration path +const DEFAULT_CONFIG_PATH = join(ROOT_DIR, "config", "suites.yaml"); +const SCHEMA_PATH = join(ROOT_DIR, "config", "suites.schema.json"); + +// Initialize AJV validator +let validator = null; + +/** + * Initialize the JSON schema validator + * @returns {Function} AJV validation function + */ +function getValidator() { + if (!validator) { + // Check 
if schema file exists + if (!existsSync(SCHEMA_PATH)) { + throw new Error( + `Schema file not found at: ${SCHEMA_PATH}\n` + + `Please ensure the schema file exists before using validation.` + ); + } + + const ajv = new Ajv({ + allErrors: true, + verbose: true, + strict: false, // Allow additional properties where specified + }); + + try { + // Load and compile the schema + const schemaContent = readFileSync(SCHEMA_PATH, "utf8"); + const schema = JSON.parse(schemaContent); + validator = ajv.compile(schema); + } catch (error) { + throw new Error(`Failed to load schema from ${SCHEMA_PATH}: ${error.message}`); + } + } + return validator; +} + +/** + * Validate configuration against JSON schema + * @param {Object} config - Configuration object to validate + * @throws {Error} If configuration doesn't match schema + */ +export function validateConfig(config) { + const validate = getValidator(); + const valid = validate(config); + + if (!valid) { + // Format validation errors for better readability + const errors = validate.errors.map((err) => { + const path = err.instancePath || "/"; + const message = err.message || "Unknown validation error"; + + // Provide more specific error messages for common issues + if (err.keyword === "required") { + return `Missing required field '${err.params.missingProperty}' at ${path}`; + } else if (err.keyword === "enum") { + return `Invalid value at ${path}: ${message}. 
Allowed values: ${err.params.allowedValues.join( + ", " + )}`; + } else if (err.keyword === "pattern") { + return `Invalid format at ${path}: value doesn't match pattern ${err.params.pattern}`; + } else if (err.keyword === "type") { + return `Type error at ${path}: expected ${err.params.type}, got ${typeof err.data}`; + } else { + return `Validation error at ${path}: ${message}`; + } + }); + + throw new Error( + `Configuration validation failed:\n${errors.map((e) => ` - ${e}`).join("\n")}` + ); + } +} + +/** + * Load and parse the unified configuration file + * @param {string} configPath - Path to the configuration file (optional, defaults to config/suites.yaml) + * @returns {Object} Parsed configuration object with defaults and suites + * @throws {Error} If configuration file is not found or has invalid YAML syntax + */ +export function loadUnifiedConfig(configPath = DEFAULT_CONFIG_PATH) { + // Check if config file exists + if (!existsSync(configPath)) { + throw new Error( + `Configuration file not found at: ${configPath}\n` + + `Please create the unified configuration file or run the migration tool.` + ); + } + + try { + // Read and parse YAML file + const configContent = readFileSync(configPath, "utf8"); + const config = parse(configContent); + + // Validate basic structure + if (!config || typeof config !== "object") { + throw new Error("Invalid configuration: File must contain a valid YAML object"); + } + + if (!config.defaults) { + throw new Error("Invalid configuration: Missing 'defaults' section"); + } + + if (!config.suites || !Array.isArray(config.suites)) { + throw new Error("Invalid configuration: Missing or invalid 'suites' array"); + } + + if (config.suites.length === 0) { + throw new Error("Invalid configuration: No suites defined"); + } + + // Validate configuration against schema + try { + validateConfig(config); + } catch (validationError) { + // Re-throw with context about which file failed + throw new Error(`Schema validation failed for 
${configPath}:\n${validationError.message}`); + } + + return config; + } catch (error) { + // Enhance YAML parsing errors with context + if (error.name === "YAMLParseError" || error.name === "YAMLException") { + const lineInfo = error.linePos ? ` at line ${error.linePos.start.line}` : ""; + throw new Error(`YAML syntax error in configuration file${lineInfo}:\n${error.message}`); + } + + // Re-throw other errors with context + if (!error.message.includes("Invalid configuration:")) { + throw new Error(`Failed to load configuration from ${configPath}: ${error.message}`); + } + + throw error; + } +} + +/** + * List all available suite names from the configuration + * @param {string} configPath - Path to the configuration file (optional) + * @returns {string[]} Array of suite names + */ +export function listAvailableSuites(configPath = DEFAULT_CONFIG_PATH) { + const config = loadUnifiedConfig(configPath); + return config.suites.map((suite) => suite.name); +} + +/** + * Check if the unified configuration file exists + * @param {string} configPath - Path to check (optional) + * @returns {boolean} True if configuration file exists + */ +export function hasUnifiedConfig(configPath = DEFAULT_CONFIG_PATH) { + return existsSync(configPath); +} + +/** + * Get the configuration with defaults and suites + * This is the raw configuration without suite extraction or defaults application + * @param {string} configPath - Path to the configuration file (optional) + * @returns {Object} Configuration object with defaults and suites array + */ +export function getFullConfig(configPath = DEFAULT_CONFIG_PATH) { + return loadUnifiedConfig(configPath); +} + +/** + * Apply defaults to a suite configuration + * @param {Object} suite - Suite configuration object + * @param {Object} defaults - Default configuration values + * @returns {Object} Suite with defaults applied + */ +function applyDefaults(suite, defaults) { + // Deep clone the suite to avoid modifying the original + const mergedSuite 
= JSON.parse(JSON.stringify(suite)); + + // Apply top-level defaults + if (!mergedSuite.projectId && defaults.projectId) { + mergedSuite.projectId = defaults.projectId; + } + + if (!mergedSuite.region && defaults.region) { + mergedSuite.region = defaults.region; + } + + // Merge dependencies (suite overrides take precedence) + if (defaults.dependencies) { + mergedSuite.dependencies = { + ...defaults.dependencies, + ...(mergedSuite.dependencies || {}), + }; + } + + // Merge devDependencies (suite overrides take precedence) + if (defaults.devDependencies) { + mergedSuite.devDependencies = { + ...defaults.devDependencies, + ...(mergedSuite.devDependencies || {}), + }; + } + + // Apply function-level defaults + if (mergedSuite.functions && Array.isArray(mergedSuite.functions)) { + mergedSuite.functions = mergedSuite.functions.map((func) => { + const mergedFunc = { ...func }; + + // Apply timeout default (540 seconds) if not specified + if (mergedFunc.timeout === undefined && defaults.timeout !== undefined) { + mergedFunc.timeout = defaults.timeout; + } + + // Apply collection default (use function name) if not specified + if (!mergedFunc.collection && mergedFunc.name) { + mergedFunc.collection = mergedFunc.name; + } + + return mergedFunc; + }); + } + + return mergedSuite; +} + +/** + * Get a specific suite configuration with defaults applied + * @param {string} suiteName - Name of the suite to extract + * @param {string} configPath - Path to the configuration file (optional) + * @returns {Object} Suite configuration with defaults applied + * @throws {Error} If suite is not found + */ +export function getSuiteConfig(suiteName, configPath = DEFAULT_CONFIG_PATH) { + const config = loadUnifiedConfig(configPath); + + // Find the requested suite + const suite = config.suites.find((s) => s.name === suiteName); + + if (!suite) { + // Provide helpful error with available suites + const availableSuites = config.suites.map((s) => s.name); + const suggestions = availableSuites + 
.filter( + (name) => name.includes(suiteName.split("_")[0]) || name.includes(suiteName.split("_")[1]) + ) + .slice(0, 3); + + let errorMsg = `Suite '${suiteName}' not found in configuration.\n`; + errorMsg += `Available suites: ${availableSuites.join(", ")}\n`; + + if (suggestions.length > 0) { + errorMsg += `Did you mean: ${suggestions.join(", ")}?`; + } + + throw new Error(errorMsg); + } + + // Apply defaults to the suite + return applyDefaults(suite, config.defaults); +} + +/** + * Get multiple suite configurations with defaults applied + * @param {string[]} suiteNames - Array of suite names to extract + * @param {string} configPath - Path to the configuration file (optional) + * @returns {Object[]} Array of suite configurations with defaults applied + * @throws {Error} If any suite is not found + */ +export function getSuiteConfigs(suiteNames, configPath = DEFAULT_CONFIG_PATH) { + return suiteNames.map((name) => getSuiteConfig(name, configPath)); +} + +/** + * Get all suites matching a pattern with defaults applied + * @param {string} pattern - Pattern to match (e.g., "v1_*" for all v1 suites) + * @param {string} configPath - Path to the configuration file (optional) + * @returns {Object[]} Array of matching suite configurations with defaults applied + */ +export function getSuitesByPattern(pattern, configPath = DEFAULT_CONFIG_PATH) { + const config = loadUnifiedConfig(configPath); + + // Convert pattern to regex (e.g., "v1_*" -> /^v1_.*$/) + const regexPattern = pattern.replace(/\*/g, ".*").replace(/\?/g, "."); + const regex = new RegExp(`^${regexPattern}$`); + + // Filter and apply defaults to matching suites + const matchingSuites = config.suites + .filter((suite) => regex.test(suite.name)) + .map((suite) => applyDefaults(suite, config.defaults)); + + if (matchingSuites.length === 0) { + throw new Error(`No suites found matching pattern '${pattern}'`); + } + + return matchingSuites; +} + +// Export default configuration path for use by other modules +export 
const CONFIG_PATH = DEFAULT_CONFIG_PATH; diff --git a/integration_test/scripts/generate.js b/integration_test/scripts/generate.js new file mode 100644 index 0000000..50021fc --- /dev/null +++ b/integration_test/scripts/generate.js @@ -0,0 +1,400 @@ +#!/usr/bin/env node + +/** + * Python Function Generator Script + * Generates Python Firebase Functions from YAML configuration using templates + */ + +import Handlebars from "handlebars"; +import { readFileSync, writeFileSync, mkdirSync, existsSync, copyFileSync } from "fs"; +import { join, dirname } from "path"; +import { fileURLToPath } from "url"; +import { getSuiteConfig, getSuitesByPattern, listAvailableSuites } from "./config-loader.js"; + +const __filename = fileURLToPath(import.meta.url); +const __dirname = dirname(__filename); +const ROOT_DIR = dirname(__dirname); + +// Register Handlebars helpers +Handlebars.registerHelper("eq", (a, b) => a === b); +Handlebars.registerHelper("or", (a, b) => a || b); +Handlebars.registerHelper("unless", function (conditional, options) { + if (!conditional) { + return options.fn(this); + } + return options.inverse(this); +}); + +/** + * Generate Python Firebase Functions from templates + * @param {string[]} suitePatterns - Array of suite names or patterns + * @param {Object} options - Generation options + * @param {string} [options.testRunId] - Test run ID to use + * @param {string} [options.configPath] - Path to config file + * @param {string} [options.projectId] - Override project ID + * @param {string} [options.region] - Override region + * @param {string} [options.sdkPackage] - Path to SDK package (wheel) + * @param {boolean} [options.quiet] - Suppress console output + * @returns {Promise} - Metadata about generated functions + */ +export async function generateFunctions(suitePatterns, options = {}) { + const { + testRunId = `t${Math.random().toString(36).substring(2, 10)}`, + configPath: initialConfigPath = null, + projectId: overrideProjectId = process.env.PROJECT_ID, + 
region: overrideRegion = process.env.REGION, + sdkPackage = process.env.SDK_PACKAGE || "file:firebase-functions-python-local.whl", + quiet = false + } = options; + + const log = quiet ? () => {} : console.log.bind(console); + const error = quiet ? () => {} : console.error.bind(console); + + log(`🚀 Generating Python functions: ${suitePatterns.join(", ")}`); + log(` TEST_RUN_ID: ${testRunId}`); + + // Load suite configurations + const suites = []; + let projectId, region; + let configPath = initialConfigPath; + + for (const pattern of suitePatterns) { + try { + let suitesToAdd = []; + + // Check if it's a pattern (contains * or ?) + if (pattern.includes("*") || pattern.includes("?")) { + // Use unified config file (Python only supports 2nd gen) + if (!configPath) { + configPath = join(ROOT_DIR, "config", "suites.yaml"); + } + suitesToAdd = getSuitesByPattern(pattern, configPath); + } else { + // Single suite name + if (!configPath) { + // Use unified config file (Python only supports 2nd gen) + configPath = join(ROOT_DIR, "config", "suites.yaml"); + } + suitesToAdd = [getSuiteConfig(pattern, configPath)]; + } + + // Add suites and extract project/region from first suite + for (const suite of suitesToAdd) { + if (!projectId) { + projectId = suite.projectId || overrideProjectId || "demo-test"; + region = suite.region || overrideRegion || "us-central1"; + } + suites.push(suite); + } + + // Reset configPath for next pattern (allows mixing v1 and v2) + if (!initialConfigPath) { + configPath = null; + } + } catch (err) { + error(`❌ Error loading suite(s) '${pattern}': ${err.message}`); + throw err; + } + } + + if (suites.length === 0) { + throw new Error("No suites found to generate"); + } + + log(` PROJECT_ID: ${projectId}`); + log(` REGION: ${region}`); + log(` Loaded ${suites.length} suite(s)`); + + // Helper function to generate from template + function generateFromTemplate(templatePath, outputPath, context) { + const fullTemplatePath = join(ROOT_DIR, "templates", 
templatePath); + + if (!existsSync(fullTemplatePath)) { + error(`❌ Template not found: ${fullTemplatePath}`); + return false; + } + + const templateContent = readFileSync(fullTemplatePath, "utf8"); + const template = Handlebars.compile(templateContent); + const output = template(context); + + const outputFullPath = join(ROOT_DIR, "generated", outputPath); + mkdirSync(dirname(outputFullPath), { recursive: true }); + writeFileSync(outputFullPath, output); + log(` ✅ Generated: ${outputPath}`); + return true; + } + + // Template mapping for service types and versions + const templateMap = { + firestore: { + v1: "functions/src/v1/firestore_tests.py.hbs", + v2: "functions/src/v2/firestore_tests.py.hbs", + }, + database: { + v1: "functions/src/v1/database_tests.py.hbs", + v2: "functions/src/v2/database_tests.py.hbs", + }, + pubsub: { + v1: "functions/src/v1/pubsub_tests.py.hbs", + v2: "functions/src/v2/pubsub_tests.py.hbs", + }, + storage: { + v1: "functions/src/v1/storage_tests.py.hbs", + v2: "functions/src/v2/storage_tests.py.hbs", + }, + auth: { + v1: "functions/src/v1/auth_tests.py.hbs", + v2: "functions/src/v2/auth_tests.py.hbs", + }, + tasks: { + v1: "functions/src/v1/tasks_tests.py.hbs", + v2: "functions/src/v2/tasks_tests.py.hbs", + }, + remoteconfig: { + v1: "functions/src/v1/remoteconfig_tests.py.hbs", + v2: "functions/src/v2/remoteconfig_tests.py.hbs", + }, + testlab: { + v1: "functions/src/v1/testlab_tests.py.hbs", + v2: "functions/src/v2/testlab_tests.py.hbs", + }, + scheduler: { + v2: "functions/src/v2/scheduler_tests.py.hbs", + }, + identity: { + v2: "functions/src/v2/identity_tests.py.hbs", + }, + eventarc: { + v2: "functions/src/v2/eventarc_tests.py.hbs", + }, + alerts: { + v2: "functions/src/v2/alerts_tests.py.hbs", + }, + }; + + log(`\n📁 Generating Python functions...`); + + // Collect all dependencies from all suites + const allDependencies = {}; + const allDevDependencies = {}; + + // Generate test files for each suite + const generatedSuites = []; + 
for (const suite of suites) { + const { name, service, version } = suite; + + // Select the appropriate template + const templatePath = templateMap[service]?.[version]; + if (!templatePath) { + error(`❌ No template found for service '${service}' version '${version}'`); + error(`Available templates:`); + Object.entries(templateMap).forEach(([svc, versions]) => { + Object.keys(versions).forEach((ver) => { + error(` - ${svc} ${ver}`); + }); + }); + continue; // Skip this suite but continue with others + } + + log(` 📋 ${name}: Using service: ${service}, version: ${version}`); + + // Create context for this suite's template + const context = { + ...suite, + testRunId, + sdkPackage, + timestamp: new Date().toISOString(), + projectId: "functions-integration-tests-v2", + }; + + // Generate the test file for this suite + const outputPath = `functions/${version}/${service}_tests.py`; + + if (generateFromTemplate(templatePath, outputPath, context)) { + // Collect dependencies + Object.assign(allDependencies, suite.dependencies || {}); + Object.assign(allDevDependencies, suite.devDependencies || {}); + + // Track generated suite info for main file + generatedSuites.push({ + name, + service, + version, + projectId: suite.projectId, + region: suite.region, + // Join with "_" so metadata names match the cleanup script's + // "<name>_<TEST_RUN_ID>" lookup + functions: suite.functions.map((f) => `${f.name}_${testRunId}`), + }); + } + } + + if (generatedSuites.length === 0) { + throw new Error("No functions were generated"); + } + + // Generate shared files (only once) + const sharedContext = { + projectId, + region, + testRunId, + sdkPackage, + timestamp: new Date().toISOString(), + dependencies: allDependencies, + devDependencies: allDevDependencies, + }; + + // Create __init__ files for packages + mkdirSync(join(ROOT_DIR, "generated", "functions", "v1"), { recursive: true }); + mkdirSync(join(ROOT_DIR, "generated", "functions", "v2"), { recursive: true }); + + writeFileSync(join(ROOT_DIR, "generated", "functions", "__init__.py"), ""); + writeFileSync(join(ROOT_DIR, "generated", 
"functions", "v1", "__init__.py"), ""); + writeFileSync(join(ROOT_DIR, "generated", "functions", "v2", "__init__.py"), ""); + + // Generate utils.py + generateFromTemplate("functions/src/utils.py.hbs", "functions/utils.py", sharedContext); + + // Generate main.py with all suites + const mainContext = { + projectId, + testRunId, + suites: generatedSuites.map((s) => ({ + name: s.name, + service: s.service, + version: s.version, + })), + }; + + generateFromTemplate("functions/src/main.py.hbs", "functions/main.py", mainContext); + + // Generate requirements.txt + const requirementsContext = { + ...sharedContext, + sdkPackage: sdkPackage.startsWith("file:") + ? sdkPackage.replace("file:", "./") + : sdkPackage, + dependencies: [ + // Add any additional Python dependencies here + ] + }; + + generateFromTemplate("functions/requirements.txt.hbs", "functions/requirements.txt", requirementsContext); + + // Generate firebase.json + generateFromTemplate("functions/firebase.json.hbs", "firebase.json", sharedContext); + + // Write metadata for cleanup and reference + const metadata = { + projectId, + region, + testRunId, + language: "python", + generatedAt: new Date().toISOString(), + suites: generatedSuites, + }; + + writeFileSync(join(ROOT_DIR, "generated", ".metadata.json"), JSON.stringify(metadata, null, 2)); + + // Copy the SDK package into the functions directory if using local SDK + if (sdkPackage.startsWith("file:")) { + const packageFileName = sdkPackage.replace("file:", ""); + const packageSourcePath = join(ROOT_DIR, packageFileName); + const packageDestPath = join(ROOT_DIR, "generated", "functions", packageFileName); + + if (existsSync(packageSourcePath)) { + copyFileSync(packageSourcePath, packageDestPath); + log(` ✅ Copied SDK package to functions directory`); + } else { + error(` ⚠️ Warning: SDK package not found at ${packageSourcePath}`); + error(` Run './scripts/pack-for-integration-tests.sh' from the root directory first`); + } + } + + log("\n✨ Generation 
complete!"); + log( + ` Generated ${generatedSuites.length} suite(s) with ${generatedSuites.reduce( + (acc, s) => acc + s.functions.length, + 0 + )} function(s)` + ); + + log("\nNext steps:"); + log(" 1. cd generated/functions"); + log(" 2. python -m venv venv && source venv/bin/activate"); + log(" 3. pip install -r requirements.txt"); + log(` 4. firebase deploy --project ${projectId}`); + + return metadata; +} + +// CLI interface when run directly +if (import.meta.url === `file://${process.argv[1]}`) { + const args = process.argv.slice(2); + + // Handle help + if (args.length === 0 || args.includes("--help") || args.includes("-h")) { + console.log("Usage: node generate.js [options]"); + console.log("\nExamples:"); + console.log(" node generate.js v2_firestore # Single suite"); + console.log(" node generate.js v2_firestore v2_database # Multiple suites"); + console.log(" node generate.js 'v2_*' # All v2 suites (pattern)"); + console.log(" node generate.js --list # List available suites"); + console.log(" node generate.js --config config/suites.yaml v2_firestore"); + console.log("\nOptions:"); + console.log(" --config Path to configuration file (default: auto-detect)"); + console.log(" --list List all available suites"); + console.log(" --help, -h Show this help message"); + console.log("\nEnvironment variables:"); + console.log(" TEST_RUN_ID Override test run ID (default: auto-generated)"); + console.log(" PROJECT_ID Override project ID from config"); + console.log(" REGION Override region from config"); + console.log(" SDK_PACKAGE Path to Firebase Functions SDK package (wheel)"); + process.exit(0); + } + + // Handle --list option + if (args.includes("--list")) { + const configPath = join(ROOT_DIR, "config", "suites.yaml"); + + console.log("\nAvailable test suites:"); + + if (existsSync(configPath)) { + console.log("\n📁 All Suites (config/suites.yaml):"); + const suites = listAvailableSuites(configPath); + suites.forEach((suite) => console.log(` - ${suite}`)); + } 
else { + console.error("❌ Config file not found:", configPath); + } + + process.exit(0); + } + + // Parse options + const options = { + testRunId: process.env.TEST_RUN_ID, + projectId: process.env.PROJECT_ID, + region: process.env.REGION, + sdkPackage: process.env.SDK_PACKAGE + }; + + // Extract config path if provided + const configIndex = args.indexOf("--config"); + if (configIndex !== -1 && configIndex < args.length - 1) { + options.configPath = args[configIndex + 1]; + args.splice(configIndex, 2); + } + + // Remaining args are suite patterns + const suitePatterns = args.filter((arg) => !arg.startsWith("--")); + + if (suitePatterns.length === 0) { + console.error("Error: No suite names provided"); + console.log("Run with --help for usage information"); + process.exit(1); + } + + generateFunctions(suitePatterns, options).catch((err) => { + console.error("Generation failed:", err.message); + process.exit(1); + }); +} \ No newline at end of file diff --git a/integration_test/scripts/run-tests.js b/integration_test/scripts/run-tests.js new file mode 100644 index 0000000..bf577ac --- /dev/null +++ b/integration_test/scripts/run-tests.js @@ -0,0 +1,1338 @@ +#!/usr/bin/env node + +/** + * Unified Test Runner for Firebase Functions Integration Tests + * Combines functionality from run-suite.sh and run-sequential.sh into a single JavaScript runner + */ + +import { spawn } from "child_process"; +import { existsSync, readFileSync, writeFileSync, mkdirSync, rmSync, renameSync } from "fs"; +import { join, dirname } from "path"; +import { fileURLToPath } from "url"; +import chalk from "chalk"; +import { getSuitesByPattern, listAvailableSuites } from "./config-loader.js"; +import { generateFunctions } from "./generate.js"; + +// Get directory paths +const __filename = fileURLToPath(import.meta.url); +const __dirname = dirname(__filename); +const ROOT_DIR = dirname(__dirname); + +// Configuration paths +const V1_CONFIG_PATH = join(ROOT_DIR, "config", "v1", "suites.yaml"); 
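The retry logic defined below (`calculateBackoffDelay` / `retryWithBackoff`) spaces deployment attempts using exponential backoff with ±25% jitter and a one-second floor. A self-contained sketch of that delay math, using an illustrative function name and the same defaults as `DEFAULT_BASE_DELAY` (30s) and `DEFAULT_MAX_DELAY` (120s):

```javascript
// Exponential backoff with ±25% jitter, mirroring the runner's delay math.
// backoffDelay is an illustrative name, not part of the runner's API.
function backoffDelay(attempt, baseDelay = 30000, maxDelay = 120000) {
  // baseDelay * 2^(attempt - 1), capped at maxDelay
  let delay = Math.min(baseDelay * Math.pow(2, attempt - 1), maxDelay);
  // Spread concurrent retries out with ±25% random jitter
  const jitter = delay / 4;
  delay += Math.random() * jitter * 2 - jitter;
  // Never wait less than one second
  return Math.round(Math.max(delay, 1000));
}

// Attempt 1 with a 30s base lands in 22.5s-37.5s; attempt 3 hits the 120s cap.
```

Note that jitter is applied after the cap, so a wait can exceed `maxDelay` by up to 25%, which matches the runner's behavior.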
+const V2_CONFIG_PATH = join(ROOT_DIR, "config", "v2", "suites.yaml"); +const ARTIFACTS_DIR = join(ROOT_DIR, ".test-artifacts"); +const LOGS_DIR = join(ROOT_DIR, "logs"); +const GENERATED_DIR = join(ROOT_DIR, "generated"); +const SA_JSON_PATH = join(ROOT_DIR, "sa.json"); + +// Default configurations +const DEFAULT_REGION = "us-central1"; +const MAX_DEPLOY_ATTEMPTS = 5; +const DEFAULT_BASE_DELAY = 30000; // Base delay in ms (30 seconds) +const DEFAULT_MAX_DELAY = 120000; // Max delay in ms (120 seconds/2 minutes) + +class TestRunner { + constructor(options = {}) { + this.testRunId = options.testRunId || this.generateTestRunId(); + this.sequential = options.sequential || false; + this.saveArtifact = options.saveArtifact || false; + this.skipCleanup = options.skipCleanup || false; + this.filter = options.filter || ""; + this.exclude = options.exclude || ""; + this.usePublishedSDK = options.usePublishedSDK || null; + this.verbose = options.verbose || false; + this.cleanupOrphaned = options.cleanupOrphaned || false; + this.timestamp = new Date().toISOString().replace(/[:.]/g, "-").substring(0, 19); + this.logFile = join(LOGS_DIR, `test-run-${this.timestamp}.log`); + this.deploymentSuccess = false; + this.results = { passed: [], failed: [] }; + } + + /** + * Generate a unique test run ID + */ + generateTestRunId() { + const chars = "abcdefghijklmnopqrstuvwxyz0123456789"; + let id = "t"; + for (let i = 0; i < 8; i++) { + id += chars.charAt(Math.floor(Math.random() * chars.length)); + } + return id; + } + + /** + * Calculate exponential backoff delay with jitter + * Based on util.sh exponential_backoff function + */ + calculateBackoffDelay(attempt, baseDelay = DEFAULT_BASE_DELAY, maxDelay = DEFAULT_MAX_DELAY) { + // Calculate delay: baseDelay * 2^(attempt-1) + let delay = baseDelay * Math.pow(2, attempt - 1); + + // Cap at maxDelay + if (delay > maxDelay) { + delay = maxDelay; + } + + // Add jitter (±25% random variation) + const jitter = delay / 4; + const randomJitter = 
Math.random() * jitter * 2 - jitter; + delay = delay + randomJitter; + + // Ensure minimum delay of 1 second + if (delay < 1000) { + delay = 1000; + } + + return Math.round(delay); + } + + /** + * Retry function with exponential backoff + * Based on util.sh retry_with_backoff function + */ + async retryWithBackoff( + operation, + maxAttempts = MAX_DEPLOY_ATTEMPTS, + baseDelay = DEFAULT_BASE_DELAY, + maxDelay = DEFAULT_MAX_DELAY + ) { + let attempt = 1; + + while (attempt <= maxAttempts) { + this.log(`🔄 Attempt ${attempt} of ${maxAttempts}`, "warn"); + + try { + const result = await operation(); + this.log("✅ Operation succeeded", "success"); + return result; + } catch (error) { + if (attempt < maxAttempts) { + const delay = this.calculateBackoffDelay(attempt, baseDelay, maxDelay); + this.log( + `⚠️ Operation failed. Retrying in ${Math.round(delay / 1000)} seconds...`, + "warn" + ); + await new Promise((resolve) => setTimeout(resolve, delay)); + } else { + this.log(`❌ Operation failed after ${maxAttempts} attempts`, "error"); + throw error; + } + } + + attempt++; + } + } + + /** + * Log message to console and file + */ + log(message, level = "info") { + const timestamp = new Date().toISOString(); + const logEntry = `[${timestamp}] ${message}`; + + // Ensure logs directory exists + if (!existsSync(LOGS_DIR)) { + mkdirSync(LOGS_DIR, { recursive: true }); + } + + // Write to log file + try { + writeFileSync(this.logFile, logEntry + "\n", { flag: "a" }); + } catch (e) { + // Ignore file write errors + } + + // Console output with colors + switch (level) { + case "error": + console.log(chalk.red(message)); + break; + case "warn": + console.log(chalk.yellow(message)); + break; + case "success": + console.log(chalk.green(message)); + break; + case "info": + console.log(chalk.blue(message)); + break; + default: + console.log(message); + } + } + + /** + * Execute a shell command + */ + async exec(command, options = {}) { + return new Promise((resolve, reject) => { + const 
child = spawn(command, { + shell: true, + cwd: options.cwd || ROOT_DIR, + env: { ...process.env, ...options.env }, + stdio: options.silent ? "pipe" : ["inherit", "pipe", "pipe"], + }); + + let stdout = ""; + let stderr = ""; + + // Always capture output for error reporting, even when not silent + child.stdout.on("data", (data) => { + const output = data.toString(); + stdout += output; + if (!options.silent) { + process.stdout.write(output); + } + }); + + child.stderr.on("data", (data) => { + const output = data.toString(); + stderr += output; + if (!options.silent) { + process.stderr.write(output); + } + }); + + child.on("exit", (code) => { + if (code === 0) { + resolve({ stdout, stderr, code }); + } else { + // Include both stdout and stderr in error message for better debugging + const errorOutput = stderr || stdout || "No output captured"; + reject(new Error(`Command failed with code ${code}: ${command}\n${errorOutput}`)); + } + }); + + child.on("error", (error) => { + reject(error); + }); + }); + } + + /** + * Get all available suites from configuration + */ + getAllSuites() { + const suites = []; + + // Get V1 suites + if (existsSync(V1_CONFIG_PATH)) { + try { + const v1Suites = listAvailableSuites(V1_CONFIG_PATH); + suites.push(...v1Suites); + } catch (e) { + this.log(`Warning: Could not load V1 suites: ${e.message}`, "warn"); + } + } + + // Get V2 suites + if (existsSync(V2_CONFIG_PATH)) { + try { + const v2Suites = listAvailableSuites(V2_CONFIG_PATH); + suites.push(...v2Suites); + } catch (e) { + this.log(`Warning: Could not load V2 suites: ${e.message}`, "warn"); + } + } + + return suites; + } + + /** + * Filter suites based on patterns and exclusions + */ + filterSuites(suitePatterns) { + let suites = []; + + // If patterns include wildcards, get matching suites + for (const pattern of suitePatterns) { + if (pattern.includes("*") || pattern.includes("?")) { + // Check both v1 and v2 configs + if (existsSync(V1_CONFIG_PATH)) { + const v1Matches = 
getSuitesByPattern(pattern, V1_CONFIG_PATH); + suites.push(...v1Matches.map((s) => s.name)); + } + if (existsSync(V2_CONFIG_PATH)) { + const v2Matches = getSuitesByPattern(pattern, V2_CONFIG_PATH); + suites.push(...v2Matches.map((s) => s.name)); + } + } else { + // Direct suite name + suites.push(pattern); + } + } + + // Remove duplicates + suites = [...new Set(suites)]; + + // Apply filter pattern if specified + if (this.filter) { + suites = suites.filter((suite) => suite.includes(this.filter)); + } + + // Apply exclusions + if (this.exclude) { + // Use simple string matching instead of regex to avoid injection + suites = suites.filter((suite) => !suite.includes(this.exclude)); + } + + return suites; + } + + /** + * Generate functions from templates + */ + async generateFunctions(suiteNames) { + this.log("📦 Generating functions...", "info"); + + // Use the packed SDK wheel (provided via --use-published-sdk or pre-packed) + let sdkTarball; + if (this.usePublishedSDK) { + sdkTarball = this.usePublishedSDK; + this.log(`  Using provided SDK: ${sdkTarball}`, "info"); + } else { + // Default to the local wheel (pre-packed by Cloud Build or scripts/pack-for-integration-tests.sh) + sdkTarball = `file:firebase-functions-python-local.whl`; + this.log(`  Using local SDK: firebase-functions-python-local.whl`, "info"); + } + + try { + // Call the generate function directly instead of spawning subprocess + const metadata = await generateFunctions(suiteNames, { + testRunId: this.testRunId, + sdkPackage: sdkTarball, // generate.js takes the SDK path as its sdkPackage option + quiet: true, // Suppress console output since we have our own logging + }); + + // Store project info + this.projectId = metadata.projectId; + this.region = metadata.region || DEFAULT_REGION; + + this.log( + `✓ Generated ${suiteNames.length} suite(s) for project: ${this.projectId}`, + "success" + ); + + // Debug: Verify wheel was copied to generated/functions + if (sdkTarball.startsWith("file:")) { + const wheelFileName = 
sdkTarball.replace("file:", ""); + const wheelPath = join(GENERATED_DIR, "functions", wheelFileName); + if (existsSync(wheelPath)) { + this.log(`✓ SDK wheel exists at: ${wheelPath}`, "success"); + } else { + this.log(`⚠️ WARNING: SDK wheel not found at: ${wheelPath}`, "warn"); + this.log(` This will cause deployment to fail!`, "warn"); + } + } + + // Save artifact if requested + if (this.saveArtifact) { + this.saveTestArtifact(metadata); + } + + return metadata; + } catch (error) { + throw new Error(`Failed to generate functions: ${error.message}`); + } + } + + /** + * Build generated functions + */ + async buildFunctions() { + this.log("🔨 Building functions...", "info"); + + const functionsDir = join(GENERATED_DIR, "functions"); + + // Python build: create venv and install dependencies + await this.exec("python3.11 -m venv venv", { cwd: functionsDir }); + await this.exec(". venv/bin/activate && pip install --upgrade pip && pip install -r requirements.txt", { cwd: functionsDir }); + + this.log("✓ Functions built successfully", "success"); + } + + /** + * Deploy functions to Firebase with retry logic + */ + async deployFunctions() { + this.log("☁️ Deploying to Firebase...", "info"); + + try { + await this.retryWithBackoff(async () => { + const result = await this.exec( + `firebase deploy --only functions --project ${this.projectId} --force`, + { cwd: GENERATED_DIR, silent: !this.verbose } + ); + + // Check for successful deployment or acceptable warnings + const output = result.stdout + result.stderr; + if ( + output.includes("Deploy complete!") || + output.includes("Functions successfully deployed but could not set up cleanup policy") + ) { + this.deploymentSuccess = true; + this.log("✅ Deployment succeeded", "success"); + return result; + } else { + // Log output for debugging if deployment didn't match expected success patterns + this.log("⚠️ Deployment output did not match success patterns", "warn"); + this.log(`Stdout: ${result.stdout.substring(0, 500)}...`, 
"warn"); + this.log(`Stderr: ${result.stderr.substring(0, 500)}...`, "warn"); + throw new Error("Deployment output did not match success patterns"); + } + }); + } catch (error) { + // Enhanced error logging with full details + this.log(`❌ Deployment error: ${error.message}`, "error"); + + // Try to extract more details from the error + if (error.message.includes("Command failed with code 1")) { + this.log("🔍 Full deployment command output:", "error"); + + // Extract the actual Firebase CLI error from the error message + const errorLines = error.message.split("\n"); + const firebaseError = errorLines.slice(1).join("\n").trim(); // Skip the first line which is our generic message + + if (firebaseError) { + this.log(" Actual Firebase CLI error:", "error"); + this.log(` ${firebaseError}`, "error"); + } else { + this.log(" No detailed error output captured", "error"); + } + + this.log(" Common causes:", "error"); + this.log(" - Authentication issues (run: firebase login)", "error"); + this.log(" - Project permissions (check project access)", "error"); + this.log(" - Function code errors (check generated code)", "error"); + this.log(" - Resource limits (too many functions)", "error"); + this.log(" - Network issues", "error"); + } + + // On final failure, provide more detailed error information + this.log("🔍 Final deployment attempt failed. 
Debugging information:", "error"); + this.log(` Project: ${this.projectId}`, "error"); + this.log(` Region: ${this.region}`, "error"); + this.log(` Generated directory: ${GENERATED_DIR}`, "error"); + this.log(" Try running manually:", "error"); + this.log( + ` cd ${GENERATED_DIR} && firebase deploy --only functions --project ${this.projectId}`, + "error" + ); + throw new Error(`Deployment failed after ${MAX_DEPLOY_ATTEMPTS} attempts: ${error.message}`); + } + } + + /** + * Map suite name to test file path + */ + getTestFile(suiteName) { + const service = suiteName.split("_").slice(1).join("_"); + const version = suiteName.split("_")[0]; + + // Special cases + if (suiteName === "v2_alerts") { + return null; // Deployment only, no tests + } + + // Map service names to test files + const serviceMap = { + firestore: `tests/${version}/firestore.test.ts`, + database: `tests/${version}/database.test.ts`, + pubsub: `tests/${version}/pubsub.test.ts`, + storage: `tests/${version}/storage.test.ts`, + tasks: `tests/${version}/tasks.test.ts`, + remoteconfig: + version === "v1" ? "tests/v1/remoteconfig.test.ts" : "tests/v2/remoteConfig.test.ts", + testlab: version === "v1" ? "tests/v1/testlab.test.ts" : "tests/v2/testLab.test.ts", + scheduler: "tests/v2/scheduler.test.ts", + identity: "tests/v2/identity.test.ts", + eventarc: "tests/v2/eventarc.test.ts", + }; + + return serviceMap[service] || null; + } + + /** + * Run tests for deployed functions + */ + async runTests(suiteNames) { + this.log("🧪 Running tests...", "info"); + + // Check for service account + if (!existsSync(SA_JSON_PATH)) { + this.log( + "⚠️ Warning: sa.json not found. 
Tests may fail without proper authentication.", + "warn" + ); + } + + // Collect test files for all suites + const testFiles = []; + const seenFiles = new Set(); + let deployedFunctions = []; + + for (const suiteName of suiteNames) { + // Track deployed identity/auth functions + if (suiteName === "v2_identity") { + deployedFunctions.push("beforeUserCreated", "beforeUserSignedIn"); + } + + const testFile = this.getTestFile(suiteName); + if (testFile && !seenFiles.has(testFile)) { + const fullPath = join(ROOT_DIR, testFile); + if (existsSync(fullPath)) { + testFiles.push(testFile); + seenFiles.add(testFile); + } + } + } + + if (testFiles.length === 0) { + this.log("⚠️ No test files found for the generated suites.", "warn"); + this.log(" Skipping test execution (deployment-only suites).", "success"); + return; + } + + // Run Jest tests + const env = { + TEST_RUN_ID: this.testRunId, + PROJECT_ID: this.projectId, + REGION: this.region, + DEPLOYED_FUNCTIONS: deployedFunctions.join(","), + ...process.env, + }; + + if (existsSync(SA_JSON_PATH)) { + env.GOOGLE_APPLICATION_CREDENTIALS = SA_JSON_PATH; + } + + this.log(`Running tests: ${testFiles.join(", ")}`, "info"); + this.log(`TEST_RUN_ID: ${this.testRunId}`, "info"); + + await this.exec(`npm test -- ${testFiles.join(" ")}`, { env }); + } + + /** + * Clean up deployed functions and test data + */ + async cleanup() { + this.log("🧹 Running cleanup...", "warn"); + + const metadataPath = join(GENERATED_DIR, ".metadata.json"); + if (!existsSync(metadataPath)) { + this.log(" No metadata found, skipping cleanup", "warn"); + return; + } + + const metadata = JSON.parse(readFileSync(metadataPath, "utf8")); + + // Only delete functions if deployment was successful + if (this.deploymentSuccess) { + await this.cleanupFunctions(metadata); + } + + // Clean up test data + await this.cleanupTestData(metadata); + + // Clean up generated files + this.log(" Cleaning up generated files...", "warn"); + if (existsSync(GENERATED_DIR)) { + 
rmSync(GENERATED_DIR, { recursive: true, force: true }); + mkdirSync(GENERATED_DIR, { recursive: true }); + } + } + + /** + * Delete deployed functions + */ + async cleanupFunctions(metadata) { + this.log(" Deleting deployed functions...", "warn"); + + // Extract function names from metadata + const functions = []; + for (const suite of metadata.suites || []) { + for (const func of suite.functions || []) { + functions.push(func); + } + } + + for (const functionName of functions) { + let deleted = false; + + // Try Firebase CLI first + try { + await this.exec( + `firebase functions:delete ${functionName} --project ${metadata.projectId} --region ${ + metadata.region || DEFAULT_REGION + } --force` + ); + + // Verify the function was actually deleted + this.log(` Verifying deletion of ${functionName}...`, "info"); + try { + const listResult = await this.exec( + `firebase functions:list --project ${metadata.projectId}`, + { silent: true } + ); + + // Check if function still exists in the list + const functionStillExists = listResult.stdout.includes(functionName); + + if (!functionStillExists) { + this.log(` ✅ Verified: Function deleted via Firebase CLI: ${functionName}`, "success"); + deleted = true; + } else { + this.log(` ⚠️ Function still exists after Firebase CLI delete: ${functionName}`, "warn"); + } + } catch (listError) { + // If we can't list functions, assume deletion worked + this.log(` ✅ Deleted function via Firebase CLI (unverified): ${functionName}`, "success"); + deleted = true; + } + } catch (error) { + this.log(` ⚠️ Firebase CLI delete failed for ${functionName}: ${error.message}`, "warn"); + } + + // If not deleted yet, try alternative methods + if (!deleted) { + // For v2 functions, try to delete the Cloud Run service directly + if (metadata.projectId === "functions-integration-tests-v2") { + this.log(` Attempting Cloud Run service deletion for v2 function...`, "warn"); + try { + await this.exec( + `gcloud run services delete ${functionName} 
--region=${ + metadata.region || DEFAULT_REGION + } --project=${metadata.projectId} --quiet`, + { silent: true } + ); + + // Verify deletion + try { + await this.exec( + `gcloud run services describe ${functionName} --region=${ + metadata.region || DEFAULT_REGION + } --project=${metadata.projectId}`, + { silent: true } + ); + // If describe succeeds, function still exists + this.log(` ⚠️ Cloud Run service still exists after deletion: ${functionName}`, "warn"); + } catch { + // If describe fails, function was deleted + this.log(` ✅ Deleted function as Cloud Run service: ${functionName}`, "success"); + deleted = true; + } + } catch (runError) { + this.log(` ⚠️ Cloud Run delete failed: ${runError.message}`, "warn"); + } + } + + // If still not deleted, try gcloud functions delete as last resort + if (!deleted) { + try { + await this.exec( + `gcloud functions delete ${functionName} --region=${ + metadata.region || DEFAULT_REGION + } --project=${metadata.projectId} --quiet`, + { silent: true } + ); + + // Verify deletion + try { + await this.exec( + `gcloud functions describe ${functionName} --region=${ + metadata.region || DEFAULT_REGION + } --project=${metadata.projectId}`, + { silent: true } + ); + // If describe succeeds, function still exists + this.log(` ⚠️ Function still exists after gcloud delete: ${functionName}`, "warn"); + deleted = false; + } catch { + // If describe fails, function was deleted + this.log(` ✅ Deleted function via gcloud: ${functionName}`, "success"); + deleted = true; + } + } catch (e) { + this.log(` ❌ Failed to delete function ${functionName} via any method`, "error"); + this.log(` Last error: ${e.message}`, "error"); + } + } + } + } + } + + /** + * Clean up test data from Firestore + */ + async cleanupTestData(metadata) { + this.log(" Cleaning up Firestore test data...", "warn"); + + // Extract collection names from metadata + const collections = new Set(); + + for (const suite of metadata.suites || []) { + for (const func of 
suite.functions || []) { + if (func.collection) { + collections.add(func.collection); + } + // Also add function name without TEST_RUN_ID as collection + const baseName = func.name ? func.name.replace(this.testRunId, "") : null; + if (baseName && baseName.includes("Tests")) { + collections.add(baseName); + } + } + } + + // Clean up each collection + for (const collection of collections) { + try { + await this.exec( + `firebase firestore:delete ${collection}/${this.testRunId} --project ${metadata.projectId} --yes`, + { silent: true } + ); + } catch (e) { + // Ignore cleanup errors + } + } + + // Clean up auth users if auth tests were run + if ((metadata.suites || []).some((s) => s.name.includes("auth") || s.name.includes("identity"))) { + this.log(" Cleaning up auth test users...", "warn"); + try { + await this.exec(`node ${join(__dirname, "cleanup-auth-users.cjs")} ${this.testRunId}`, { + silent: true, + }); + } catch (e) { + // Ignore cleanup errors + } + } + + // Clean up Cloud Tasks queues if tasks tests were run + if ((metadata.suites || []).some((s) => s.name.includes("tasks"))) { + await this.cleanupCloudTasksQueues(metadata); + } + } + + /** + * Clean up Cloud Tasks queues created by tests + */ + async cleanupCloudTasksQueues(metadata) { + this.log(" Cleaning up Cloud Tasks queues...", "warn"); + + const region = metadata.region || DEFAULT_REGION; + const projectId = metadata.projectId; + + // Extract queue names from metadata (function names become queue names in v1) + const queueNames = new Set(); + for (const suite of metadata.suites || []) { + if (suite.name.includes("tasks")) { + for (const func of suite.functions || []) { + if (func.name && func.name.includes("Tests")) { + // Function name becomes the queue name in v1 + queueNames.add(func.name); + } + } + } + } + + // Delete each queue + for (const queueName of queueNames) { + try { + this.log(` Deleting Cloud Tasks queue: ${queueName}`, "warn"); + + // Try gcloud command to delete the queue + await this.exec( + 
`gcloud tasks queues delete ${queueName} --location=${region} --project=${projectId} --quiet`, + { silent: true } + ); + this.log(` ✅ Deleted Cloud Tasks queue: ${queueName}`); + } catch (error) { + // Queue might not exist or already deleted, ignore errors + this.log(` ⚠️ Could not delete queue ${queueName}: ${error.message}`, "warn"); + } + } + } + + /** + * Save test artifact for future cleanup + */ + saveTestArtifact(metadata) { + if (!existsSync(ARTIFACTS_DIR)) { + mkdirSync(ARTIFACTS_DIR, { recursive: true }); + } + + const artifactPath = join(ARTIFACTS_DIR, `${this.testRunId}.json`); + writeFileSync(artifactPath, JSON.stringify(metadata, null, 2)); + this.log(`✓ Saved artifact for future cleanup: ${this.testRunId}.json`, "success"); + } + + /** + * Get project IDs from configuration (YAML files are source of truth) + */ + getProjectId() { + // Python SDK only supports 2nd gen functions - all tests use functions-integration-tests-v2 + const projectId = "functions-integration-tests-v2"; + + this.log(`Using Project ID: ${projectId}`, "info"); + + return projectId; + } + + /** + * Clean up existing test resources before running + */ + async cleanupExistingResources(suiteNames = []) { + this.log("🧹 Checking for existing test functions...", "warn"); + + const projectId = this.getProjectId(); + + // Python SDK only uses one project - clean it + this.log(` Checking project: ${projectId}`, "warn"); + + try { + // List functions and find test functions + const result = await this.exec(`firebase functions:list --project ${projectId}`, { + silent: true, + }); + + // Parse the table output from firebase functions:list + const lines = result.stdout.split("\n"); + const testFunctions = []; + + for (const line of lines) { + // Look for table rows with function names (containing │) + // Skip header rows and empty lines + if (line.includes("│") && !line.includes("Function") && !line.includes("────")) { + const parts = line.split("│"); + if (parts.length >= 2) { + const 
functionName = parts[1].trim(); + // Add ALL functions for cleanup (not just test functions) + // This ensures a clean slate for testing + if (functionName && functionName.length > 0) { + testFunctions.push(functionName); + } + } + } + } + + if (testFunctions.length > 0) { + this.log( + ` Found ${testFunctions.length} function(s) in ${projectId}. Cleaning up ALL functions...`, + "warn" + ); + + for (const func of testFunctions) { + try { + // Function names from firebase functions:list are just the name, no region suffix + const functionName = func.trim(); + const region = DEFAULT_REGION; + let deleted = false; + + this.log(` Deleting function: ${functionName} in region: ${region}`, "warn"); + + // Try Firebase CLI first + try { + await this.exec( + `firebase functions:delete ${functionName} --project ${projectId} --region ${region} --force`, + { silent: true } + ); + + // Verify the function was actually deleted + this.log(` Verifying deletion of ${functionName}...`, "info"); + try { + const listResult = await this.exec( + `firebase functions:list --project ${projectId}`, + { silent: true } + ); + + // Check if function still exists in the list + const functionStillExists = listResult.stdout.includes(functionName); + + if (!functionStillExists) { + this.log(` ✅ Verified: Function deleted via Firebase CLI: ${functionName}`, "success"); + deleted = true; + } else { + this.log(` ⚠️ Function still exists after Firebase CLI delete: ${functionName}`, "warn"); + } + } catch (listError) { + // If we can't list functions, assume deletion worked + this.log(` ✅ Deleted via Firebase CLI (unverified): ${functionName}`, "success"); + deleted = true; + } + } catch (firebaseError) { + this.log(` ⚠️ Firebase CLI delete failed for ${functionName}: ${firebaseError.message}`, "warn"); + } + + // If not deleted yet, try gcloud as fallback + if (!deleted) { + this.log(` Trying gcloud for: ${functionName}`, "warn"); + try { + await this.exec( + `gcloud functions delete ${functionName} 
--region=${region} --project=${projectId} --quiet`, + { silent: true } + ); + + // Verify deletion + try { + await this.exec( + `gcloud functions describe ${functionName} --region=${region} --project=${projectId}`, + { silent: true } + ); + // If describe succeeds, function still exists + this.log(` ⚠️ Function still exists after gcloud delete: ${functionName}`, "warn"); + } catch { + // If describe fails, function was deleted + this.log(` ✅ Deleted via gcloud: ${functionName}`, "success"); + deleted = true; + } + } catch (gcloudError) { + this.log(` ❌ Failed to delete: ${functionName}`, "error"); + this.log(` Gcloud error: ${gcloudError.message}`, "error"); + } + } + + if (!deleted) { + this.log(` ❌ Failed to delete function ${functionName} via any method`, "error"); + } + } catch (e) { + this.log(` ❌ Unexpected error deleting ${func}: ${e.message}`, "error"); + } + } + } else { + this.log(` ✅ No functions found in ${projectId}`, "success"); + } + } catch (e) { + // Project might not be accessible + this.log(` ⚠️ Could not list functions in ${projectId}: ${e.message}`, "warn"); + } + + // Clean up orphaned Cloud Tasks queues + await this.cleanupOrphanedCloudTasksQueues(suiteNames); + + // Clean up generated directory + if (existsSync(GENERATED_DIR)) { + this.log(" Cleaning up generated directory...", "warn"); + rmSync(GENERATED_DIR, { recursive: true, force: true }); + } + } + + /** + * Clean up orphaned Cloud Tasks queues from previous test runs + */ + async cleanupOrphanedCloudTasksQueues(suiteNames = []) { + this.log(" Checking for orphaned Cloud Tasks queues...", "warn"); + + const projectId = this.getProjectId(); + const region = DEFAULT_REGION; + this.log(` Checking Cloud Tasks queues in project: ${projectId}`, "warn"); + + try { + // List all queues in the project + const result = await this.exec( + `gcloud tasks queues list --location=${region} --project=${projectId} --format="value(name)"`, + { silent: true } + ); + + const queueNames = result.stdout + 
.split("\n") + .map((line) => line.trim()) + .filter((line) => line.length > 0); + + // Find test queues (containing "Tests" and test run ID pattern) + const testQueues = queueNames.filter((queueName) => { + const queueId = queueName.split("/").pop(); // Extract queue ID from full path + return queueId && queueId.match(/Tests.*t[a-z0-9]{7,10}/); + }); + + if (testQueues.length > 0) { + this.log( + ` Found ${testQueues.length} orphaned test queue(s) in ${projectId}. Cleaning up...`, + "warn" + ); + + for (const queuePath of testQueues) { + try { + const queueId = queuePath.split("/").pop(); + this.log(` Deleting orphaned queue: ${queueId}`, "warn"); + + await this.exec( + `gcloud tasks queues delete ${queueId} --location=${region} --project=${projectId} --quiet`, + { silent: true } + ); + this.log(` ✅ Deleted orphaned queue: ${queueId}`); + } catch (error) { + this.log(` ⚠️ Could not delete queue ${queuePath}: ${error.message}`, "warn"); + } + } + } else { + this.log(` ✅ No orphaned test queues found in ${projectId}`, "success"); + } + } catch (e) { + // Project might not be accessible or Cloud Tasks API not enabled + this.log(` ⚠️ Could not check queues in ${projectId}: ${e.message}`, "warn"); + } + } + + /** + * Run a single suite + */ + async runSuite(suiteName) { + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log(`🚀 Running suite: ${suiteName}`, "success"); + this.log("═══════════════════════════════════════════════════════════", "info"); + + try { + // Generate functions + const metadata = await this.generateFunctions([suiteName]); + + // Find this suite's specific projectId and region + const suiteMetadata = metadata.suites.find((s) => s.name === suiteName); + if (suiteMetadata) { + this.projectId = suiteMetadata.projectId || metadata.projectId; + this.region = suiteMetadata.region || metadata.region || DEFAULT_REGION; + this.log(` Using project: ${this.projectId}, region: ${this.region}`, "info"); + } + + // Build 
functions + await this.buildFunctions(); + + // Deploy functions + await this.deployFunctions(); + + // Wait for functions to become fully available + this.log("⏳ Waiting 30 seconds for functions to become fully available...", "info"); + await new Promise(resolve => setTimeout(resolve, 30000)); + + // Run tests + await this.runTests([suiteName]); + + this.results.passed.push(suiteName); + this.log(`✅ Suite ${suiteName} completed successfully`, "success"); + return true; + } catch (error) { + this.results.failed.push(suiteName); + this.log(`❌ Suite ${suiteName} failed: ${error.message}`, "error"); + return false; + } finally { + // Always run cleanup + await this.cleanup(); + } + } + + /** + * Run multiple suites sequentially + */ + async runSequential(suiteNames) { + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log("🚀 Starting Sequential Test Suite Execution", "success"); + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log(`📋 Test Run ID: ${this.testRunId}`, "success"); + this.log(`📝 Main log: ${this.logFile}`, "warn"); + this.log(`📁 Logs directory: ${LOGS_DIR}`, "warn"); + this.log(""); + + this.log(`📋 Running ${suiteNames.length} suite(s) sequentially:`, "success"); + for (const suite of suiteNames) { + this.log(` - ${suite}`); + } + this.log(""); + + // Clean up existing resources unless skipped (only for relevant projects) + if (!this.skipCleanup) { + await this.cleanupExistingResources(suiteNames); + } + + // SDK should be pre-packed (by Cloud Build or manually) + if (!this.usePublishedSDK) { + this.log("📦 Using pre-packed SDK for all suites...", "info"); + } + + // Run each suite + for (const suite of suiteNames) { + await this.runSuite(suite); + this.log(""); + } + + // Final summary + this.printSummary(); + } + + /** + * Run multiple suites in parallel + */ + async runParallel(suiteNames) { + this.log("═══════════════════════════════════════════════════════════", "info"); + 
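The parallel path that follows groups suites by their target project before building, deploying, and testing each group. A minimal standalone sketch of that grouping (`groupSuitesByProject` is an illustrative name; the metadata shape assumes what generate.js emits, with an optional per-suite `projectId` and a top-level fallback):

```javascript
// Group suite entries by projectId, falling back to the top-level default.
function groupSuitesByProject(metadata) {
  const byProject = {};
  for (const suite of metadata.suites || []) {
    const projectId = suite.projectId || metadata.projectId;
    (byProject[projectId] ||= []).push(suite.name);
  }
  return byProject;
}

// Example: two suites share the default project, one targets another.
const groups = groupSuitesByProject({
  projectId: "default-project",
  suites: [
    { name: "v2_firestore" },
    { name: "v2_database" },
    { name: "v2_storage", projectId: "other-project" },
  ],
});
// groups => { "default-project": ["v2_firestore", "v2_database"],
//             "other-project": ["v2_storage"] }
```

Each resulting group can then be deployed and tested with its own project context, as the runner does below.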
this.log("🚀 Running Test Suite(s)", "success"); + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log(`📋 Test Run ID: ${this.testRunId}`, "success"); + this.log(""); + + // First, generate functions to get metadata with projectIds + const metadata = await this.generateFunctions(suiteNames); + + // Group suites by projectId + const suitesByProject = {}; + for (const suite of metadata.suites) { + const projectId = suite.projectId || metadata.projectId; + if (!suitesByProject[projectId]) { + suitesByProject[projectId] = []; + } + suitesByProject[projectId].push(suite.name); + } + + const projectCount = Object.keys(suitesByProject).length; + if (projectCount > 1) { + this.log( + `📊 Found ${projectCount} different projects. Running each group separately:`, + "warn" + ); + for (const [projectId, suites] of Object.entries(suitesByProject)) { + this.log(` - ${projectId}: ${suites.join(", ")}`); + } + this.log(""); + + // Run each project group separately + for (const [projectId, projectSuites] of Object.entries(suitesByProject)) { + this.log(`🚀 Running suites for project: ${projectId}`, "info"); + + // Set project context for this group + this.projectId = projectId; + const suiteMetadata = metadata.suites.find((s) => projectSuites.includes(s.name)); + this.region = suiteMetadata?.region || metadata.region || DEFAULT_REGION; + + try { + // Build functions (already generated) + await this.buildFunctions(); + + // Deploy functions + await this.deployFunctions(); + + // Wait for functions to become fully available + this.log("⏳ Waiting 30 seconds for functions to become fully available...", "info"); + await new Promise(resolve => setTimeout(resolve, 30000)); + + // Run tests for this project's suites + await this.runTests(projectSuites); + + this.results.passed.push(...projectSuites); + } catch (error) { + this.results.failed.push(...projectSuites); + this.log(`❌ Tests failed for ${projectId}: ${error.message}`, "error"); + } + + // 
Cleanup after each project group + await this.cleanup(); + } + } else { + // All suites use the same project, run normally + try { + // Build functions + await this.buildFunctions(); + + // Deploy functions + await this.deployFunctions(); + + // Wait for functions to become fully available + this.log("⏳ Waiting 30 seconds for functions to become fully available...", "info"); + await new Promise(resolve => setTimeout(resolve, 30000)); + + // Run tests + await this.runTests(suiteNames); + + this.results.passed = suiteNames; + this.log("✅ All tests passed!", "success"); + } catch (error) { + this.results.failed = suiteNames; + this.log(`❌ Tests failed: ${error.message}`, "error"); + throw error; + } finally { + // Always run cleanup + await this.cleanup(); + } + } + } + + /** + * Print test results summary + */ + printSummary() { + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log("📊 Test Suite Summary", "success"); + this.log("═══════════════════════════════════════════════════════════", "info"); + this.log(`✅ Passed: ${this.results.passed.length} suite(s)`, "success"); + this.log(`❌ Failed: ${this.results.failed.length} suite(s)`, "error"); + + if (this.results.failed.length > 0) { + this.log(`Failed suites: ${this.results.failed.join(", ")}`, "error"); + this.log(`📝 Check main log: ${this.logFile}`, "warn"); + } else { + this.log("🎉 All suites passed!", "success"); + } + } +} + +/** + * Main CLI handler + */ +async function main() { + const args = process.argv.slice(2); + + // Parse command line arguments + const options = { + sequential: false, + saveArtifact: false, + skipCleanup: false, + filter: "", + exclude: "", + testRunId: null, + usePublishedSDK: null, + verbose: false, + cleanupOrphaned: false, + list: false, + help: false, + }; + + const suitePatterns = []; + + for (let i = 0; i < args.length; i++) { + const arg = args[i]; + + if (arg === "--help" || arg === "-h") { + options.help = true; + } else if (arg === 
"--list") { + options.list = true; + } else if (arg === "--sequential") { + options.sequential = true; + } else if (arg === "--save-artifact") { + options.saveArtifact = true; + } else if (arg === "--skip-cleanup") { + options.skipCleanup = true; + } else if (arg === "--verbose" || arg === "-v") { + options.verbose = true; + } else if (arg === "--cleanup-orphaned") { + options.cleanupOrphaned = true; + } else if (arg.startsWith("--filter=")) { + options.filter = arg.split("=")[1]; + } else if (arg.startsWith("--exclude=")) { + options.exclude = arg.split("=")[1]; + } else if (arg.startsWith("--test-run-id=")) { + options.testRunId = arg.split("=")[1]; + } else if (arg.startsWith("--use-published-sdk=")) { + options.usePublishedSDK = arg.split("=")[1]; + } else if (!arg.startsWith("-")) { + suitePatterns.push(arg); + } + } + + // Show help + if (options.help || (args.length === 0 && !options.list)) { + console.log(chalk.blue("Usage: node run-tests.js [suites...] [options]")); + console.log(""); + console.log("Examples:"); + console.log(" node run-tests.js v2_firestore # Single suite"); + console.log(" node run-tests.js v2_firestore v2_database # Multiple suites"); + console.log(' node run-tests.js "v2_*" # All v2 suites (pattern)'); + console.log(' node run-tests.js --sequential "v2_*" # Sequential execution'); + console.log(" node run-tests.js --filter=firestore # Filter suites"); + console.log(" node run-tests.js --list # List available suites"); + console.log(""); + console.log("Options:"); + console.log(" --sequential Run suites sequentially instead of in parallel"); + console.log(" --filter=PATTERN Only run suites matching pattern"); + console.log(" --exclude=PATTERN Skip suites matching pattern"); + console.log(" --test-run-id=ID Use specific TEST_RUN_ID"); + console.log( + " --use-published-sdk=VER Use published SDK version instead of local (default: use pre-packed local)" + ); + console.log(" --save-artifact Save test metadata for future cleanup"); + 
console.log(" --skip-cleanup Skip pre-run cleanup (sequential mode only)"); + console.log(" --verbose, -v Show detailed Firebase CLI output during deployment"); + console.log(" --cleanup-orphaned Clean up orphaned test functions and exit"); + console.log(" --list List all available suites"); + console.log(" --help, -h Show this help message"); + process.exit(0); + } + + // List suites + if (options.list) { + const runner = new TestRunner(); + const allSuites = runner.getAllSuites(); + + console.log(chalk.blue("\nAvailable test suites:")); + console.log(chalk.blue("─────────────────────")); + + const v1Suites = allSuites.filter((s) => s.startsWith("v1_")); + const v2Suites = allSuites.filter((s) => s.startsWith("v2_")); + + if (v1Suites.length > 0) { + console.log(chalk.green("\n📁 V1 Suites:")); + v1Suites.forEach((suite) => console.log(` - ${suite}`)); + } + + if (v2Suites.length > 0) { + console.log(chalk.green("\n📁 V2 Suites:")); + v2Suites.forEach((suite) => console.log(` - ${suite}`)); + } + + process.exit(0); + } + + // Create runner instance + const runner = new TestRunner(options); + + // Handle cleanup-orphaned option + if (options.cleanupOrphaned) { + console.log(chalk.blue("🧹 Cleaning up orphaned test functions...")); + await runner.cleanupExistingResources(); + console.log(chalk.green("✅ Orphaned function cleanup completed")); + process.exit(0); + } + + // Get filtered suite list + let suites; + if (suitePatterns.length === 0 && options.sequential) { + // No patterns specified in sequential mode, run all suites + suites = runner.getAllSuites(); + if (options.filter) { + suites = suites.filter((s) => s.includes(options.filter)); + } + if (options.exclude) { + suites = suites.filter((s) => !s.match(new RegExp(options.exclude))); + } + } else { + suites = runner.filterSuites(suitePatterns); + } + + if (suites.length === 0) { + console.log(chalk.red("❌ No test suites found matching criteria")); + process.exit(1); + } + + try { + // Run tests + if 
(options.sequential) { + await runner.runSequential(suites); + } else { + await runner.runParallel(suites); + } + + // Exit with appropriate code + process.exit(runner.results.failed.length > 0 ? 1 : 0); + } catch (error) { + console.error(chalk.red(`❌ Test execution failed: ${error.message}`)); + if (error.stack) { + console.error(chalk.gray(error.stack)); + } + process.exit(1); + } +} + +// Handle uncaught errors +process.on("unhandledRejection", (error) => { + console.error(chalk.red("❌ Unhandled error:"), error); + process.exit(1); +}); + +// Run main function +main(); diff --git a/integration_test/scripts/util.sh b/integration_test/scripts/util.sh new file mode 100755 index 0000000..cb33409 --- /dev/null +++ b/integration_test/scripts/util.sh @@ -0,0 +1,90 @@ +#!/bin/bash + +# util.sh - Common utility functions for integration tests + +# Colors for output +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +BLUE='\033[0;34m' +NC='\033[0m' # No Color + +# Default configuration +DEFAULT_MAX_RETRIES=3 +DEFAULT_BASE_DELAY=5 +DEFAULT_MAX_DELAY=60 +DEFAULT_TIMEOUT=300 + +# Exponential backoff with jitter +exponential_backoff() { + local attempt="$1" + local base_delay="$2" + local max_delay="$3" + + # Calculate delay: base_delay * 2^(attempt-1) + local delay=$((base_delay * (2 ** (attempt - 1)))) + + # Cap at max_delay + if [ $delay -gt $max_delay ]; then + delay=$max_delay + fi + + # Add jitter (±25% random variation); skip when jitter is 0 to avoid a modulo-by-zero error + local jitter=$((delay / 4)) + if [ $jitter -gt 0 ]; then + local random_jitter=$((RANDOM % (jitter * 2) - jitter)) + delay=$((delay + random_jitter)) + fi + + # Ensure minimum delay of 1 second + if [ $delay -lt 1 ]; then + delay=1 + fi + + echo $delay +} + +# Retry function with exponential backoff +retry_with_backoff() { + local max_attempts="${1:-$DEFAULT_MAX_RETRIES}" + local base_delay="${2:-$DEFAULT_BASE_DELAY}" + local max_delay="${3:-$DEFAULT_MAX_DELAY}" + local timeout="${4:-$DEFAULT_TIMEOUT}" + local attempt=1 + shift 4 + + while [ $attempt -le $max_attempts ]; do + 
echo -e "${YELLOW}🔄 Attempt $attempt of $max_attempts: $@${NC}" + + if timeout "${timeout}s" "$@"; then + echo -e "${GREEN}✅ Command succeeded${NC}" + return 0 + fi + + if [ $attempt -lt $max_attempts ]; then + local delay=$(exponential_backoff $attempt $base_delay $max_delay) + echo -e "${YELLOW}⚠️ Command failed. Retrying in ${delay} seconds...${NC}" + sleep $delay + fi + + attempt=$((attempt + 1)) + done + + echo -e "${RED}❌ Command failed after $max_attempts attempts${NC}" + return 1 +} + +# Logging functions +log_info() { + echo -e "${GREEN}[INFO]${NC} $1" +} + +log_warn() { + echo -e "${YELLOW}[WARN]${NC} $1" +} + +log_error() { + echo -e "${RED}[ERROR]${NC} $1" +} + +log_debug() { + echo -e "${BLUE}[DEBUG]${NC} $1" +} \ No newline at end of file diff --git a/integration_test/src/utils/logger.ts b/integration_test/src/utils/logger.ts new file mode 100644 index 0000000..69b96f9 --- /dev/null +++ b/integration_test/src/utils/logger.ts @@ -0,0 +1,165 @@ +import chalk from "chalk"; + +export enum LogLevel { + DEBUG = 0, + INFO = 1, + SUCCESS = 2, + WARNING = 3, + ERROR = 4, + NONE = 5, +} + +export class Logger { + private static instance: Logger; + private logLevel: LogLevel; + private useEmojis: boolean; + + private constructor(logLevel: LogLevel = LogLevel.INFO, useEmojis = true) { + this.logLevel = logLevel; + this.useEmojis = useEmojis; + } + + static getInstance(): Logger { + if (!Logger.instance) { + const level = process.env.LOG_LEVEL + ? 
LogLevel[process.env.LOG_LEVEL as keyof typeof LogLevel] ?? LogLevel.INFO // ?? (not ||) so LogLevel.DEBUG (0) is not discarded as falsy + : LogLevel.INFO; + Logger.instance = new Logger(level); + } + return Logger.instance; + } + + setLogLevel(level: LogLevel): void { + this.logLevel = level; + } + + private formatTimestamp(): string { + return new Date().toISOString().replace("T", " ").split(".")[0]; + } + + private shouldLog(level: LogLevel): boolean { + return level >= this.logLevel; + } + + debug(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.DEBUG)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "🔍" : "[DEBUG]"; + const formattedMsg = chalk.gray(`${prefix} ${message}`); + + console.log(`${timestamp} ${formattedMsg}`, ...args); + } + + info(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.INFO)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "ℹ️ " : "[INFO]"; + const formattedMsg = chalk.blue(`${prefix} ${message}`); + + console.log(`${timestamp} ${formattedMsg}`, ...args); + } + + success(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.SUCCESS)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "✅" : "[SUCCESS]"; + const formattedMsg = chalk.green(`${prefix} ${message}`); + + console.log(`${timestamp} ${formattedMsg}`, ...args); + } + + warning(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.WARNING)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "⚠️ " : "[WARN]"; + const formattedMsg = chalk.yellow(`${prefix} ${message}`); + + console.warn(`${timestamp} ${formattedMsg}`, ...args); + } + + error(message: string, error?: Error | any, ...args: any[]): void { + if (!this.shouldLog(LogLevel.ERROR)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? 
"❌" : "[ERROR]"; + const formattedMsg = chalk.red(`${prefix} ${message}`); + + if (error instanceof Error) { + console.error(`${timestamp} ${formattedMsg}`, ...args); + console.error(chalk.red(error.stack || error.message)); + } else if (error) { + console.error(`${timestamp} ${formattedMsg}`, error, ...args); + } else { + console.error(`${timestamp} ${formattedMsg}`, ...args); + } + } + + // Special contextual loggers for test harness + cleanup(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.INFO)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "🧹" : "[CLEANUP]"; + const formattedMsg = chalk.cyan(`${prefix} ${message}`); + + console.log(`${timestamp} ${formattedMsg}`, ...args); + } + + deployment(message: string, ...args: any[]): void { + if (!this.shouldLog(LogLevel.INFO)) return; + + const timestamp = chalk.gray(this.formatTimestamp()); + const prefix = this.useEmojis ? "🚀" : "[DEPLOY]"; + const formattedMsg = chalk.magenta(`${prefix} ${message}`); + + console.log(`${timestamp} ${formattedMsg}`, ...args); + } + + // Group related logs visually + group(title: string): void { + const line = chalk.gray("─".repeat(50)); + console.log(`\n${line}`); + console.log(chalk.bold.white(title)); + console.log(line); + } + + groupEnd(): void { + console.log(chalk.gray("─".repeat(50)) + "\n"); + } +} + +// Export singleton instance for convenience +export const logger = Logger.getInstance(); + +// Export legacy functions for backwards compatibility +export function logInfo(message: string): void { + logger.info(message); +} + +export function logError(message: string, error?: Error): void { + logger.error(message, error); +} + +export function logSuccess(message: string): void { + logger.success(message); +} + +export function logWarning(message: string): void { + logger.warning(message); +} + +export function logDebug(message: string): void { + logger.debug(message); +} + +export function 
logCleanup(message: string): void { + logger.cleanup(message); +} + +export function logDeployment(message: string): void { + logger.deployment(message); +} \ No newline at end of file diff --git a/integration_test/templates/firebase.json.hbs b/integration_test/templates/firebase.json.hbs new file mode 100644 index 0000000..a4b1475 --- /dev/null +++ b/integration_test/templates/firebase.json.hbs @@ -0,0 +1,15 @@ +{ + "functions": { + "source": "functions", + "codebase": "default", + "ignore": [ + "node_modules", + ".git", + "firebase-debug.log", + "firebase-debug.*.log" + ], + "predeploy": [ + "npm --prefix \"$RESOURCE_DIR\" run build" + ] + } +} \ No newline at end of file diff --git a/integration_test/templates/functions/firebase.json.hbs b/integration_test/templates/functions/firebase.json.hbs new file mode 100644 index 0000000..a11cddd --- /dev/null +++ b/integration_test/templates/functions/firebase.json.hbs @@ -0,0 +1,18 @@ +{ + "functions": [ + { + "source": "functions", + "codebase": "default", + "runtime": "python311", + "ignore": [ + "venv", + ".venv", + "__pycache__", + ".pytest_cache", + "*.pyc", + ".git", + "*.log" + ] + } + ] +} \ No newline at end of file diff --git a/integration_test/templates/functions/requirements.txt.hbs b/integration_test/templates/functions/requirements.txt.hbs new file mode 100644 index 0000000..37577dc --- /dev/null +++ b/integration_test/templates/functions/requirements.txt.hbs @@ -0,0 +1,13 @@ +# Firebase Functions Integration Tests - Python Dependencies +# Generated for test run: {{testRunId}} + +# Firebase Functions SDK (local build) +{{sdkPackage}} + +# Firebase Admin SDK +firebase-admin>=6.0.1 + +# Additional dependencies +{{#each dependencies}} +{{this}} +{{/each}} \ No newline at end of file diff --git a/integration_test/templates/functions/src/main.py.hbs b/integration_test/templates/functions/src/main.py.hbs new file mode 100644 index 0000000..1eff608 --- /dev/null +++ 
b/integration_test/templates/functions/src/main.py.hbs @@ -0,0 +1,21 @@ +""" +Firebase Functions Integration Tests - Generated Functions +Project ID: {{projectId}} +Test Run ID: {{testRunId}} +""" + +import os +from firebase_admin import initialize_app + +# Initialize admin SDK +project_id = os.environ.get('PROJECT_ID') or os.environ.get('GCLOUD_PROJECT') or "{{projectId}}" + +try: + initialize_app() +except Exception as error: + print(f"Admin SDK initialization skipped: {error}") + +# Import all generated test suites +{{#each suites}} +from {{this.version}}.{{this.service}}_tests import * # {{this.name}} +{{/each}} \ No newline at end of file diff --git a/integration_test/templates/functions/src/utils.py.hbs b/integration_test/templates/functions/src/utils.py.hbs new file mode 100644 index 0000000..4fdaadf --- /dev/null +++ b/integration_test/templates/functions/src/utils.py.hbs @@ -0,0 +1,36 @@ +""" +Utility functions for Firebase Functions integration tests +""" + +from typing import Any, Dict + + +def sanitize_data(context: Any) -> Dict[str, Any]: + """ + Sanitize context data for storage in Firestore. + Removes non-serializable fields and structures data appropriately. 
+ """ + result = {} + + # Extract basic context fields + if hasattr(context, 'event_id'): + result['eventId'] = context.event_id + if hasattr(context, 'timestamp'): + result['timestamp'] = context.timestamp + if hasattr(context, 'event_type'): + result['eventType'] = context.event_type + if hasattr(context, 'resource'): + result['resource'] = {'name': context.resource} + + # Add params if available + if hasattr(context, 'params'): + result['params'] = dict(context.params) if context.params else {} + + # Add auth context if available + if hasattr(context, 'auth') and context.auth: + result['auth'] = { + 'uid': context.auth.uid if hasattr(context.auth, 'uid') else None, + 'token': context.auth.token if hasattr(context.auth, 'token') else None + } + + return result \ No newline at end of file diff --git a/integration_test/templates/functions/src/v2/database_tests.py.hbs b/integration_test/templates/functions/src/v2/database_tests.py.hbs new file mode 100644 index 0000000..a8b8eb0 --- /dev/null +++ b/integration_test/templates/functions/src/v2/database_tests.py.hbs @@ -0,0 +1,59 @@ +""" +Realtime Database trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +import json +from firebase_admin import firestore +from firebase_functions import db_fn +from firebase_functions.core import Change +from typing import Any + +REGION = "{{region}}" + +{{#each functions}} +@db_fn.{{decorator}}( + reference="{{path}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: {{#if (or (eq trigger "onValueUpdated") (eq trigger "onValueWritten"))}}db_fn.Event[Change[Any | None]]{{else}}db_fn.Event[Any | None]{{/if}}) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Path: {{path}} + """ + test_id = event.params.get("testId") + + {{#if (eq trigger "onValueWritten")}} + # For onValueWritten trigger, check if it's a delete (cleanup event) + if isinstance(event.data, Change) and event.data.after is None: + print(f"Event for {test_id} is null; 
presuming data cleanup, so skipping.") + return + + {{/if}} + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time, + "type": f"google.firebase.database.ref.v1.{{eventType}}", + "source": event.source, + "resource": { + "name": event.reference + }, + "params": dict(event.params) if event.params else {}, + "url": event.reference + } + + {{#if (eq trigger "onValueUpdated")}} + # For onValueUpdated trigger, add the updated data + if isinstance(event.data, Change) and event.data.after is not None: + context_data["data"] = json.dumps(event.data.after) if event.data.after else None + {{/if}} + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/eventarc_tests.py.hbs b/integration_test/templates/functions/src/v2/eventarc_tests.py.hbs new file mode 100644 index 0000000..73fddad --- /dev/null +++ b/integration_test/templates/functions/src/v2/eventarc_tests.py.hbs @@ -0,0 +1,46 @@ +""" +Eventarc trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +import json +from firebase_admin import firestore +from firebase_functions import eventarc_fn + +REGION = "{{region}}" + +{{#each functions}} +@eventarc_fn.{{decorator}}( + event_type="{{eventType}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: eventarc_fn.CloudEvent) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Event Type: {{eventType}} + """ + # Extract test_id from event data + test_id = event.data.get('testId') if event.data else None + + if not test_id: + print(f"Warning: No testId found in event data") + return + + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time.isoformat() if hasattr(event.time, 'isoformat') else str(event.time), + "type": 
event.type, + "source": event.source, + # Stringify the data dict for storage (matching JS behavior) + "data": json.dumps(event.data) if event.data else "{}" + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/firestore_tests.py.hbs b/integration_test/templates/functions/src/v2/firestore_tests.py.hbs new file mode 100644 index 0000000..0a2224f --- /dev/null +++ b/integration_test/templates/functions/src/v2/firestore_tests.py.hbs @@ -0,0 +1,64 @@ +""" +Firestore trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +from firebase_admin import firestore +from firebase_functions import firestore_fn +from utils import sanitize_data + +REGION = "{{region}}" + +{{#each functions}} +@firestore_fn.{{decorator}}( + document="{{document}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: {{#if (or (eq trigger "onDocumentUpdated") (eq trigger "onDocumentWritten"))}}firestore_fn.Event[firestore_fn.Change[firestore_fn.DocumentSnapshot]]{{else}}firestore_fn.Event[firestore_fn.DocumentSnapshot]{{/if}}) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Document: {{document}} + """ + test_id = event.params.get("testId") + + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time, + "type": f"google.cloud.firestore.document.v1.{{eventType}}", + "source": event.source, + "resource": { + "name": f"projects/{event.project}/databases/{event.database}/documents/{event.document}" + }, + "params": dict(event.params) if event.params else {} + } + + {{#if (eq trigger "onDocumentWritten")}} + # For onDocumentWritten trigger, check if it's a delete + if hasattr(event.data, 'after') and event.data.after is None: + print(f"Event for {test_id} is null; presuming data 
cleanup, so skipping.") + return + + # Add document URL if available + if hasattr(event.data, 'after') and event.data.after: + context_data["url"] = event.data.after.reference.path + {{else if (eq trigger "onDocumentUpdated")}} + # For onDocumentUpdated trigger, add the updated data URL + if hasattr(event.data, 'after') and event.data.after: + context_data["url"] = event.data.after.reference.path + if event.data.after.to_dict(): + context_data["data"] = str(event.data.after.to_dict()) + {{else if (eq trigger "onDocumentCreated")}} + # For onDocumentCreated, add the document URL + if hasattr(event.data, 'reference'): + context_data["url"] = event.data.reference.path + {{/if}} + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} \ No newline at end of file diff --git a/integration_test/templates/functions/src/v2/identity_tests.py.hbs b/integration_test/templates/functions/src/v2/identity_tests.py.hbs new file mode 100644 index 0000000..887af5d --- /dev/null +++ b/integration_test/templates/functions/src/v2/identity_tests.py.hbs @@ -0,0 +1,46 @@ +""" +Identity trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +import os +from firebase_admin import firestore +from firebase_functions import identity_fn + +REGION = "{{region}}" + +{{#each functions}} +@identity_fn.{{decorator}}( + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: identity_fn.AuthBlockingEvent) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + """ + # Extract uid from event data + uid = event.data.uid + + # Get project ID from environment + project_id = os.environ.get('PROJECT_ID') or os.environ.get('GCLOUD_PROJECT') or "functions-integration-tests-v2" + + # Prepare context data for storage + context_data = { + "eventId": event.event_id, + "eventType": event.event_type, + 
"timestamp": event.timestamp.isoformat() if hasattr(event.timestamp, 'isoformat') else str(event.timestamp), + "resource": { + "name": f"projects/{project_id}" + } + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(uid).set(context_data) + + # Return None (no modifications to user) + return None + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/pubsub_tests.py.hbs b/integration_test/templates/functions/src/v2/pubsub_tests.py.hbs new file mode 100644 index 0000000..ab631a7 --- /dev/null +++ b/integration_test/templates/functions/src/v2/pubsub_tests.py.hbs @@ -0,0 +1,54 @@ +""" +Pub/Sub trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +import json +from firebase_admin import firestore +from firebase_functions import pubsub_fn + +REGION = "{{region}}" + +{{#each functions}} +@pubsub_fn.{{decorator}}( + topic="{{topic}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: pubsub_fn.CloudEvent[pubsub_fn.MessagePublishedData[dict]]) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Topic: {{topic}} + """ + # Extract test_id from message JSON data + message = event.data.message + message_json = message.json + test_id = message_json.get('testId') if message_json else None + + if not test_id: + print(f"Warning: No testId found in message") + return + + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time, + "type": f"google.cloud.pubsub.topic.v1.{{eventType}}", + "source": event.source, + # Stringify the message object for storage (matching JS behavior) + "message": json.dumps({ + "data": message.data, + "messageId": message.message_id, + "publishTime": message.publish_time, + "attributes": message.attributes, + "orderingKey": message.ordering_key, + }) + } + + # Store context in Firestore for verification + db = 
firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/remoteconfig_tests.py.hbs b/integration_test/templates/functions/src/v2/remoteconfig_tests.py.hbs new file mode 100644 index 0000000..5f687cd --- /dev/null +++ b/integration_test/templates/functions/src/v2/remoteconfig_tests.py.hbs @@ -0,0 +1,41 @@ +""" +Remote Config trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +from firebase_admin import firestore +from firebase_functions import remote_config_fn + +REGION = "{{region}}" + +{{#each functions}} +@remote_config_fn.{{decorator}}( + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: remote_config_fn.CloudEvent[remote_config_fn.ConfigUpdateData]) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + """ + # Extract test_id from event data description + test_id = event.data.description if event.data else None + + if not test_id: + print(f"Warning: No testId found in event data description") + return + + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time.isoformat() if hasattr(event.time, 'isoformat') else str(event.time), + "type": event.type, + "source": event.source, + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/scheduler_tests.py.hbs b/integration_test/templates/functions/src/v2/scheduler_tests.py.hbs new file mode 100644 index 0000000..1e76ab7 --- /dev/null +++ b/integration_test/templates/functions/src/v2/scheduler_tests.py.hbs @@ -0,0 +1,34 @@ +""" +Scheduler trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +from firebase_admin import 
firestore +from firebase_functions import scheduler_fn + +REGION = "{{region}}" + +{{#each functions}} +@scheduler_fn.{{decorator}}( + schedule="{{schedule}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: scheduler_fn.ScheduledEvent) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Schedule: {{schedule}} + """ + # Extract test_id from job_name + test_id = event.job_name + if not test_id: + print(f"Warning: No job_name found for scheduled function execution") + return + + # Store success flag in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set({"success": True}) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/storage_tests.py.hbs b/integration_test/templates/functions/src/v2/storage_tests.py.hbs new file mode 100644 index 0000000..b153013 --- /dev/null +++ b/integration_test/templates/functions/src/v2/storage_tests.py.hbs @@ -0,0 +1,40 @@ +""" +Storage trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +from firebase_admin import firestore +from firebase_functions import storage_fn + +REGION = "{{region}}" + +{{#each functions}} +@storage_fn.{{decorator}}( + bucket="{{bucket}}", + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: storage_fn.CloudEvent[storage_fn.StorageObjectData]) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + Bucket: {{bucket}} + """ + # Extract test_id from filename (assumes format: testId.txt) + filename = event.data.name + test_id = filename.replace('.txt', '') if filename.endswith('.txt') else filename + + # Prepare context data for storage + context_data = { + "id": event.id, + "time": event.time, + "type": f"google.cloud.storage.object.v1.{{eventType}}", + "source": event.source, + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = 
"{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/tasks_tests.py.hbs b/integration_test/templates/functions/src/v2/tasks_tests.py.hbs new file mode 100644 index 0000000..fe7fef3 --- /dev/null +++ b/integration_test/templates/functions/src/v2/tasks_tests.py.hbs @@ -0,0 +1,44 @@ +""" +Task Queue trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +import json +import uuid +from firebase_admin import firestore +from firebase_functions import tasks_fn +from firebase_functions.https_fn import CallableRequest + +REGION = "{{region}}" + +{{#each functions}} +@tasks_fn.{{decorator}}( + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(request: CallableRequest) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + """ + # Extract test_id from request data + data = request.data + if not data or not isinstance(data, dict) or 'testId' not in data: + print(f"Warning: Invalid data structure for tasks onTaskDispatched") + return + + test_id = data.get('testId') + + # Prepare context data for storage + # Generate a unique ID since CallableRequest doesn't have one + context_data = { + "id": str(uuid.uuid4()), + "data": json.dumps(data) if data else "{}", + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/templates/functions/src/v2/testlab_tests.py.hbs b/integration_test/templates/functions/src/v2/testlab_tests.py.hbs new file mode 100644 index 0000000..e33a0ef --- /dev/null +++ b/integration_test/templates/functions/src/v2/testlab_tests.py.hbs @@ -0,0 +1,42 @@ +""" +Test Lab trigger tests for v2 +Test Run ID: {{testRunId}} +""" + +from firebase_admin import firestore +from 
firebase_functions import test_lab_fn + +REGION = "{{region}}" + +{{#each functions}} +@test_lab_fn.{{decorator}}( + region=REGION, + timeout_sec={{timeout}} +) +def {{name}}{{../testRunId}}(event: test_lab_fn.CloudEvent[test_lab_fn.TestMatrixCompletedData]) -> None: + """ + Test function: {{name}} + Trigger: {{trigger}} + """ + # Extract test_id from client_info details + test_id = event.data.client_info.details.get('testId') if event.data.client_info else None + + if not test_id: + print(f"Warning: No testId found in client_info details") + return + + # Prepare context data for storage + context_data = { + "testId": test_id, + "id": event.id, + "time": event.time.isoformat() if hasattr(event.time, 'isoformat') else str(event.time), + "type": event.type, + "state": str(event.data.state), + } + + # Store context in Firestore for verification + db = firestore.client() + collection_name = "{{#if collection}}{{collection}}{{else}}{{name}}{{/if}}" + db.collection(collection_name).document(test_id).set(context_data) + +{{/each}} diff --git a/integration_test/tests/firebaseClientConfig.ts b/integration_test/tests/firebaseClientConfig.ts new file mode 100644 index 0000000..22b6fcd --- /dev/null +++ b/integration_test/tests/firebaseClientConfig.ts @@ -0,0 +1,39 @@ +/** + * Firebase Client SDK Configuration for Integration Tests + * + * This configuration is safe to expose publicly as Firebase client SDK + * configuration is designed to be public. Security comes from Firebase + * Security Rules, not config secrecy. 
+ */ + +export const FIREBASE_CLIENT_CONFIG = { + apiKey: "AIzaSyC1r437iUdYU33ecAdS3oUIF--cW8uk7Ek", + authDomain: "functions-integration-tests.firebaseapp.com", + databaseURL: "https://functions-integration-tests-default-rtdb.firebaseio.com", + projectId: "functions-integration-tests", + storageBucket: "functions-integration-tests.firebasestorage.app", + messagingSenderId: "488933414559", + appId: "1:488933414559:web:a64ddadca1b4ef4d40b4aa", + measurementId: "G-DS379RHF58", +}; + +export const FIREBASE_V2_CLIENT_CONFIG = { + apiKey: "AIzaSyCuJHyzpwIkQbxvJdKAzXg3sHUBOcTmsTI", + authDomain: "functions-integration-tests-v2.firebaseapp.com", + projectId: "functions-integration-tests-v2", + storageBucket: "functions-integration-tests-v2.firebasestorage.app", + messagingSenderId: "404926458259", + appId: "1:404926458259:web:eaab8474bc5a6833c66066", + measurementId: "G-D64JVJJSX7", +}; + +/** + * Get Firebase client config for a specific project + * Falls back to default config if project-specific config not found + */ +export function getFirebaseClientConfig(projectId?: string) { + if (projectId === "functions-integration-tests-v2") { + return FIREBASE_V2_CLIENT_CONFIG; + } + return FIREBASE_CLIENT_CONFIG; +} diff --git a/integration_test/tests/firebaseSetup.ts b/integration_test/tests/firebaseSetup.ts new file mode 100644 index 0000000..e6996e8 --- /dev/null +++ b/integration_test/tests/firebaseSetup.ts @@ -0,0 +1,40 @@ +import * as admin from "firebase-admin"; + +/** + * Initializes Firebase Admin SDK with project-specific configuration. 
+ */ +export function initializeFirebase(): admin.app.App { + if (admin.apps.length === 0) { + try { + const projectId = process.env.PROJECT_ID || "functions-integration-tests-v2"; + + // Python SDK only supports 2nd gen functions - use v2 project + const databaseURL = process.env.DATABASE_URL || + "https://functions-integration-tests-v2-default-rtdb.firebaseio.com/"; + const storageBucket = process.env.STORAGE_BUCKET || + "gs://functions-integration-tests-v2.firebasestorage.app"; + + // Check if we're in Cloud Build (ADC available) or local (need service account file) + let credential; + if (process.env.GOOGLE_APPLICATION_CREDENTIALS && process.env.GOOGLE_APPLICATION_CREDENTIALS !== '{}') { + // Use service account file if specified and not a dummy file + const serviceAccountPath = process.env.GOOGLE_APPLICATION_CREDENTIALS; + credential = admin.credential.cert(serviceAccountPath); + } else { + // Use Application Default Credentials (for Cloud Build) + credential = admin.credential.applicationDefault(); + } + + return admin.initializeApp({ + credential: credential, + databaseURL: databaseURL, + storageBucket: storageBucket, + projectId: projectId, + }); + } catch (error) { + console.error("Error initializing Firebase:", error); + console.error("PROJECT_ID:", process.env.PROJECT_ID); + } + } + return admin.app(); +} diff --git a/integration_test/tests/utils.ts b/integration_test/tests/utils.ts new file mode 100644 index 0000000..5a544aa --- /dev/null +++ b/integration_test/tests/utils.ts @@ -0,0 +1,191 @@ +import { CloudTasksClient } from "@google-cloud/tasks"; +import * as admin from "firebase-admin"; + +export const timeout = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms)); + +type RetryOptions = { maxRetries?: number; checkForUndefined?: boolean }; + +/** + * @template T + * @param {() => Promise<T>} fn + * @param {RetryOptions | undefined} [options={ maxRetries: 20, checkForUndefined: true }] + * + * @returns {Promise<T>} + */ +export async function 
retry<T>(fn: () => Promise<T>, options?: RetryOptions): Promise<T> { + let count = 0; + let lastError: Error | undefined; + const { maxRetries = 20, checkForUndefined = true } = options ?? {}; + let result: Awaited<T> | null = null; + + while (count < maxRetries) { + try { + result = await fn(); + if (!checkForUndefined || result) { + return result; + } + } catch (e) { + lastError = e as Error; + } + await timeout(5000); + count++; + } + + if (lastError) { + throw lastError; + } + + throw new Error(`Max retries exceeded: result = ${result}`); +} + +export async function createTask( + project: string, + queue: string, + location: string, + url: string, + payload: Record<string, unknown> +): Promise<string> { + const client = new CloudTasksClient(); + const parent = client.queuePath(project, location, queue); + + // Try to get service account email from various sources + let serviceAccountEmail: string; + + // First, check if we have a service account file + const serviceAccountPath = process.env.GOOGLE_APPLICATION_CREDENTIALS; + if (serviceAccountPath && serviceAccountPath !== '{}') { + try { + const serviceAccount = await import(serviceAccountPath); + serviceAccountEmail = serviceAccount.client_email; + } catch (e) { + // Fall back to using project default service account + serviceAccountEmail = `${project}@appspot.gserviceaccount.com`; + } + } else { + // Use project's default App Engine service account when using ADC + // This is what Cloud Build and other Google Cloud services will use + serviceAccountEmail = `${project}@appspot.gserviceaccount.com`; + } + + const task = { + httpRequest: { + httpMethod: "POST" as const, + url, + oidcToken: { + serviceAccountEmail, + }, + headers: { + "Content-Type": "application/json", + }, + body: Buffer.from(JSON.stringify(payload)).toString("base64"), + }, + }; + + const [response] = await client.createTask({ parent, task }); + if (!response) { + throw new Error("Unable to create task"); + } + return response.name || ""; +} + +// TestLab utilities +const 
TESTING_API_SERVICE_NAME = "testing.googleapis.com"; + +interface AndroidDevice { + androidModelId: string; + androidVersionId: string; + locale: string; + orientation: string; +} + +export async function startTestRun(projectId: string, testId: string, accessToken: string) { + const device = await fetchDefaultDevice(accessToken); + return await createTestMatrix(accessToken, projectId, testId, device); +} + +async function fetchDefaultDevice(accessToken: string): Promise<AndroidDevice> { + const resp = await fetch( + `https://${TESTING_API_SERVICE_NAME}/v1/testEnvironmentCatalog/androidDeviceCatalog`, + { + headers: { + Authorization: `Bearer ${accessToken}`, + }, + } + ); + if (!resp.ok) { + throw new Error(resp.statusText); + } + const data = (await resp.json()) as any; + const models = data?.androidDeviceCatalog?.models || []; + const defaultModels = models.filter( + (m: any) => + m.tags !== undefined && + m.tags.indexOf("default") > -1 && + m.supportedVersionIds !== undefined && + m.supportedVersionIds.length > 0 + ); + + if (defaultModels.length === 0) { + throw new Error("No default device found"); + } + + const model = defaultModels[0]; + const versions = model.supportedVersionIds; + + return { + androidModelId: model.id, + androidVersionId: versions[versions.length - 1], + locale: "en", + orientation: "portrait", + }; +} + +async function createTestMatrix( + accessToken: string, + projectId: string, + testId: string, + device: AndroidDevice +): Promise<void> { + const body = { + projectId, + testSpecification: { + androidRoboTest: { + appApk: { + gcsPath: "gs://path/to/non-existing-app.apk", + }, + }, + }, + environmentMatrix: { + androidDeviceList: { + androidDevices: [device], + }, + }, + resultStorage: { + googleCloudStorage: { + gcsPath: "gs://" + admin.storage().bucket().name, + }, + }, + clientInfo: { + name: "CloudFunctionsSDKIntegrationTest", + clientInfoDetails: { + key: "testId", + value: testId, + }, + }, + }; + const resp = await fetch( + 
`https://${TESTING_API_SERVICE_NAME}/v1/projects/${projectId}/testMatrices`, + { + method: "POST", + headers: { + Authorization: `Bearer ${accessToken}`, + "Content-Type": "application/json", + }, + body: JSON.stringify(body), + } + ); + if (!resp.ok) { + throw new Error(resp.statusText); + } + return; +} diff --git a/integration_test/tests/v2/database.test.ts b/integration_test/tests/v2/database.test.ts new file mode 100644 index 0000000..1c11d47 --- /dev/null +++ b/integration_test/tests/v2/database.test.ts @@ -0,0 +1,214 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { initializeFirebase } from "../firebaseSetup"; +import { Reference } from "@firebase/database-types"; +import { logger } from "../../src/utils/logger"; + +describe("Firebase Database (v2)", () => { + const projectId = process.env.PROJECT_ID; + const testId = process.env.TEST_RUN_ID; + + if (!testId || !projectId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + console.log("🧹 Cleaning up test data..."); + const collectionsToClean = [ + "databaseCreatedTests", + "databaseDeletedTests", + "databaseUpdatedTests", + "databaseWrittenTests", + ]; + + for (const collection of collectionsToClean) { + try { + await admin.firestore().collection(collection).doc(testId).delete(); + console.log(`🗑️ Deleted test document: ${collection}/${testId}`); + } catch (error) { + console.log(`ℹ️ No test document to delete: ${collection}/${testId}`); + } + } + }); + + async function setupRef(refPath: string) { + const ref = admin.database().ref(refPath); + await ref.set({ ".sv": "timestamp" }); + return ref; + } + + async function teardownRef(ref: Reference) { + if (ref) { + try { + await ref.remove(); + } catch (err) { + logger.error("Teardown error", err); + } + } + } + + async function getLoggedContext(collectionName: string, testId: string) { + return retry(() => + admin + 
.firestore() + .collection(collectionName) + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + } + + describe("created trigger", () => { + let ref: Reference; + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + ref = await setupRef(`databaseCreatedTests/${testId}/start`); + loggedContext = await getLoggedContext("databaseCreatedTests", testId); + }); + + afterAll(async () => { + await teardownRef(ref); + }); + + it("should give refs access to admin data", async () => { + await ref.parent?.child("adminOnly").update({ allowed: 1 }); + + const adminDataSnapshot = await ref.parent?.child("adminOnly").once("value"); + const adminData = adminDataSnapshot?.val(); + + expect(adminData).toEqual({ allowed: 1 }); + }); + + it("should have a correct ref url", () => { + expect(loggedContext?.url).toMatch(`databaseCreatedTests/${testId}/start`); + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.firebase.database.ref.v1.created"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + }); + + describe("deleted trigger", () => { + let ref: Reference; + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + ref = await setupRef(`databaseDeletedTests/${testId}/start`); + await teardownRef(ref); + loggedContext = await getLoggedContext("databaseDeletedTests", testId); + }); + + it("should have a correct ref url", () => { + expect(loggedContext?.url).toMatch(`databaseDeletedTests/${testId}/start`); + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.firebase.database.ref.v1.deleted"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); 
+ }); + }); + + describe("updated trigger", () => { + let ref: Reference; + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + ref = await setupRef(`databaseUpdatedTests/${testId}/start`); + await ref.update({ updated: true }); + loggedContext = await getLoggedContext("databaseUpdatedTests", testId); + }); + + afterAll(async () => { + await teardownRef(ref); + }); + + it("should give refs access to admin data", async () => { + await ref.parent?.child("adminOnly").update({ allowed: 1 }); + + const adminDataSnapshot = await ref.parent?.child("adminOnly").once("value"); + const adminData = adminDataSnapshot?.val(); + + expect(adminData).toEqual({ allowed: 1 }); + }); + + it("should have a correct ref url", () => { + expect(loggedContext?.url).toMatch(`databaseUpdatedTests/${testId}/start`); + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.firebase.database.ref.v1.updated"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have updated data", () => { + const parsedData = JSON.parse(loggedContext?.data ?? 
"{}"); + expect(parsedData).toEqual({ updated: true }); + }); + }); + + describe("written trigger", () => { + let ref: Reference; + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + ref = await setupRef(`databaseWrittenTests/${testId}/start`); + loggedContext = await getLoggedContext("databaseWrittenTests", testId); + }); + + afterAll(async () => { + await teardownRef(ref); + }); + + it("should give refs access to admin data", async () => { + await ref.parent?.child("adminOnly").update({ allowed: 1 }); + + const adminDataSnapshot = await ref.parent?.child("adminOnly").once("value"); + const adminData = adminDataSnapshot?.val(); + + expect(adminData).toEqual({ allowed: 1 }); + }); + + it("should have a correct ref url", () => { + expect(loggedContext?.url).toMatch(`databaseWrittenTests/${testId}/start`); + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.firebase.database.ref.v1.written"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + }); +}); diff --git a/integration_test/tests/v2/eventarc.test.ts b/integration_test/tests/v2/eventarc.test.ts new file mode 100644 index 0000000..967ab1b --- /dev/null +++ b/integration_test/tests/v2/eventarc.test.ts @@ -0,0 +1,69 @@ +import * as admin from "firebase-admin"; +import { initializeFirebase } from "../firebaseSetup"; +import { CloudEvent, getEventarc } from "firebase-admin/eventarc"; +import { retry } from "../utils"; + +describe("Eventarc (v2)", () => { + const projectId = process.env.PROJECT_ID || "functions-integration-tests-v2"; + const testId = process.env.TEST_RUN_ID; + const region = process.env.REGION || "us-central1"; + + if (!testId || !projectId || !region) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + 
afterAll(async () => { + await admin.firestore().collection("eventarcOnCustomEventPublishedTests").doc(testId).delete(); + }); + + describe("onCustomEventPublished trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const cloudEvent: CloudEvent = { + type: "achieved-leaderboard", + source: testId, + subject: "Welcome to the top 10", + data: { + message: "You have achieved the nth position in our leaderboard! To see...", + testId, + }, + }; + await getEventarc().channel(`locations/${region}/channels/firebase`).publish(cloudEvent); + + loggedContext = await retry(() => + admin + .firestore() + .collection("eventarcOnCustomEventPublishedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should have well-formed source", () => { + expect(loggedContext?.source).toMatch(testId); + }); + + it("should have the correct type", () => { + expect(loggedContext?.type).toEqual("achieved-leaderboard"); + }); + + it("should have an id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have the data", () => { + const eventData = JSON.parse(loggedContext?.data || "{}"); + expect(eventData.testId).toBeDefined(); + }); + }); +}); diff --git a/integration_test/tests/v2/firestore.test.ts b/integration_test/tests/v2/firestore.test.ts new file mode 100644 index 0000000..94e790b --- /dev/null +++ b/integration_test/tests/v2/firestore.test.ts @@ -0,0 +1,228 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { initializeFirebase } from "../firebaseSetup"; + +describe("Cloud Firestore (v2)", () => { + const projectId = process.env.PROJECT_ID; + const testId = process.env.TEST_RUN_ID; + + if (!testId || !projectId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + 
afterAll(async () => { + await admin.firestore().collection("firestoreOnDocumentCreatedTests").doc(testId).delete(); + await admin.firestore().collection("firestoreOnDocumentDeletedTests").doc(testId).delete(); + await admin.firestore().collection("firestoreOnDocumentUpdatedTests").doc(testId).delete(); + await admin.firestore().collection("firestoreOnDocumentWrittenTests").doc(testId).delete(); + }); + + describe("Document created trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let dataSnapshot: admin.firestore.DocumentSnapshot; + let docRef: admin.firestore.DocumentReference; + + beforeAll(async () => { + docRef = admin.firestore().collection("tests").doc(testId); + await docRef.set({ test: testId }); + dataSnapshot = await docRef.get(); + + loggedContext = await retry(() => + admin + .firestore() + .collection("firestoreOnDocumentCreatedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should not have event.app", () => { + expect(loggedContext?.app).toBeUndefined(); + }); + + it("should give refs access to admin data", async () => { + const result = await docRef.set({ allowed: 1 }, { merge: true }); + expect(result).toBeTruthy(); + }); + + it("should have well-formed resource", () => { + expect(loggedContext?.source).toMatch( + `//firestore.googleapis.com/projects/${projectId}/databases/(default)` + ); + }); + + it("should have the correct type", () => { + expect(loggedContext?.type).toEqual("google.cloud.firestore.document.v1.created"); + }); + + it("should have an id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have the correct data", () => { + expect(dataSnapshot.data()).toEqual({ test: testId }); + }); + }); + + describe("Document deleted trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let dataSnapshot: 
admin.firestore.DocumentSnapshot; + let docRef: admin.firestore.DocumentReference; + + beforeAll(async () => { + docRef = admin.firestore().collection("tests").doc(testId); + await docRef.set({ test: testId }); + dataSnapshot = await docRef.get(); + + await docRef.delete(); + + // Refresh snapshot + dataSnapshot = await docRef.get(); + + loggedContext = await retry(() => + admin + .firestore() + .collection("firestoreOnDocumentDeletedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should not have event.app", () => { + expect(loggedContext?.app).toBeUndefined(); + }); + + it("should have well-formed source", () => { + expect(loggedContext?.source).toMatch( + `//firestore.googleapis.com/projects/${projectId}/databases/(default)` + ); + }); + + it("should have the correct type", () => { + expect(loggedContext?.type).toEqual("google.cloud.firestore.document.v1.deleted"); + }); + + it("should have an id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should not have the data", () => { + expect(dataSnapshot.data()).toBeUndefined(); + }); + }); + + describe("Document updated trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let docRef: admin.firestore.DocumentReference; + + beforeAll(async () => { + docRef = admin.firestore().collection("tests").doc(testId); + await docRef.set({}); + + await docRef.update({ test: testId }); + + loggedContext = await retry(() => + admin + .firestore() + .collection("firestoreOnDocumentUpdatedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should not have event.app", () => { + expect(loggedContext?.app).toBeUndefined(); + }); + + it("should have well-formed resource", () => { + expect(loggedContext?.source).toMatch( + `//firestore.googleapis.com/projects/${projectId}/databases/(default)` + ); + }); + + 
it("should have the correct type", () => { + expect(loggedContext?.type).toEqual("google.cloud.firestore.document.v1.updated"); + }); + + it("should have an id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have the correct data", async () => { + // Retry getting the data snapshot to ensure the function has processed + const finalSnapshot = await retry(() => docRef.get()); + expect(finalSnapshot.data()).toStrictEqual({ test: testId }); + }); + }); + + describe("Document written trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let dataSnapshot: admin.firestore.DocumentSnapshot; + let docRef: admin.firestore.DocumentReference; + + beforeAll(async () => { + docRef = admin.firestore().collection("tests").doc(testId); + await docRef.set({ test: testId }); + dataSnapshot = await docRef.get(); + + loggedContext = await retry(() => + admin + .firestore() + .collection("firestoreOnDocumentWrittenTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should not have event.app", () => { + expect(loggedContext?.app).toBeUndefined(); + }); + + it("should give refs access to admin data", async () => { + const result = await docRef.set({ allowed: 1 }, { merge: true }); + expect(result).toBeTruthy(); + }); + + it("should have well-formed resource", () => { + expect(loggedContext?.source).toMatch( + `//firestore.googleapis.com/projects/${projectId}/databases/(default)` + ); + }); + + it("should have the correct type", () => { + expect(loggedContext?.type).toEqual("google.cloud.firestore.document.v1.written"); + }); + + it("should have an id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have a time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have the correct data", () => { + expect(dataSnapshot.data()).toEqual({ test: testId }); + }); + 
}); +}); diff --git a/integration_test/tests/v2/identity.test.ts b/integration_test/tests/v2/identity.test.ts new file mode 100644 index 0000000..77ae0bd --- /dev/null +++ b/integration_test/tests/v2/identity.test.ts @@ -0,0 +1,133 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { initializeApp } from "firebase/app"; +import { initializeFirebase } from "../firebaseSetup"; +import { getAuth, createUserWithEmailAndPassword, UserCredential } from "firebase/auth"; +import { getFirebaseClientConfig } from "../firebaseClientConfig"; + +interface IdentityEventContext { + eventId: string; + eventType: string; + timestamp: string; + resource: { + name: string; + }; +} + +describe("Firebase Identity (v2)", () => { + const userIds: string[] = []; + const projectId = process.env.PROJECT_ID || "functions-integration-tests-v2"; + const testId = process.env.TEST_RUN_ID; + // Use hardcoded Firebase client config (safe to expose publicly) + const config = getFirebaseClientConfig(projectId); + const app = initializeApp(config); + + if (!testId || !projectId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + for (const userId of userIds) { + await admin.firestore().collection("userProfiles").doc(userId).delete(); + await admin.firestore().collection("authUserOnCreateTests").doc(userId).delete(); + await admin.firestore().collection("authUserOnDeleteTests").doc(userId).delete(); + await admin.firestore().collection("identityBeforeUserCreatedTests").doc(userId).delete(); + await admin.firestore().collection("identityBeforeUserSignedInTests").doc(userId).delete(); + } + }); + describe("beforeUserCreated trigger", () => { + let userRecord: UserCredential; + let loggedContext: IdentityEventContext | undefined; + + beforeAll(async () => { + userRecord = await createUserWithEmailAndPassword( + getAuth(app), + `${testId}@fake-create.com`, + "secret" + ); + + 
userIds.push(userRecord.user.uid); + + loggedContext = await retry(() => + admin + .firestore() + .collection("identityBeforeUserCreatedTests") + .doc(userRecord.user.uid) + .get() + .then((logSnapshot) => logSnapshot.data() as IdentityEventContext | undefined) + ); + }); + + afterAll(async () => { + await admin.auth().deleteUser(userRecord.user.uid); + }); + + it("should have a project as resource", () => { + expect(loggedContext?.resource.name).toMatch(`projects/${projectId}`); + }); + + it("should have the correct eventType", () => { + expect(loggedContext?.eventType).toEqual( + "providers/cloud.auth/eventTypes/user.beforeCreate:password" + ); + }); + + it("should have an eventId", () => { + expect(loggedContext?.eventId).toBeDefined(); + }); + + it("should have a timestamp", () => { + expect(loggedContext?.timestamp).toBeDefined(); + }); + }); + + describe("identityBeforeUserSignedInTests trigger", () => { + let userRecord: UserCredential; + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + userRecord = await createUserWithEmailAndPassword( + getAuth(app), + `${testId}@fake-before-signin.com`, + "secret" + ); + + userIds.push(userRecord.user.uid); + + loggedContext = await retry(() => + admin + .firestore() + .collection("identityBeforeUserSignedInTests") + .doc(userRecord.user.uid) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + afterAll(async () => { + await admin.auth().deleteUser(userRecord.user.uid); + }); + + it("should have a project as resource", () => { + expect(loggedContext?.resource.name).toMatch(`projects/${projectId}`); + }); + + it("should have the correct eventType", () => { + expect(loggedContext?.eventType).toEqual( + "providers/cloud.auth/eventTypes/user.beforeSignIn:password" + ); + }); + + it("should have an eventId", () => { + expect(loggedContext?.eventId).toBeDefined(); + }); + + it("should have a timestamp", () => { + expect(loggedContext?.timestamp).toBeDefined(); + }); + 
}); +}); diff --git a/integration_test/tests/v2/pubsub.test.ts b/integration_test/tests/v2/pubsub.test.ts new file mode 100644 index 0000000..59609ac --- /dev/null +++ b/integration_test/tests/v2/pubsub.test.ts @@ -0,0 +1,81 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { PubSub } from "@google-cloud/pubsub"; +import { initializeFirebase } from "../firebaseSetup"; + +describe("Pub/Sub (v2)", () => { + const projectId = process.env.PROJECT_ID; + const testId = process.env.TEST_RUN_ID; + const region = process.env.REGION; + const serviceAccountPath = process.env.GOOGLE_APPLICATION_CREDENTIALS; + + if (!testId || !projectId || !region) { + throw new Error("Environment configured incorrectly."); + } + + if (!serviceAccountPath) { + console.warn("GOOGLE_APPLICATION_CREDENTIALS not set, skipping Pub/Sub tests"); + describe.skip("Pub/Sub (v2)", () => { + it("skipped due to missing credentials", () => { + expect(true).toBe(true); + }); + }); + return; + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("pubsubOnMessagePublishedTests").doc(testId).delete(); + }); + + describe("onMessagePublished trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const serviceAccount = await import(serviceAccountPath); + const topic = new PubSub({ + credentials: serviceAccount.default, + projectId, + }).topic("custom_message_tests"); + + await topic.publish(Buffer.from(JSON.stringify({ testId }))); + + loggedContext = await retry(() => + admin + .firestore() + .collection("pubsubOnMessagePublishedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should have a topic as source", () => { + expect(loggedContext?.source).toEqual( + `//pubsub.googleapis.com/projects/${projectId}/topics/custom_message_tests` + ); + }); + + it("should have the correct event type", () => { + 
expect(loggedContext?.type).toEqual("google.cloud.pubsub.topic.v1.messagePublished"); + }); + + it("should have an event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + + it("should have pubsub data", () => { + const decodedMessage = JSON.parse(loggedContext?.message); + const decoded = Buffer.from(decodedMessage.data, "base64").toString(); + const parsed = JSON.parse(decoded); + expect(parsed.testId).toEqual(testId); + }); + }); +}); diff --git a/integration_test/tests/v2/remoteConfig.test.ts b/integration_test/tests/v2/remoteConfig.test.ts new file mode 100644 index 0000000..c5379c7 --- /dev/null +++ b/integration_test/tests/v2/remoteConfig.test.ts @@ -0,0 +1,81 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { initializeFirebase } from "../firebaseSetup"; + +describe("Firebase Remote Config (v2)", () => { + const projectId = process.env.PROJECT_ID; + const testId = process.env.TEST_RUN_ID; + + if (!testId || !projectId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("remoteConfigOnConfigUpdatedTests").doc(testId).delete(); + }); + + describe("onUpdated trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let shouldSkip = false; + + beforeAll(async () => { + try { + const accessToken = await admin.credential.applicationDefault().getAccessToken(); + const resp = await fetch( + `https://firebaseremoteconfig.googleapis.com/v1/projects/${projectId}/remoteConfig`, + { + method: "PUT", + headers: { + Authorization: `Bearer ${accessToken.access_token}`, + "Content-Type": "application/json; UTF-8", + "Accept-Encoding": "gzip", + "If-Match": "*", + }, + body: JSON.stringify({ version: { description: testId } }), + } + ); + if (!resp.ok) { + throw new Error(resp.statusText); 
+ } + + loggedContext = await retry(() => + admin + .firestore() + .collection("remoteConfigOnConfigUpdatedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + } catch (error) { + console.warn("RemoteConfig API access failed, skipping test:", (error as Error).message); + shouldSkip = true; + } + }); + + it("should have the right event type", () => { + if (shouldSkip) { + return; + } + // TODO: not sure if the nested remoteconfig.remoteconfig is expected? + expect(loggedContext?.type).toEqual("google.firebase.remoteconfig.remoteConfig.v1.updated"); + }); + + it("should have event id", () => { + if (shouldSkip) { + return; // Skip test when API not available + } + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have time", () => { + if (shouldSkip) { + return; // Skip test when API not available + } + expect(loggedContext?.time).toBeDefined(); + }); + }); +}); diff --git a/integration_test/tests/v2/scheduler.test.ts b/integration_test/tests/v2/scheduler.test.ts new file mode 100644 index 0000000..8b7cbf8 --- /dev/null +++ b/integration_test/tests/v2/scheduler.test.ts @@ -0,0 +1,56 @@ +import * as admin from "firebase-admin"; +import { retry } from "../utils"; +import { initializeFirebase } from "../firebaseSetup"; + +describe("Scheduler", () => { + const projectId = process.env.PROJECT_ID; + const region = process.env.REGION; + const testId = process.env.TEST_RUN_ID; + + if (!testId || !projectId || !region) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("schedulerOnScheduleV2Tests").doc(testId).delete(); + }); + + describe("onSchedule trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const accessToken = await admin.credential.applicationDefault().getAccessToken(); + const jobName = 
`firebase-schedule-${testId}-v2-schedule-${region}`; + const response = await fetch( + `https://cloudscheduler.googleapis.com/v1/projects/${projectId}/locations/us-central1/jobs/${jobName}:run`, + { + method: "POST", + headers: { + "Content-Type": "application/json", + Authorization: `Bearer ${accessToken.access_token}`, + }, + } + ); + if (!response.ok) { + throw new Error(`Failed request with status ${response.status}!`); + } + + loggedContext = await retry(() => + admin + .firestore() + .collection("schedulerOnScheduleV2Tests") + .doc(jobName) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should trigger when the scheduler fires", () => { + expect(loggedContext?.success).toBeTruthy(); + }); + }); +}); diff --git a/integration_test/tests/v2/storage.test.ts b/integration_test/tests/v2/storage.test.ts new file mode 100644 index 0000000..765eb24 --- /dev/null +++ b/integration_test/tests/v2/storage.test.ts @@ -0,0 +1,167 @@ +import * as admin from "firebase-admin"; +import { initializeFirebase } from "../firebaseSetup"; +import { retry, timeout } from "../utils"; + +async function uploadBufferToFirebase(buffer: Buffer, fileName: string) { + const bucket = admin.storage().bucket(); + + const file = bucket.file(fileName); + await file.save(buffer, { + metadata: { + contentType: "text/plain", + }, + }); +} + +describe("Firebase Storage (v2)", () => { + const testId = process.env.TEST_RUN_ID; + + if (!testId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("storageOnObjectFinalizedTests").doc(testId).delete(); + await admin.firestore().collection("storageOnObjectDeletedTests").doc(testId).delete(); + await admin.firestore().collection("storageOnObjectMetadataUpdatedTests").doc(testId).delete(); + }); + + describe("onObjectFinalized trigger", () => { + let loggedContext: 
admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const testContent = testId; + const buffer = Buffer.from(testContent, "utf-8"); + + await uploadBufferToFirebase(buffer, testId + ".txt"); + + loggedContext = await retry(() => + admin + .firestore() + .collection("storageOnObjectFinalizedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + afterAll(async () => { + const file = admin + .storage() + .bucket() + .file(testId + ".txt"); + + const [exists] = await file.exists(); + if (exists) { + await file.delete(); + } + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.cloud.storage.object.v1.finalized"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + }); + + describe("onDeleted trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const testContent = testId; + const buffer = Buffer.from(testContent, "utf-8"); + + await uploadBufferToFirebase(buffer, testId + ".txt"); + + await timeout(5000); // Short delay before delete + + const file = admin + .storage() + .bucket() + .file(testId + ".txt"); + await file.delete(); + + loggedContext = await retry(() => + admin + .firestore() + .collection("storageOnObjectDeletedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.cloud.storage.object.v1.deleted"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + }); + + describe("onMetadataUpdated trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const testContent 
= testId; + const buffer = Buffer.from(testContent, "utf-8"); + + await uploadBufferToFirebase(buffer, testId + ".txt"); + + // Trigger metadata update + const file = admin + .storage() + .bucket() + .file(testId + ".txt"); + await file.setMetadata({ contentType: "application/json" }); + + loggedContext = await retry(() => + admin + .firestore() + .collection("storageOnObjectMetadataUpdatedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + afterAll(async () => { + const file = admin + .storage() + .bucket() + .file(testId + ".txt"); + + const [exists] = await file.exists(); + if (exists) { + await file.delete(); + } + }); + + it("should have the right event type", () => { + expect(loggedContext?.type).toEqual("google.cloud.storage.object.v1.metadataUpdated"); + }); + + it("should have event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have time", () => { + expect(loggedContext?.time).toBeDefined(); + }); + }); +}); diff --git a/integration_test/tests/v2/tasks.test.ts b/integration_test/tests/v2/tasks.test.ts new file mode 100644 index 0000000..2af8768 --- /dev/null +++ b/integration_test/tests/v2/tasks.test.ts @@ -0,0 +1,56 @@ +import * as admin from "firebase-admin"; +import { initializeFirebase } from "../firebaseSetup"; +import { createTask, retry } from "../utils"; + +describe("Cloud Tasks (v2)", () => { + const region = process.env.REGION; + const testId = process.env.TEST_RUN_ID; + const projectId = process.env.PROJECT_ID; + const queueName = `tasksOnTaskDispatchedTests${testId}`; + + const serviceAccountPath = process.env.GOOGLE_APPLICATION_CREDENTIALS; + + if (!testId || !projectId || !region) { + throw new Error("Environment configured incorrectly."); + } + + if (!serviceAccountPath) { + console.warn("GOOGLE_APPLICATION_CREDENTIALS not set, skipping Tasks tests"); + describe.skip("Cloud Tasks (v2)", () => { + it("skipped due to missing credentials", () => { + expect(true).toBe(true); 
+ }); + }); + return; + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("tasksOnTaskDispatchedTests").doc(testId).delete(); + }); + + describe("onDispatch trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + + beforeAll(async () => { + const url = `https://${region}-${projectId}.cloudfunctions.net/tasksOnTaskDispatchedTests${testId}`; + await createTask(projectId, queueName, region, url, { data: { testId } }); + + loggedContext = await retry(() => + admin + .firestore() + .collection("tasksOnTaskDispatchedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + }); + + it("should have correct event id", () => { + expect(loggedContext?.id).toBeDefined(); + }); + }); +}); diff --git a/integration_test/tests/v2/testLab.test.ts b/integration_test/tests/v2/testLab.test.ts new file mode 100644 index 0000000..5894cc2 --- /dev/null +++ b/integration_test/tests/v2/testLab.test.ts @@ -0,0 +1,65 @@ +import * as admin from "firebase-admin"; +import { retry, startTestRun } from "../utils"; +import { initializeFirebase } from "../firebaseSetup"; + +describe.skip("TestLab (v2)", () => { + const projectId = process.env.PROJECT_ID; + const testId = process.env.TEST_RUN_ID; + + if (!testId || !projectId) { + throw new Error("Environment configured incorrectly."); + } + + beforeAll(() => { + initializeFirebase(); + }); + + afterAll(async () => { + await admin.firestore().collection("testLabOnTestMatrixCompletedTests").doc(testId).delete(); + }); + + describe("test matrix onComplete trigger", () => { + let loggedContext: admin.firestore.DocumentData | undefined; + let shouldSkip = false; + + beforeAll(async () => { + try { + const accessToken = await admin.credential.applicationDefault().getAccessToken(); + await startTestRun(projectId, testId, accessToken.access_token); + + loggedContext = await retry(() => + admin + .firestore() + 
.collection("testLabOnTestMatrixCompletedTests") + .doc(testId) + .get() + .then((logSnapshot) => logSnapshot.data()) + ); + } catch (error) { + console.warn("TestLab API access failed, skipping test:", (error as Error).message); + shouldSkip = true; + } + }); + + it("should have event id", () => { + if (shouldSkip) { + return; + } + expect(loggedContext?.id).toBeDefined(); + }); + + it("should have right event type", () => { + if (shouldSkip) { + return; + } + expect(loggedContext?.type).toEqual("google.firebase.testlab.testMatrix.v1.completed"); + }); + + it("should be in state 'INVALID'", () => { + if (shouldSkip) { + return; + } + expect(loggedContext?.state).toEqual("INVALID"); + }); + }); +}); diff --git a/integration_test/tsconfig.json b/integration_test/tsconfig.json new file mode 100644 index 0000000..38bd854 --- /dev/null +++ b/integration_test/tsconfig.json @@ -0,0 +1,16 @@ +{ + "compilerOptions": { + "target": "ES2020", + "module": "ES2020", + "outDir": "./dist", + "strict": true, + "esModuleInterop": true, + "skipLibCheck": true, + "forceConsistentCasingInFileNames": true, + "moduleResolution": "node", + "types": ["jest", "node"], + "typeRoots": ["./node_modules/@types"] + }, + "include": ["**/*.ts"], + "exclude": ["node_modules", "functions/*", "generated/*"] +} diff --git a/integration_test/tsconfig.test.json b/integration_test/tsconfig.test.json new file mode 100644 index 0000000..82137c5 --- /dev/null +++ b/integration_test/tsconfig.test.json @@ -0,0 +1,11 @@ +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "module": "ES2020", + "moduleResolution": "Bundler", + "resolveJsonModule": true, + "types": ["jest", "node"] + }, + "include": ["**/*.ts"], + "exclude": ["node_modules", "functions/*", "generated/*"] +} \ No newline at end of file diff --git a/pyproject.toml b/pyproject.toml index dfd8af0..6e91986 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -55,7 +55,9 @@ dev = [ [tool.setuptools] package-dir = {"" = "src"} -packages = 
["firebase_functions"] + +[tool.setuptools.packages.find] +where = ["src"] [tool.setuptools.package-data] firebase_functions = ["py.typed"] diff --git a/scripts/pack-for-integration-tests.sh b/scripts/pack-for-integration-tests.sh new file mode 100755 index 0000000..5b25ec6 --- /dev/null +++ b/scripts/pack-for-integration-tests.sh @@ -0,0 +1,31 @@ +#!/usr/bin/env bash + +# Script to build Python SDK and prepare it for integration tests +# This is the Python equivalent of the TypeScript SDK's pack-for-integration-tests command + +set -e # Exit on error + +echo "Building firebase-functions Python SDK from source..." + +# Clean any previous builds +rm -rf dist/ +rm -f integration_test/firebase-functions-python-local.whl + +# Build the package using uv +echo "Building wheel package..." +uv build + +# Find the built wheel file +WHEEL_FILE=$(ls dist/*.whl 2>/dev/null | head -n 1) + +if [ -z "$WHEEL_FILE" ]; then + echo "Error: No wheel file found in dist/ directory" + exit 1 +fi + +# Copy wheel to integration test directory +echo "Copying wheel to integration_test directory..." +cp "$WHEEL_FILE" integration_test/firebase-functions-python-local.whl + +echo "SDK built and packed successfully!" +echo "Wheel file: integration_test/firebase-functions-python-local.whl" \ No newline at end of file