feat(cf-auto-docs): NixOS module documentation API via CloudFlare #5
Closed
Changes from all commits (106 commits)
Commits (all authored by Bad3r):

- ed859e5 feat(home-manager): add Claude Code settings management
- bd17618 fix: remove unused lambda patterns and trailing whitespace
- a043ab9 fix: use underscore for empty pattern
- adc4816 style: apply nixfmt formatting
- ceedb9c chore: update nix-logseq-git-flake lock hash
- c3457dc feat(home-manager): refactor claude-code module with comprehensive se…
- 850b65d feat(cf-auto-docs): implement simplified NixOS module documentation A…
- 6df7299 fix(wrangler): update configuration to current best practices
- 4cf517a feat(cf-auto-docs): implement Nix module extraction pipeline
- 6e6eaa7 docs: add GitHub Actions secrets setup guide
- d9c791a fix(ci): remove npm cache dependency to fix workflow
- 8ad763a feat(worker): configure D1 databases and update compatibility date
- 48fda08 fix(ci): use correct D1 database name for staging environment
- 4684538 Merge branch 'main' into feat/cf-auto-docs-api
- 30b328d fix(worker): add tsconfig.json and remove build step
- 064bfd4 fix(worker): remove assets configuration to fix deployment
- a63e2ae fix(worker): simplify wrangler config for minimal D1-only deployment
- aeceec7 fix(ci): add outputs to deploy-worker job for Worker URL
- 033d326 fix(ci): extract Worker URL from wrangler deploy output
- 358fd9b fix(ci): update regex to match full Worker URL with subdomain
- 25bc9e7 fix(upload): handle compressed API responses properly
- a4bcf8b fix(ci): add upload script to workflow trigger paths
- 5b7d3b0 fix(ci): run D1 migrations on remote database
- 9a69e8e fix(ci): make database migrations non-blocking
- 73c83cd fix(worker): make cache operations optional for MVP
- 59ce615 fix(upload): accept HTTP 207 Multi-Status as success
- db51554 fix(ci): require D1 migrations and add compression support
- 2fbbdd3 chore: add trailing newlines to claude workflow files
- 5947c8f fix(ci): use temp file for compressed response handling
- 54ef6aa fix(ci): use gunzip for gzip decompression
- 55aad51 fix(worker): route all requests through Hono and add /docs endpoint
- 28a603c fix: remove gzip compression for better usability
- 7d06bfe fix(migrations): fix FTS5 schema to enable search functionality
- afde7e0 Merge remote-tracking branch 'origin/main' into feat/cf-auto-docs-api
- 315ddb2 fix(worker): resolve critical compilation and security issues
- f0333ba feat(worker): implement atomic database transactions for batch operat…
- 3b19513 feat(db): add composite and covering indexes for query optimization
- 21027e6 feat(extraction): implement full repository module extraction with ch…
- b5336ff fix(extraction): revert to simple extractor due to nix-instantiate li…
- 5fb4cf6 fix(upload): redirect logs to stderr in upload_chunk_with_retry
- fff17cb fix(handlers): guard all KV/R2 binding access with existence checks
- 1077ca9 feat(config): add KV and R2 bindings to wrangler.jsonc
- 482aabb feat(config): add Analytics Engine binding for observability
- d4e067d feat(semantic-search): implement Phase 1 foundation
- 4a787ac feat(semantic-search): implement Phase 2 embedding pipeline
- 19ed20e feat(semantic-search): implement Phase 3 hybrid search handler
- 84537dd fix(semantic-search): add AI binding to staging environment
- 3b64514 feat(ai-search): complete Cloudflare AI Search integration with authe…
- db4e095 fix(ci): properly handle database migration errors in GitHub workflows
- 621e324 fix(migrations): remove obsolete FTS column updates from seed data
- 591a43e fix(ci): improve worker deployment error visibility
- 8eb68ff fix(batch-update): migrate from Vectorize to AI Search
- 6ea6ff7 fix(ci): use correct MODULE_API_KEY secret name
- d15a6c5 fix(ci): set Worker secrets after deployment
- 67e8907 feat(ai-search): add ingestion workflow and test script
- 3c16102 feat(ai-search): add ingestion trigger script
- 0e9274a docs(ai-search): add comprehensive setup and testing guide
- c612332 feat(secrets): centralize cloudflare credentials
- 47c1620 build(worker): bump wrangler to ^3.84.0
- 43473a5 chore(managed): refresh generated files
- d7f0096 chore: apply treefmt exclusions and secret fixes
- 6fb87b1 fix(extract): ensure lib helpers are always available
- a44ef1b chore(ci): bump action pins and nix 2.32
- 782bca1 feat(base): add openssl utility
- f76bb96 chore: update workflows
- d8dbf0d feat: centralize MCP server catalog (#9)
- 9a936e7 feat(ai-search): add ingestion workflow and test script
- 4747bbc chore: update workflows
- 4f31796 fix(ci): use cached module extraction outputs
- 9ef5722 fix(extract): remove rebase leftovers
- 7b76c63 fix(ci): place module artifact in cache dir
- c8c6a28 chore(extract): log chunk upload failures
- c8a721b chore(extract): surface curl failures
- 0e5d31f fix(extract): send chunk payload via temp file
- b845b7e fix(extract): default to 10-module upload batches
- c4ce826 fix(extract): drop null strings when building payload
- 6ce0478 fix(extract): correct jq option helper
- 91c1515 feat(ai-search): publish module docs to r2 for indexing
- 28e33a0 feat(ai-search): generate markdown snapshots and upload to r2
- 580b3f2 fix(extract): seed flake owner metadata during eval
- f212213 docs: update extractor configs
- c0b2388 fix(extract): avoid duplicate flake assignment
- 1d8d318 fix(extract): supply fallback inputs when scanning modules
- cb75709 fix(extract): guard aggregator detection errors
- dd694dc fix(extractor): skip unresolved imports and clean payload
- 8f6d45f revert: undo extractor changes
- 27b747e docs: add module extractor postmortem
- 84d3b15 feat(module-docs): add derivation-backed exporter
- 3e6d44c docs(module-docs): refresh guidance and tooling
- 251565d ci(module-docs): switch workflow to new exporter
- 937d8fe fix(ci): build exporter via per-system attr
- 2ee2374 fix(module-docs): align derivations with data args
- 4818eaf fix(module-docs): derive metadata inputs from self
- 1576183 fix(module-docs): allow derivation nixpkgs inputs override
- 564d6c3 fix(module-docs): normalize attrPath string assembly
- e582d4e fix(module-docs): coerce attr path lists
- 277e5b6 fix(module-docs): ensure attr path normalization handles strings
- a39d64f fix(module-docs): harden attr path normalization
- 26f9288 fix(module-docs): stub local logseq input for ci
- 7825759 chore(module-docs): stub logseq input in ci workflow
- 22bb902 feat(module-docs): add upload wrapper script
- 37db34d chore: sync docs with main
- 3af3ac1 chore: rebase on main
- 5320d1b Merge branch 'main' into feat/cf-auto-docs-api
- f22f7c3 Merge branch 'main' into feat/cf-auto-docs-api
- 9786110 feat(module-docs): scope extracted options to module sources
`.github/workflows/deploy-module-docs.yml` (new file, +328 lines):

```yaml
name: Deploy Module Documentation

on:
  push:
    branches:
      - main
      - feat/cf-auto-docs-api
    paths:
      - "modules/**"
      - "implementation/**"
      - "scripts/extract-*.nix"
      - "scripts/module-docs-upload.sh"
      - ".github/workflows/deploy-module-docs.yml"
  workflow_dispatch:
    inputs:
      environment:
        description: "Deployment environment"
        required: true
        default: "staging"
        type: choice
        options:
          - staging
          - production

env:
  NODE_VERSION: "20"

jobs:
  extract-modules:
    name: Extract NixOS Modules
    runs-on: ubuntu-latest
    outputs:
      module-count: ${{ steps.extract.outputs.module-count }}
      namespace-count: ${{ steps.extract.outputs.namespace-count }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v5.0.0

      - name: Install Nix
        uses: cachix/install-nix-action@v31
        with:
          github_access_token: ${{ secrets.GITHUB_TOKEN }}
          nix_path: nixpkgs=channel:nixos-unstable
          install_url: https://releases.nixos.org/nix/nix-2.32.0/install

      - name: Setup Nix cache
        uses: cachix/cachix-action@v16
        with:
          name: nix-community
          skipPush: true

      - name: Extract modules
        id: extract
        timeout-minutes: 20 # Increased for full repository extraction (431+ modules)
        run: |
          set -euo pipefail
          export MODULE_DOCS_NIX_FLAGS='--override-input nix-logseq-git-flake path:./stubs/nix-logseq-git-flake'
          echo "Extracting ALL Nix modules via derivation-backed pipeline..."
          ./scripts/module-docs-upload.sh --format json,md --out .cache/module-docs --dry-run --summary

          # Output statistics
          CACHE_DIR=".cache/module-docs"
          MODULE_JSON="$CACHE_DIR/json/modules.json"

          if [ -f "$MODULE_JSON" ]; then
            MODULE_COUNT=$(jq -r '.metadata.moduleCount' "$MODULE_JSON")
            NAMESPACE_COUNT=$(jq -r '.namespaces | keys | length' "$MODULE_JSON")
            echo "module-count=$MODULE_COUNT" >> $GITHUB_OUTPUT
            echo "namespace-count=$NAMESPACE_COUNT" >> $GITHUB_OUTPUT
            echo "✅ Extracted $MODULE_COUNT modules across $NAMESPACE_COUNT namespaces"
          else
            echo "❌ Failed to extract modules"
            exit 1
          fi

      - name: Upload extracted modules
        uses: actions/upload-artifact@v4.6.2
        with:
          name: extracted-modules
          path: |
            .cache/module-docs/json/modules.json
            .cache/module-docs/json/errors.ndjson
            .cache/module-docs/md/modules.md

  deploy-worker:
    name: Deploy Cloudflare Worker
    needs: extract-modules
    runs-on: ubuntu-latest
    environment: ${{ github.event.inputs.environment || 'staging' }}
    outputs:
      worker-url: ${{ steps.worker-url.outputs.worker-url }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v5.0.0

      - name: Setup Node.js
        uses: actions/setup-node@v5.0.0
        with:
          node-version: ${{ env.NODE_VERSION }}

      - name: Install dependencies
        working-directory: implementation/worker
        run: |
          if [ -f package-lock.json ]; then
            npm ci
          else
            npm install
          fi

      - name: Run database migrations
        working-directory: implementation/worker
        env:
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
        run: |
          set -euo pipefail
          echo "Running database migrations..."

          # Use the correct database name based on environment
          if [ "${{ github.event.inputs.environment || 'staging' }}" = "staging" ]; then
            DB_NAME="nixos-modules-db-staging"
          else
            DB_NAME="nixos-modules-db"
          fi

          # Run all migrations in order
          for migration in migrations/*.sql; do
            echo "Running migration: $migration"

            # Try to run the migration
            if npx wrangler d1 execute "$DB_NAME" \
              --file="$migration" \
              --remote \
              --env=${{ github.event.inputs.environment || 'staging' }} 2>&1 | tee migration.log; then
              echo "✅ Migration $(basename $migration) applied successfully"
            else
              EXIT_CODE=$?
              # Check if it's an "already exists" error
              if grep -qi "already exists\|duplicate\|unique constraint" migration.log; then
                echo "ℹ️ Migration $(basename $migration) was already applied (table/index exists)"
              else
                echo "❌ Migration $(basename $migration) failed with error:"
                cat migration.log
                exit $EXIT_CODE
              fi
            fi
          done

          echo "✅ All migrations completed successfully"

      - name: Deploy Worker
        id: worker-url
        working-directory: implementation/worker
        env:
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
        run: |
          set -euo pipefail
          echo "Deploying Worker to ${{ github.event.inputs.environment || 'staging' }}..."

          # Deploy with proper error handling
          if ! npx wrangler deploy --env=${{ github.event.inputs.environment || 'staging' }} 2>&1 | tee deploy.log; then
            echo "❌ Worker deployment failed:"
            cat deploy.log
            exit 1
          fi

          # Extract Worker URL from deployment output
          WORKER_URL=$(grep -oP 'https://[a-zA-Z0-9-]+\.[a-zA-Z0-9-]+\.workers\.dev' deploy.log | head -1)

          if [ -z "$WORKER_URL" ]; then
            echo "❌ Failed to extract Worker URL from deployment output"
            echo "Deployment log:"
            cat deploy.log
            exit 1
          fi

          echo "worker-url=$WORKER_URL" >> $GITHUB_OUTPUT
          echo "✅ Worker deployed to: $WORKER_URL"

      - name: Set Worker Secrets
        working-directory: implementation/worker
        env:
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
        run: |
          set -euo pipefail
          echo "Setting Worker secrets..."

          # Set AI_GATEWAY_TOKEN secret
          if [ -n "${{ secrets.AI_GATEWAY_TOKEN }}" ]; then
            echo "Setting AI_GATEWAY_TOKEN secret..."
            echo "${{ secrets.AI_GATEWAY_TOKEN }}" | npx wrangler secret put AI_GATEWAY_TOKEN --env=${{ github.event.inputs.environment || 'staging' }}
            echo "✅ AI_GATEWAY_TOKEN set successfully"
          else
            echo "⚠️ AI_GATEWAY_TOKEN secret not found in GitHub Secrets"
          fi

          # Set API_KEY secret
          if [ -n "${{ secrets.MODULE_API_KEY }}" ]; then
            echo "Setting API_KEY secret..."
            echo "${{ secrets.MODULE_API_KEY }}" | npx wrangler secret put API_KEY --env=${{ github.event.inputs.environment || 'staging' }}
            echo "✅ API_KEY set successfully"
          else
            echo "⚠️ MODULE_API_KEY secret not found in GitHub Secrets"
          fi

  upload-data:
    name: Upload Module Data
    needs: [extract-modules, deploy-worker]
    runs-on: ubuntu-latest
    environment: ${{ github.event.inputs.environment || 'staging' }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v5.0.0

      - name: Download extracted modules
        uses: actions/download-artifact@v5.0.0
        with:
          name: extracted-modules
          path: .cache/module-docs

      - name: Upload to Worker API
        timeout-minutes: 15 # Increased for chunked upload of 431+ modules (~11 batches)
        env:
          WORKER_ENDPOINT: ${{ needs.deploy-worker.outputs.worker-url }}
          API_KEY: ${{ secrets.MODULE_API_KEY }}
        run: |
          echo "Uploading module data to API (chunked batches)..."
          ./scripts/extract-and-upload.sh --upload-only \
            --endpoint "$WORKER_ENDPOINT" \
            --api-key "$API_KEY"

      - name: Verify deployment
        env:
          WORKER_ENDPOINT: ${{ needs.deploy-worker.outputs.worker-url }}
        run: |
          echo "Verifying deployment..."

          # Check health endpoint
          curl -sf "$WORKER_ENDPOINT/health" || exit 1

          # Check stats endpoint
          STATS=$(curl -s "$WORKER_ENDPOINT/api/stats")

          echo "API Statistics:"
          echo "$STATS" | jq .

          # Verify module count matches
          UPLOADED_COUNT=$(echo "$STATS" | jq -r '.stats.total_modules')
          EXPECTED_COUNT=${{ needs.extract-modules.outputs.module-count }}

          if [ "$UPLOADED_COUNT" -eq "$EXPECTED_COUNT" ]; then
            echo "✅ Successfully uploaded all $UPLOADED_COUNT modules"
          else
            echo "⚠️ Module count mismatch: uploaded=$UPLOADED_COUNT, expected=$EXPECTED_COUNT"
          fi

  build-frontend:
    name: Build and Deploy Frontend
    needs: [deploy-worker, upload-data]
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v5.0.0

      - name: Setup Node.js
        uses: actions/setup-node@v5.0.0
        with:
          node-version: ${{ env.NODE_VERSION }}

      - name: Build frontend
        working-directory: implementation/frontend
        run: |
          echo "Building frontend..."
          # Install dependencies if package.json exists
          if [ -f package.json ]; then
            npm ci
            npm run build
          else
            echo "No frontend build configured yet"
            mkdir -p dist
            echo "<html><body><h1>NixOS Module Documentation</h1><p>Frontend coming soon...</p></body></html>" > dist/index.html
          fi

      - name: Deploy frontend to Worker
        working-directory: implementation/frontend
        env:
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
        run: |
          echo "Deploying frontend assets..."
          # This would upload to R2 or use Workers Static Assets
          # For now, we'll skip this as the frontend is not implemented

  summary:
    name: Deployment Summary
    needs: [extract-modules, deploy-worker, upload-data]
    runs-on: ubuntu-latest
    if: always()
    steps:
      - name: Generate summary
        run: |
          cat << EOF >> $GITHUB_STEP_SUMMARY
          # 📚 NixOS Module Documentation Deployment

          ## Deployment Status
          - **Environment**: ${{ github.event.inputs.environment || 'staging' }}
          - **Branch**: ${{ github.ref_name }}
          - **Commit**: ${{ github.sha }}

          ## Module Extraction
          - **Modules Extracted**: ${{ needs.extract-modules.outputs.module-count || 'N/A' }}
          - **Namespaces**: ${{ needs.extract-modules.outputs.namespace-count || 'N/A' }}

          ## API Deployment
          - **Worker URL**: ${{ needs.deploy-worker.outputs.worker-url || 'Deployment failed' }}
          - **Health Check**: [Check Status](${{ needs.deploy-worker.outputs.worker-url }}/health)
          - **API Stats**: [View Statistics](${{ needs.deploy-worker.outputs.worker-url }}/api/stats)

          ## Next Steps
          1. Visit the [API documentation](${{ needs.deploy-worker.outputs.worker-url }}/docs)
          2. Test the search endpoint: \`/api/modules/search?q=networking\`
          3. Browse modules by namespace: \`/api/modules?namespace=services\`

          ---
          *Deployed at $(date -u '+%Y-%m-%d %H:%M:%S UTC')*
          EOF
```
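The "Deploy Worker" step parses the deployed URL out of wrangler's log with a GNU `grep -P` pattern. A minimal sketch of that extraction run against a fabricated log line (the hostname `nixos-modules-worker-staging.example-account.workers.dev` is a made-up placeholder, not real wrangler output; `grep -oP` assumes GNU grep with PCRE support, as on the `ubuntu-latest` runners):

```shell
# Fabricated stand-in for wrangler's deploy output.
sample_log='Deployed nixos-modules-worker (1.23 sec)
  https://nixos-modules-worker-staging.example-account.workers.dev'

# Same pattern as the workflow: scheme, two hyphenated labels, workers.dev suffix.
WORKER_URL=$(printf '%s\n' "$sample_log" \
  | grep -oP 'https://[a-zA-Z0-9-]+\.[a-zA-Z0-9-]+\.workers\.dev' \
  | head -1)

echo "$WORKER_URL"
# → https://nixos-modules-worker-staging.example-account.workers.dev
```

Note the pattern requires exactly two labels before `.workers.dev` (worker name, then account subdomain), which is why the earlier single-label regex from commit 033d326 had to be widened in 358fd9b.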
The second changed file updates the `excludes` list (likely the treefmt configuration, per commit d7f0096). The hunk header `@@ -45,6 +45,6 @@ excludes = [` indicates one line replaced; the flattened diff shows a duplicate `"inputs/*"` entry giving way to `"secrets/**"`:

```diff
@@ -45,6 +45,6 @@ excludes = [
   ".hgignore",
   ".svnignore",
   "inputs/*",
-  "inputs/*",
+  "secrets/**",
 ]
 on-unmatched = "warn"
```
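The migration step in the workflow file above tolerates re-runs by treating "already exists" failures as success. A small sketch of that guard in isolation (the log contents and the `is_already_applied` helper name are illustrative, not part of the actual scripts):

```shell
# Guard used by the "Run database migrations" step: a failed wrangler d1
# execute is downgraded to a skip when the captured log shows the object
# was already created by an earlier run.
is_already_applied() {
  grep -qi "already exists\|duplicate\|unique constraint" "$1"
}

# Fabricated failure log, as D1/SQLite might report on a re-run.
echo 'Error: table "modules" already exists' > migration.log

if is_already_applied migration.log; then
  echo "skip"   # idempotent re-run: not an error
else
  echo "fail"   # genuine migration failure
fi
# → skip
```

Because the match is case-insensitive and covers `duplicate` and `unique constraint` as well, the same guard absorbs index and constraint re-creation errors, at the cost of masking any real failure whose message happens to contain those phrases.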