Commit 99df2ac
Enhance NotebookLM integration and frontend video ingestion pipeline (#51)
* feat: Initialize PGLite v17 database data files for the dataconnect project.
* feat: enable automatic outline generation for Gemini Code Assist in VS Code settings.
* feat: Add NotebookLM integration with a new processor and `analyze_video_with_notebooklm` MCP tool.
* feat: Add NotebookLM profile data and an ingestion test.
* chore: Update and add generated browser profile files for notebooklm development.
* Update `notebooklm_chrome_profile` internal state and add architectural context documentation and video asset.
* feat: Add various knowledge prototypes for MCP servers and universal automation, archive numerous scripts and documentation, and update local browser profile data.
* chore: Add generated browser profile cache and data for notebooklm.
* Update notebooklm Chrome profile preferences, cache, and session data.
* feat: Update NotebookLM Chrome profile with new cache, preferences, and service worker data.
* feat: Add generated Chrome profile cache and code cache files and update associated profile data.
* Update `notebooklm` Chrome profile cache, code cache, GPU cache, and safe browsing data.
* chore(deps): bump the npm_and_yarn group across 4 directories with 5 updates
Bumps the npm_and_yarn group with 3 updates in the / directory: [ajv](https://github.com/ajv-validator/ajv), [hono](https://github.com/honojs/hono) and [qs](https://github.com/ljharb/qs).
Bumps the npm_and_yarn group with 3 updates in the /docs/knowledge_prototypes/mcp-servers/fetch-mcp directory: [@modelcontextprotocol/sdk](https://github.com/modelcontextprotocol/typescript-sdk), [ajv](https://github.com/ajv-validator/ajv) and [hono](https://github.com/honojs/hono).
Bumps the npm_and_yarn group with 1 update in the /scripts/archive/software-on-demand directory: [ajv](https://github.com/ajv-validator/ajv).
Bumps the npm_and_yarn group with 2 updates in the /scripts/archive/supabase_cleanup directory: [next](https://github.com/vercel/next.js) and [qs](https://github.com/ljharb/qs).
Updates `ajv` from 8.17.1 to 8.18.0
- [Release notes](https://github.com/ajv-validator/ajv/releases)
- [Commits](ajv-validator/ajv@v8.17.1...v8.18.0)
Updates `hono` from 4.11.7 to 4.12.1
- [Release notes](https://github.com/honojs/hono/releases)
- [Commits](honojs/hono@v4.11.7...v4.12.1)
Updates `qs` from 6.14.1 to 6.15.0
- [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md)
- [Commits](ljharb/qs@v6.14.1...v6.15.0)
Updates `@modelcontextprotocol/sdk` from 1.25.2 to 1.26.0
- [Release notes](https://github.com/modelcontextprotocol/typescript-sdk/releases)
- [Commits](modelcontextprotocol/typescript-sdk@v1.25.2...v1.26.0)
Updates `ajv` from 8.17.1 to 8.18.0
- [Release notes](https://github.com/ajv-validator/ajv/releases)
- [Commits](ajv-validator/ajv@v8.17.1...v8.18.0)
Updates `hono` from 4.11.5 to 4.12.1
- [Release notes](https://github.com/honojs/hono/releases)
- [Commits](honojs/hono@v4.11.5...v4.12.1)
Updates `qs` from 6.14.1 to 6.15.0
- [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md)
- [Commits](ljharb/qs@v6.14.1...v6.15.0)
Updates `ajv` from 8.17.1 to 8.18.0
- [Release notes](https://github.com/ajv-validator/ajv/releases)
- [Commits](ajv-validator/ajv@v8.17.1...v8.18.0)
Updates `next` from 15.4.10 to 15.5.10
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](vercel/next.js@v15.4.10...v15.5.10)
Updates `qs` from 6.14.1 to 6.15.0
- [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md)
- [Commits](ljharb/qs@v6.14.1...v6.15.0)
---
updated-dependencies:
- dependency-name: ajv
dependency-version: 8.18.0
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: hono
dependency-version: 4.12.1
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: qs
dependency-version: 6.15.0
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: "@modelcontextprotocol/sdk"
dependency-version: 1.26.0
dependency-type: direct:production
dependency-group: npm_and_yarn
- dependency-name: ajv
dependency-version: 8.18.0
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: hono
dependency-version: 4.12.1
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: qs
dependency-version: 6.15.0
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: ajv
dependency-version: 8.18.0
dependency-type: direct:production
dependency-group: npm_and_yarn
- dependency-name: next
dependency-version: 15.5.10
dependency-type: direct:production
dependency-group: npm_and_yarn
- dependency-name: qs
dependency-version: 6.15.0
dependency-type: indirect
dependency-group: npm_and_yarn
...
Signed-off-by: dependabot[bot] <support@github.com>
* chore(deps): bump minimatch
Bumps the npm_and_yarn group with 1 update in the /scripts/archive/supabase_cleanup directory: [minimatch](https://github.com/isaacs/minimatch).
Updates `minimatch` from 3.1.2 to 3.1.4
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](isaacs/minimatch@v3.1.2...v3.1.4)
---
updated-dependencies:
- dependency-name: minimatch
dependency-version: 3.1.4
dependency-type: indirect
dependency-group: npm_and_yarn
...
Signed-off-by: dependabot[bot] <support@github.com>
* chore(deps): bump the npm_and_yarn group across 2 directories with 1 update
Bumps the npm_and_yarn group with 1 update in the / directory: [hono](https://github.com/honojs/hono).
Bumps the npm_and_yarn group with 1 update in the /docs/knowledge_prototypes/mcp-servers/fetch-mcp directory: [hono](https://github.com/honojs/hono).
Updates `hono` from 4.12.1 to 4.12.2
- [Release notes](https://github.com/honojs/hono/releases)
- [Commits](honojs/hono@v4.12.1...v4.12.2)
Updates `hono` from 4.12.1 to 4.12.2
- [Release notes](https://github.com/honojs/hono/releases)
- [Commits](honojs/hono@v4.12.1...v4.12.2)
---
updated-dependencies:
- dependency-name: hono
dependency-version: 4.12.2
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: hono
dependency-version: 4.12.2
dependency-type: indirect
dependency-group: npm_and_yarn
...
Signed-off-by: dependabot[bot] <support@github.com>
* feat: enable frontend-only video ingestion pipeline for Vercel deployment
The core pipeline previously required the Python backend to be running.
When deployed to Vercel (https://v0-uvai.vercel.app/), the backend is
unavailable, causing all video analysis to fail immediately.
Changes:
- /api/video: Falls back to frontend-only pipeline (transcribe + extract)
when the Python backend is unreachable, with 15s timeout
- /api/transcribe: Adds Gemini fallback when OpenAI is unavailable, plus
8s timeout on backend probe to avoid hanging on Vercel
- layout.tsx: Loads Google Fonts via <link> instead of next/font/google
to avoid build failures in offline/sandboxed CI environments
- page.tsx: Replaces example URLs with technical content (3Blue1Brown
neural networks, Karpathy LLM intro) instead of rick roll / zoo videos
- gemini_service.py: Gates the Vertex AI import behind the GOOGLE_CLOUD_PROJECT
env var to prevent 30s+ hangs on the GCE metadata probe
- agent_gap_analyzer.py: Fixes f-string backslash syntax errors (Python 3.11)
https://claude.ai/code/session_015Pd3a6hinTenCNrPRGiZqE
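The 15s and 8s budgets above are AbortController timeouts around fetch; a later commit in this PR moves the clearTimeout into a finally block so the timer is released even when the request aborts or throws. A minimal sketch of that pattern (fetchWithTimeout is an illustrative name, not a function from the codebase):

```typescript
// Fetch with a hard timeout. clearTimeout runs in finally so the timer
// cannot leak on success, abort, or error. Illustrative sketch only.
async function fetchWithTimeout(
  url: string,
  ms: number,
  init: RequestInit = {},
): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    // the signal ties this request's lifetime to the timer above
    return await fetch(url, { ...init, signal: controller.signal });
  } finally {
    clearTimeout(timer); // always release the timer
  }
}

// e.g. probe the Python backend with the 15s budget described above:
// const res = await fetchWithTimeout(`${backendUrl}/api/v1/health`, 15_000);
```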
* Potential fix for code scanning alert no. 4518: Server-side request forgery
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
* Initial plan
* Potential fix for code scanning alert no. 4517: Server-side request forgery
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
* Initial plan
* Fix review feedback: timeout cleanup, transcript_segments shape, ENABLE_VERTEX_AI boolean parsing
Co-authored-by: groupthinking <154503486+groupthinking@users.noreply.github.com>
* fix: clearTimeout in finally blocks, transcript_segments shape, ENABLE_VERTEX_AI boolean parsing
Co-authored-by: groupthinking <154503486+groupthinking@users.noreply.github.com>
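The ENABLE_VERTEX_AI parsing fix mentioned in both commits above amounts to treating only explicit truthy strings as true, rather than any non-empty value. A hedged sketch (the accepted value set is an assumption, not taken from the codebase):

```typescript
// Parse a boolean-ish env var: only "1", "true", or "yes" (any case)
// count as true; unset, empty, "0", and "false" are all false.
// Illustrative sketch of the ENABLE_VERTEX_AI parsing fix.
function parseBooleanEnv(value: string | undefined): boolean {
  return ["1", "true", "yes"].includes((value ?? "").trim().toLowerCase());
}
```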
* Update src/youtube_extension/services/ai/gemini_service.py
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
* Update apps/web/src/app/api/video/route.ts
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
* Update apps/web/src/app/api/video/route.ts
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
* Initial plan
* Initial plan
* Fix: move clearTimeout into .finally() to prevent timer leaks on fetch abort/error
Co-authored-by: groupthinking <154503486+groupthinking@users.noreply.github.com>
* Fix clearTimeout not called in finally blocks for AbortController timeouts
Co-authored-by: groupthinking <154503486+groupthinking@users.noreply.github.com>
* Fix: relative URLs in server-side fetch calls fail in production
fetch('/api/transcribe') and fetch('/api/extract-events') use relative URLs, which don't resolve correctly in server-side Next.js code on production deployments like Vercel.
This commit fixes the issue reported at apps/web/src/app/api/video/route.ts:101
## Bug Analysis
**Why it happens:**
In Next.js API routes running on the server (Node.js runtime), the `fetch()` API requires absolute URLs. Unlike browsers which have an implicit base URL (the current origin), server-side code has no context for resolving relative URLs like `/api/transcribe`. The Node.js fetch implementation will fail to resolve these relative paths, resulting in TypeError or connection errors.
**When it manifests:**
- **Development (localhost:3000)**: Works accidentally because the request URL contains the host
- **Production (Vercel)**: Fails because the relative URL cannot be resolved to a valid absolute URL without proper host context
**What impact it has:**
The frontend-only pipeline fallback (Strategy 2) in lines 101-132 is completely broken in production. When the backend is unavailable (common on Vercel), the code attempts to use `/api/transcribe` and `/api/extract-events` serverless functions but fails due to unresolvable relative URLs. This causes the entire video analysis endpoint to fail when the backend is unavailable.
## Fix Explanation
**Changes made:**
1. Added a `getBaseUrl(request: Request)` helper function that extracts the absolute base URL from the incoming request object using `new URL(request.url)`
2. Updated line 108: `fetch('/api/transcribe', ...)` → `fetch(`${baseUrl}/api/transcribe`, ...)`
3. Updated line 127: `fetch('/api/extract-events', ...)` → `fetch(`${baseUrl}/api/extract-events`, ...)`
**Why it solves the issue:**
- The incoming `request` object contains the full URL including protocol and host
- By constructing an absolute URL from the request, we ensure the fetch calls work in both development and production
- This approach is more reliable than environment variables because it uses the actual request context, handling reverse proxies and different deployment configurations correctly
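The helper described above might look like the following; this is a minimal sketch built from the description, and the actual function in route.ts may differ:

```typescript
// Derive an absolute base URL (protocol + host) from the incoming request
// so server-side fetch() calls can target sibling API routes. Sketch of
// the getBaseUrl helper described above.
function getBaseUrl(request: { url: string }): string {
  // request.url is absolute on the server, so its origin is the base URL
  return new URL(request.url).origin;
}

// Usage inside the route handler:
// const baseUrl = getBaseUrl(request);
// const res = await fetch(`${baseUrl}/api/transcribe`, { method: "POST" });
```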
Co-authored-by: Vercel <vercel[bot]@users.noreply.github.com>
Co-authored-by: groupthinking <garveyht@gmail.com>
* Initial plan
* chore(deps): bump the npm_and_yarn group across 1 directory with 1 update
Bumps the npm_and_yarn group with 1 update in the /docs/knowledge_prototypes/mcp-servers/fetch-mcp directory: [minimatch](https://github.com/isaacs/minimatch).
Updates `minimatch` from 3.1.2 to 3.1.5
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](isaacs/minimatch@v3.1.2...v3.1.5)
Updates `minimatch` from 5.1.6 to 5.1.9
- [Changelog](https://github.com/isaacs/minimatch/blob/main/changelog.md)
- [Commits](isaacs/minimatch@v5.1.6...v5.1.9)
---
updated-dependencies:
- dependency-name: minimatch
dependency-version: 3.1.5
dependency-type: indirect
dependency-group: npm_and_yarn
- dependency-name: minimatch
dependency-version: 5.1.9
dependency-type: indirect
dependency-group: npm_and_yarn
...
Signed-off-by: dependabot[bot] <support@github.com>
* fix: validate BACKEND_URL before using it
Skip backend calls entirely when BACKEND_URL is not configured or
contains an invalid value (like a literal ${...} template string).
This prevents URL parse errors on Vercel where the env var may not
be set.
https://claude.ai/code/session_015Pd3a6hinTenCNrPRGiZqE
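The validation described above needs to reject three cases: an unset variable, an unexpanded template literal, and a string that is not a parseable http(s) URL. A sketch of that check (resolveBackendUrl is an illustrative name):

```typescript
// Return the configured backend URL only if it is set, is not a literal
// unexpanded template string like "${BACKEND_URL}", and parses as an
// http(s) URL. Sketch of the BACKEND_URL validation described above.
function resolveBackendUrl(raw: string | undefined): string | null {
  if (!raw || raw.includes("${")) return null; // unset or literal template
  try {
    const url = new URL(raw);
    return url.protocol === "http:" || url.protocol === "https:" ? raw : null;
  } catch {
    return null; // not a parseable URL: skip backend calls entirely
  }
}
```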
* fix: resolve embeddings package build errors (#41)
- Create stub types for Firebase Data Connect SDK in src/dataconnect-generated/
- Fix import path from ../dataconnect-generated to ./dataconnect-generated (rootDir constraint)
- Add explicit type assertions for JSON responses (predictions, access_token)
- All 6 TypeScript errors resolved, clean build verified
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: Gemini SDK upgrade + VideoPack schema alignment (#43)
* chore: Update generated Chrome profile cache and session data for notebooklm.
* chore: refresh notebooklm Chrome profile data, including Safe Browsing lists, caches, and session files.
* Update local application cache and database files within the NotebookLM Chrome profile.
* chore: update Chrome profile cache and Safe Browsing data files.
* feat: upgrade Gemini to @google/genai SDK with structured output, search grounding, video URL processing, and extend VideoPack schema
- Upgrade extract-events/route.ts from @google/generative-ai to @google/genai
- Add Gemini responseSchema with Type system for structured output enforcement
- Add Google Search grounding (googleSearch tool) to Gemini calls
- Upgrade transcribe/route.ts to @google/genai with direct YouTube URL processing via fileData
- Add Gemini video URL fallback chain: direct video → text+search → other strategies
- Extend VideoPackV0 schema with Chapter, CodeCue, Task models
- Update versioning shim for new fields
- Export new types from videopack __init__
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
---------
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: wire CloudEvents pipeline + Chrome Built-in AI fallback (#44)
- Add TypeScript CloudEvents publisher (apps/web/src/lib/cloudevents.ts)
emitting standardized events at each video processing stage
- Wire CloudEvents into /api/video route (both backend + frontend strategies)
- Wire CloudEvents into FastAPI backend router (process_video_v1 endpoint)
- Add Chrome Built-in AI service (Prompt API + Summarizer API) for
on-device client-side transcript analysis when API keys are unavailable
- Add useBuiltInAI React hook for component integration
- Add .next/ to .gitignore
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: wire A2A inter-agent messaging into orchestrator + API (#45)
- Add A2AContextMessage dataclass to AgentOrchestrator for lightweight
inter-agent context sharing during parallel task execution
- Auto-broadcast agent results to peer agents after parallel execution
- Add send_a2a_message() and get_a2a_log() methods to orchestrator
- Add POST /api/v1/agents/a2a/send endpoint for frontend-to-agent messaging
- Add GET /api/v1/agents/a2a/log endpoint to query message history
- Extend frontend agentService with sendA2AMessage() and getA2ALog()
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: add LiteRT-LM setup script and update README (#46)
- Add setup.sh to download lit CLI binary and .litertlm model
- Support macOS arm64 and x86_64 architectures
- Auto-generate .env with LIT_BINARY_PATH and LIT_MODEL_PATH
- Add .gitignore for bin/, models/, .env
- Update README with Quick Setup section
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: implement Gemini agentic video analysis with Google Search grounding (#47)
- Create gemini-video-analyzer.ts: single Gemini call with googleSearch
tool for transcript extraction AND event analysis (PK=998 pattern)
- Add youtube-metadata.ts: scrapes title, description, chapters from
YouTube without API key
- Update /api/video: Gemini agentic analysis as primary strategy,
transcribe→extract chain as fallback
- Fix /api/transcribe: remove broken fileData.fileUri, use Gemini
Google Search grounding as primary, add metadata context, filter
garbage OpenAI results
- Fix /api/extract-events: accept videoUrl without requiring transcript,
direct Gemini analysis via Google Search when no transcript available
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: support Vertex_AI_API_KEY as Gemini key fallback
Create shared gemini-client.ts that resolves API key from:
GEMINI_API_KEY → GOOGLE_API_KEY → Vertex_AI_API_KEY
All API routes now use the shared client instead of
hardcoding process.env.GEMINI_API_KEY.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
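The fallback chain above is a first-match lookup over three env vars. A minimal sketch of the shared resolution logic (the function name and empty-string handling are assumptions; the real gemini-client.ts may differ):

```typescript
// Resolve a Gemini API key from the first env var that is set, in the
// precedence order stated above. Empty strings are treated as unset.
// Illustrative sketch of the shared gemini-client.ts helper.
function resolveGeminiApiKey(
  env: Record<string, string | undefined>,
): string | undefined {
  return (
    env.GEMINI_API_KEY || env.GOOGLE_API_KEY || env.Vertex_AI_API_KEY || undefined
  );
}
```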
* fix: use Vertex AI Express Mode for Vertex_AI_API_KEY
When only Vertex_AI_API_KEY is set (no GEMINI_API_KEY), the client
now initializes in Vertex AI mode with vertexai: true + apiKey.
Uses project uvai-730bb and us-central1 as defaults.
Also added GOOGLE_CLOUD_PROJECT env var to Vercel.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: Vertex AI Express Mode compatibility — remove responseSchema+googleSearch conflict (#48)
Vertex AI does not support controlled generation (responseSchema) combined
with the googleSearch tool. This caused 400 errors on every Gemini call.
Changes:
- gemini-client.ts: Prioritize Vertex_AI_API_KEY, support GOOGLE_GENAI_USE_VERTEXAI env var
- gemini-video-analyzer.ts: Remove responseSchema, enforce JSON via prompt instructions
- extract-events/route.ts: Same fix for extractWithGemini and inline Gemini calls
- Strip markdown code fences from responses before JSON parsing
Tested end-to-end with Vertex AI Express Mode key against multiple YouTube videos.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
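The fence-stripping step mentioned above is needed because models often wrap JSON in a markdown code block even when asked for raw JSON. A sketch of that cleanup (stripCodeFences is an illustrative name):

```typescript
// Strip a leading/trailing markdown code fence (``` or ```json) from a
// model response so the remainder can be handed to JSON.parse.
// Sketch of the fence-stripping step described above.
function stripCodeFences(text: string): string {
  return text
    .trim()
    .replace(/^```[a-zA-Z]*\s*\n?/, "") // opening fence + optional language tag
    .replace(/\n?```\s*$/, "") // closing fence
    .trim();
}
```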
* fix: restore full PK=998 pattern — responseSchema + googleSearch + gemini-3-pro-preview (#49)
The previous fix (PR #48) was a shortcut — it removed responseSchema when
the real issue was using gemini-2.5-flash which doesn't support
responseSchema + googleSearch together on Vertex AI.
gemini-3-pro-preview DOES support the combination. This commit restores
the exact PK=998 pattern:
- gemini-video-analyzer.ts: Restored responseSchema with Type system,
responseMimeType, e22Snippets field, model → gemini-3-pro-preview
- extract-events/route.ts: Restored geminiResponseSchema, Type import,
responseMimeType, model → gemini-3-pro-preview
- transcribe/route.ts: model → gemini-3-pro-preview
Tested with Vertex AI Express Mode key on two YouTube videos.
Both return structured JSON with events, transcript, actions,
codeMapping, cloudService, e22Snippets, architectureCode, ingestScript.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* feat: end-to-end pipeline — YouTube URL to deployed software (#50)
- Add /api/pipeline route for full end-to-end pipeline
(video analysis → code generation → GitHub repo → Vercel deploy)
- Add deployPipeline() action to dashboard store with stage tracking
- Add 🚀 Deploy button to dashboard alongside Analyze
- Show pipeline results (live URL, GitHub repo, framework) in video cards
- Fix deployment_manager import path in video_processing_service
- Wire pipeline to backend /api/v1/video-to-software endpoint
- Fallback to Gemini-only analysis when no backend available
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: add writable directories to Docker image for deployment pipeline
Create /app/generated_projects, /app/youtube_processed_videos, and
/tmp/uvai_data directories in Dockerfile to fix permission denied
errors in the deployment and video processing pipeline on Railway.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: security hardening, video-specific codegen, API consistency
- CORS: replace wildcard/glob with explicit allowed origins in both entry points
- Rate limiting: enable 60 req/min with 15 burst on backend
- API auth: add optional X-API-Key middleware for pipeline endpoints
- Codegen: generate video-specific HTML/CSS/JS from analysis output
- API: accept both 'url' and 'video_url' via Pydantic alias
- Deploy: fix Vercel REST API payload format (gitSource instead of gitRepository)
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: Vercel deployment returning empty live_url
Root causes fixed:
- Case mismatch in _poll_deployment_status: compared lowercased status
against uppercase success_statuses list, so READY was never matched
- Vercel API returns bare domain URLs without https:// prefix; added
_ensure_https() to normalize them
- Poll requests were missing auth headers, causing 401 failures
- _deploy_files_directly fallback returned fake simulated URLs that
masked real failures; removed in favor of proper error reporting
- _generate_deployment_urls only returned URLs from 'success' status
deployments, discarding useful fallback URLs from failed deployments
Improvements:
- On API failure (permissions, plan limits), return a Vercel import URL
the user can click to deploy manually instead of an empty string
- Support VERCEL_ORG_ID team scoping on deploy and poll endpoints
- Use readyState field (Vercel v13 API) for initial status check
- Add 'canceled' to failure status list in poll loop
- Poll failures are now non-fatal; initial URL is used as fallback
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
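The deployment manager itself is Python, but the two central fixes above (scheme normalization and case-insensitive status matching) are small enough to sketch; the names below are illustrative, shown in TypeScript for consistency with the other examples:

```typescript
// Normalize a Vercel deployment URL: the API returns bare domains such as
// "my-app.vercel.app" without a scheme. Mirrors the _ensure_https fix above.
function ensureHttps(url: string): string {
  return /^https?:\/\//i.test(url) ? url : `https://${url}`;
}

// Compare deployment status case-insensitively: Vercel reports "READY" in
// upper case, which the previous lowercased-vs-uppercase comparison never
// matched. The success list here is illustrative.
function isSuccessStatus(status: string): boolean {
  return ["ready"].includes(status.toLowerCase());
}
```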
* fix: harden slim entry point — CORS, rate limiting, auth, security headers
- Add uvaiio.vercel.app to CORS allowed origins
- Add slowapi rate limiting (60 req/min)
- Add API key auth middleware (optional via EVENTRELAY_API_KEY)
- Add security headers (X-Content-Type-Options, X-Frame-Options, X-XSS-Protection)
- Fixes production gap where slim main.py had none of the backend/main.py protections
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
* fix: resolve Pydantic Config/model_config conflict breaking Railway deploy
The VideoToSoftwareRequest model had both 'model_config = ConfigDict(...)' and
'class Config:' which Pydantic v2 rejects. Merged into single model_config.
This was causing the v1 router to fail loading, making /api/v1/health return 404.
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>
Co-authored-by: Vercel <vercel[bot]@users.noreply.github.com>
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
1 parent 8820c5b · commit 99df2ac
12,310 files changed
Lines changed: 346724 additions & 814 deletions