
feat: wip data approval and sync #505

Draft
chrismclarke wants to merge 1 commit into main from feat/data-approval

Conversation

@chrismclarke
Collaborator

Description

Summary of main changes

Discussion

Feedback discussion points if relevant (should also tag as Feedback Discussion)

Preview

Link to app preview if relevant

Screenshots / Videos

Include at least 1-2 screenshots or videos if there are visual changes

@gemini-code-assist
Contributor

Summary of Changes

Hello @chrismclarke, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request lays the groundwork for a robust data management system, integrating client-side data synchronization with an administrative approval workflow. It ensures that client applications can maintain up-to-date local data efficiently, while providing a controlled process for administrators to review and publish changes from the backend. This feature significantly enhances data consistency and administrative oversight.

Highlights

  • Client-side Data Synchronization: Introduced DataSyncService to manage initial data loading from bundled assets and incremental synchronization with the Supabase backend, ensuring local data consistency.
  • Admin Data Approval Interface: Added a new DataApprovalComponent and DataApprovalService in the dashboard, providing an administrative interface to review and publish pending data changes.
  • Database Schema Enhancements: Modified database tables to include published_at and deleted columns, crucial for tracking data publication status and soft deletions for synchronization.
  • Supabase RPC for Sync: Implemented a new Supabase RPC function get_sync_changes to efficiently query for data modifications since the last synchronization timestamp.
  • Data Bundling Script: Created export-data.ts to generate static JSON bundles of published data, used for fast initial loading in client applications.
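The incremental sync described in the highlights centers on the `get_sync_changes` RPC. Below is a minimal, dependency-free sketch of how a client might call it; the RPC name comes from this PR, but the parameter name (`since`), the row shape, and the `RpcClient` interface are illustrative assumptions (in the real code this would be the Supabase client).

```typescript
// Assumed row shape returned by the RPC; the real columns may differ.
interface SyncRow {
  table_name: string;
  id: string;
  deleted: boolean;
  published_at: string;
}

// Minimal client interface so the sketch does not depend on supabase-js.
interface RpcClient {
  rpc(
    fn: string,
    params: Record<string, unknown>
  ): Promise<{ data: SyncRow[] | null; error: Error | null }>;
}

// Fetch all changes published since the last sync timestamp.
async function fetchChangesSince(client: RpcClient, since: string): Promise<SyncRow[]> {
  const { data, error } = await client.rpc('get_sync_changes', { since });
  if (error) throw error;
  return data ?? [];
}
```

A caller would persist the latest `published_at` it has seen and pass it as `since` on the next run.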


Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

| Feature | Command | Description |
| --- | --- | --- |
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in pull request comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@nx-cloud

nx-cloud bot commented Jan 14, 2026

🤖 Nx Cloud AI Fix Eligible

An automatically generated fix could have helped fix failing tasks for this run, but Self-healing CI is disabled for this workspace. Visit workspace settings to enable it and get automatic fixes in future runs.

To disable these notifications, a workspace admin can disable them in workspace settings.


View your CI Pipeline Execution ↗ for commit 52e13da

| Command | Status | Duration | Result |
| --- | --- | --- | --- |
| `nx run-many --target=lint` | ❌ Failed | 6s | View ↗ |

☁️ Nx Cloud last updated this comment at 2026-01-14 22:56:50 UTC


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a new data synchronization and approval mechanism, which is a significant feature. The implementation includes a new dashboard page for data approval, a client-side sync service using RxDB, a Supabase RPC function, and a data export script.

My review focuses on several key areas:

  • Maintainability: There's significant duplication of the SYNCABLE_TABLES list and primary key logic across frontend, backend (SQL), and build scripts. Centralizing this is critical.
  • Correctness & Robustness: I've pointed out issues with unmanaged RxJS subscriptions which can cause memory leaks, and incorrect asynchronous operation chaining that could lead to race conditions.
  • Performance: There's an opportunity to optimize data loading during the merge process in the sync service.
  • Consistency: I've suggested using the existing ErrorHandlerService instead of console.error for consistent error handling.

Overall, this is a good foundation for the new feature, but addressing these points will greatly improve its robustness and maintainability.

Comment on lines +24 to +26

```ts
this.approvalService.pendingChanges$.subscribe((changes) => {
  this.dataSource.data = changes;
});
```


high

This subscription is not being managed and will lead to a memory leak if the component is destroyed. A common pattern in Angular to avoid this is using the takeUntil operator. You should apply this to all subscriptions in this component (ngOnInit, refresh, publishSelected).

Here's how you can implement it:

1. Add `OnDestroy` to your component's `implements` list and import it from `@angular/core`.
2. Add a `destroy$` subject: `private destroy$ = new Subject<void>();` (import `Subject` from `rxjs`).
3. Implement `ngOnDestroy`:

   ```ts
   ngOnDestroy() {
     this.destroy$.next();
     this.destroy$.complete();
   }
   ```

4. Pipe `takeUntil(this.destroy$)` onto your subscriptions (import `takeUntil` from `rxjs/operators`).

Suggested change

```diff
-this.approvalService.pendingChanges$.subscribe((changes) => {
-  this.dataSource.data = changes;
-});
+this.approvalService.pendingChanges$.pipe(takeUntil(this.destroy$)).subscribe((changes) => {
+  this.dataSource.data = changes;
+});
```

Comment on lines +14 to +27

```ts
export const SYNCABLE_TABLES = [
  'climate_station_data',
  'climate_stations',
  'crop_data',
  'crop_data_downscaled',
  'deployments',
  'forecasts',
  'monitoring_forms',
  'resource_collections',
  'resource_files',
  'resource_files_child',
  'resource_links',
  'translations',
];
```


high

The SYNCABLE_TABLES constant is hardcoded here. This list is also duplicated in apps/picsa-server/supabase/migrations/20251124000000_sync_schema.sql and tools/scripts/export-data.ts. This duplication can lead to inconsistencies if a table is added or removed and not updated in all places. Consider creating a single source of truth for this list in a shared library file (e.g., in libs/shared) and import it where needed.
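A minimal sketch of the single-source-of-truth module suggested above; the file path (`libs/shared/src/sync-tables.ts`) is hypothetical, based on the `libs/shared` location the review mentions:

```typescript
// Hypothetical shared module, e.g. libs/shared/src/sync-tables.ts,
// imported by data-approval.service.ts, export-data.ts, and any other
// consumer so the table list is defined exactly once.
export const SYNCABLE_TABLES = [
  'climate_station_data',
  'climate_stations',
  'crop_data',
  'crop_data_downscaled',
  'deployments',
  'forecasts',
  'monitoring_forms',
  'resource_collections',
  'resource_files',
  'resource_files_child',
  'resource_links',
  'translations',
] as const;

// Derived union type, so table names are checked at compile time.
export type SyncableTable = (typeof SYNCABLE_TABLES)[number];
```

The `as const` assertion lets consumers narrow table-name strings to the `SyncableTable` union instead of plain `string`.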

```ts
  return [];
}
return (data || []).map((record: any) => ({
  id: record.id || record.station_id, // Handle different PKs
```


high

The logic to determine the primary key (record.id || record.station_id) is brittle and duplicated in publishChanges (line 89). This logic is also present in the SQL migration file. This should be centralized. You could, for example, change SYNCABLE_TABLES to be an array of objects, each containing the table name and its primary key column name.

Example:

```ts
export const SYNCABLE_TABLES_CONFIG = [
  { name: 'climate_station_data', pk: 'station_id' },
  { name: 'climate_stations', pk: 'station_id' },
  { name: 'crop_data', pk: 'id' },
  // ...
];
```

This would make the code more robust and easier to maintain.
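Building on the config shape above, a small lookup helper could replace the scattered `record.id || record.station_id` checks. This is a sketch: only the first two primary-key assignments come from the review example, and `getPrimaryKey` is a hypothetical helper name.

```typescript
interface TableConfig {
  name: string;
  pk: 'id' | 'station_id';
}

// Assumed mapping; only climate_station_data and climate_stations
// were shown in the review, the rest would follow the same pattern.
const SYNCABLE_TABLES_CONFIG: TableConfig[] = [
  { name: 'climate_station_data', pk: 'station_id' },
  { name: 'climate_stations', pk: 'station_id' },
  { name: 'crop_data', pk: 'id' },
];

// Resolve a record's primary key from the per-table config, failing
// loudly on unknown tables instead of silently falling back.
function getPrimaryKey(record: Record<string, unknown>, table: string): unknown {
  const config = SYNCABLE_TABLES_CONFIG.find((t) => t.name === table);
  if (!config) throw new Error(`Unknown syncable table: ${table}`);
  return record[config.pk];
}
```

Both `fetchPendingChanges` and `publishChanges` could then call `getPrimaryKey` instead of repeating the fallback expression.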


```ts
return forkJoin(updateOps).pipe(
  map(() => void 0),
  tap(() => this.fetchPendingChanges().subscribe()), // Refresh list
```


high

The tap(() => this.fetchPendingChanges().subscribe()) creates a fire-and-forget subscription. This means the publishChanges observable will complete before the fetchPendingChanges operation is finished, which can lead to race conditions or stale data in the UI. Also, the subscription is not managed.

Consider chaining the fetchPendingChanges call using switchMap to ensure it completes as part of the main stream. This will ensure the publishChanges observable only completes after the data has been refreshed.

```ts
switchMap(() => this.fetchPendingChanges()), // Refresh list
map(() => void 0),
```

Comment on lines +23 to +36

```sql
tables text[] := ARRAY[
  'climate_station_data',
  'climate_stations',
  'crop_data',
  'crop_data_downscaled',
  'deployments',
  'forecasts',
  'monitoring_forms',
  'resource_collections',
  'resource_files',
  'resource_files_child',
  'resource_links',
  'translations'
];
```


high

The list of tables to sync is hardcoded in this migration. This same list is also hardcoded in the frontend code (data-approval.service.ts) and in build scripts (export-data.ts). This makes the system brittle, as any change to the list of syncable tables requires updates in multiple places. While it's not possible to share code directly with a SQL file, you could consider generating this part of the migration from a shared configuration file during your build process to ensure consistency.
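The "generate the migration from a shared configuration" idea can be sketched as a small build-time string renderer. This is illustrative only: the function name, the surrounding migration template, and how the output is written to the migration file are all assumptions.

```typescript
// In practice this list would be imported from the shared module
// rather than redeclared here.
const SYNCABLE_TABLES_LIST = ['climate_stations', 'crop_data', 'translations'];

// Render the PL/pgSQL array declaration used by the migration, so the
// SQL stays in lockstep with the shared TypeScript list.
function renderTablesArray(tables: string[]): string {
  const quoted = tables.map((t) => `  '${t}'`).join(',\n');
  return `tables text[] := ARRAY[\n${quoted}\n];`;
}
```

A build step could splice `renderTablesArray(SYNCABLE_TABLES_LIST)` into a migration template before committing, so a table added in one place propagates everywhere.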

Comment on lines +19 to +32

```ts
const SYNCABLE_TABLES = [
  'climate_station_data',
  'climate_stations',
  'crop_data',
  'crop_data_downscaled',
  'deployments',
  'forecasts',
  'monitoring_forms',
  'resource_collections',
  'resource_files',
  'resource_files_child',
  'resource_links',
  'translations',
];
```


high

The SYNCABLE_TABLES constant is hardcoded here. This list is also duplicated in other files like data-approval.service.ts and the Supabase migration. To improve maintainability and avoid inconsistencies, this list should be defined in a single shared location and imported where needed.

```scss
}

.unpublished {
  color: #f44336; // Red for unpublished/draft
```


medium

The color #f44336 is hardcoded. It's better to use a variable from your SCSS theme, if available (e.g., from an Angular Material palette). This makes the theme easier to maintain and ensures visual consistency across the application. For example, you might use something like color: mat.get-color-from-palette($warn-palette, 500);.

```ts
).pipe(
  map(({ data, error }) => {
    if (error) {
      console.error(`Error fetching pending changes for ${table}:`, error);
```


medium

Using console.error directly makes it hard to track errors in production. This application appears to have a centralized ErrorHandlerService. You should inject it in the constructor (private errorService: ErrorHandlerService) and use it here for consistent error handling.

Suggested change

```diff
-console.error(`Error fetching pending changes for ${table}:`, error);
+this.errorService.handleError(new Error(`Error fetching pending changes for ${table}: ${JSON.stringify(error)}`));
```

Comment on lines +132 to +133

```ts
const docs = await collection.find().exec();
const localMap = new Map(docs.map((d) => [d.primary, d._data]));
```


medium

Loading all documents from the local collection into a map (const docs = await collection.find().exec();) can be inefficient and consume a lot of memory, especially for large collections. A more performant approach would be to only fetch the local documents that correspond to the records in the bundled data using findByIds.

Suggested change

```diff
-const docs = await collection.find().exec();
-const localMap = new Map(docs.map((d) => [d.primary, d._data]));
+const idsToCompare = (records as any[]).map(r => r[collection.schema.primaryPath]);
+const localDocsMap = await collection.findByIds(idsToCompare).exec();
+const localMap = new Map(Array.from(localDocsMap.values()).map((d) => [d.primary, d.toJSON()]));
```
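Independent of RxDB specifics, the merge comparison itself can be sketched as a pure function: given incoming records and a map of local docs keyed by primary key, return only the records that are new or changed. The record shape (`id`, `updated_at`) is an assumption for illustration; the real comparison might diff full document contents.

```typescript
interface SyncedRecord {
  id: string;
  updated_at: string;
}

// Keep only incoming records that are absent locally or have a
// different updated_at, so unchanged rows are never rewritten.
function diffIncoming(
  incoming: SyncedRecord[],
  localById: Map<string, SyncedRecord>
): SyncedRecord[] {
  return incoming.filter((rec) => {
    const local = localById.get(rec.id);
    return local === undefined || local.updated_at !== rec.updated_at;
  });
}
```

With `findByIds`, `localById` contains only the documents relevant to the current batch, so this comparison stays cheap even for large collections.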


Labels

App: Dashboard (Updates related to Dashboard app); feature

Projects

Status: No status


1 participant