CSV importing and exporting fixes #8824
base: main
Conversation
PR Summary
This PR enhances CSV import/export functionality for multi-select and RAW_JSON fields in Twenty's object record system.
- Added JSON string parsing for multi-select/array fields in `buildRecordFromImportedStructuredRow.ts`, with a fallback to an empty array
- Added RAW_JSON field handling in `useExportProcessRecordsForCSV.ts` to properly stringify JSON content for export
- Added Zod schema validation for array/multi-select fields during import to ensure data integrity
- Added proper error handling for malformed JSON input during import operations
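The import-side behavior described above can be sketched as a small standalone helper. This is an illustration, not the PR's actual code: `parseMultiSelectCell` is a hypothetical name, and the real change lives in `buildRecordFromImportedStructuredRow.ts` with a Zod schema.

```typescript
// Hypothetical helper illustrating the import-side parsing described above.
// A multi-select cell arrives as a string like '["OPTION_A","OPTION_B"]';
// malformed or non-string input falls back to an empty array instead of
// failing the whole import.
const parseMultiSelectCell = (value: unknown): string[] => {
  if (typeof value !== "string") {
    return [];
  }
  try {
    const parsed: unknown = JSON.parse(value);
    // Accept only arrays whose elements are all strings.
    if (Array.isArray(parsed) && parsed.every((item) => typeof item === "string")) {
      return parsed;
    }
    return [];
  } catch {
    return [];
  }
};
```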
2 file(s) reviewed, 4 comment(s)
```ts
default:
  return processedRecord;
}
```
logic: The default case drops the field value entirely by returning the record unmodified. This breaks the export of multi-select fields, which was the main issue to fix.
```ts
if (!isDefined(record[field.name])) {
  return processedRecord;
}
```
style: The early return for undefined values means null values will also be excluded from the CSV export. Consider whether this is the desired behavior.
```ts
const stringArrayJSONSchema = z
  .preprocess((value) => {
    try {
      if (typeof value !== 'string') {
        return [];
      }
      return JSON.parse(value);
    } catch {
      return [];
    }
  }, z.array(z.string()))
  .catch([]);
```
style: Consider handling comma-separated values as a fallback when JSON parsing fails, to support both the JSON array format and simple comma-separated values.
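The reviewer's suggestion could look something like the sketch below. It is written as a plain function rather than the Zod `preprocess` pipeline above, and `parseArrayCellWithFallback` is a hypothetical name: try JSON first, then fall back to comma splitting, so both `'["a","b"]'` and `'a, b'` import as the same array.

```typescript
// Sketch of the suggested fallback: accept a JSON string array, and when
// JSON parsing fails, split the cell on commas instead of discarding it.
const parseArrayCellWithFallback = (value: unknown): string[] => {
  if (typeof value !== "string" || value.trim() === "") {
    return [];
  }
  try {
    const parsed: unknown = JSON.parse(value);
    if (Array.isArray(parsed) && parsed.every((item) => typeof item === "string")) {
      return parsed;
    }
  } catch {
    // Not valid JSON; fall through to comma splitting below.
  }
  return value
    .split(",")
    .map((item) => item.trim())
    .filter((item) => item !== "");
};
```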
```ts
if (typeof importedFieldValue === 'string') {
  try {
    recordToBuild[field.name] = JSON.parse(importedFieldValue);
  } catch {
    break;
  }
}
```
style: Should validate that the parsed JSON structure matches the expected schema before assigning it to recordToBuild.
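One way to address this point is a type guard applied between parsing and assignment. This is a hedged sketch, not the PR's code: `isStringArray` and `assignValidatedField` are hypothetical names, and it assumes the expected shape is a string array.

```typescript
// Hypothetical guard: only assign the parsed value when it matches the
// expected string[] shape, so malformed-but-parseable JSON (e.g. an object)
// never lands on the record.
const isStringArray = (value: unknown): value is string[] =>
  Array.isArray(value) && value.every((item) => typeof item === "string");

const assignValidatedField = (
  record: Record<string, unknown>,
  fieldName: string,
  rawValue: string,
): void => {
  try {
    const parsed: unknown = JSON.parse(rawValue);
    if (isStringArray(parsed)) {
      record[fieldName] = parsed;
    }
    // Parsed JSON of the wrong shape is silently skipped.
  } catch {
    // Malformed JSON: leave the record untouched.
  }
};
```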
Please ignore the previous comment, I wasn't on the right branch 🤦♂️
Fixes issue #5793 (and duplicate #8822)