
Support JSON file upload for bulk insert of records #677

Closed
kdhrubo opened this issue Sep 2, 2024 Discussed in #612 · 5 comments
Labels: enhancement (New feature or request), good first issue (Good for newcomers), MR created
Milestone: Oct2024

Comments

@kdhrubo
Collaborator

kdhrubo commented Sep 2, 2024

Discussed in #612

Originally posted by kdhrubo May 8, 2024

  1. Max file size of 5MB to be supported.
  2. This will be an async operation.
  3. Records will be streamed in chunks.
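The chunked-streaming requirement above can be sketched as follows. This is a minimal illustration only, assuming newline-delimited JSON input and a hypothetical fixed chunk size; an actual implementation would more likely use a streaming JSON parser (see the Jackson discussion below) and flush each chunk as one batch INSERT:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class ChunkedUpload {
    // Small for illustration; a real value might be a few hundred records.
    static final int CHUNK_SIZE = 3;

    // Splits an incoming stream of records into fixed-size chunks so each
    // chunk can be flushed (e.g. as one batch INSERT) without ever holding
    // the whole file in memory.
    static List<List<String>> chunk(BufferedReader reader) throws IOException {
        List<List<String>> flushed = new ArrayList<>();
        List<String> current = new ArrayList<>(CHUNK_SIZE);
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.isBlank()) continue;
            current.add(line);
            if (current.size() == CHUNK_SIZE) {
                flushed.add(current); // stand-in for "flush batch to DB"
                current = new ArrayList<>(CHUNK_SIZE);
            }
        }
        if (!current.isEmpty()) flushed.add(current); // final partial chunk
        return flushed;
    }

    public static void main(String[] args) throws IOException {
        String ndjson = "{\"id\":1}\n{\"id\":2}\n{\"id\":3}\n{\"id\":4}\n";
        List<List<String>> batches =
                chunk(new BufferedReader(new StringReader(ndjson)));
        System.out.println(batches.size() + " batches"); // 2 batches: 3 + 1 records
    }
}
```

Because each chunk is flushed as soon as it fills, memory use stays bounded by the chunk size rather than the 5 MB (or larger) file size.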
@kdhrubo added the enhancement (New feature or request) and enterprise edition only (planned for / only available in enterprise edition) labels Sep 2, 2024
@kdhrubo added the good first issue (Good for newcomers) label and removed the enterprise edition only label Sep 26, 2024
@Hackeemlunar
Collaborator

Must the chunks be processed immediately, or can one wait for the whole file to be assembled?

@kdhrubo
Collaborator Author

kdhrubo commented Oct 5, 2024

Please explain what you mean.
Ideally it should be chunk-based processing.

@thadguidry
Collaborator

Chunks should be processed as received.
One has to consider that someone MIGHT want to upload a 4 GB JSON string into a DB column.
But I guess for this issue we can start with a maxFileSize = 5MB

Questions:

  1. Whatever we do, for JSON processing anywhere in DB2Rest, we should never use org.json (which still has licensing issues) but instead use com.fasterxml.jackson.core?
  2. Perhaps we should expose a new parameter to allow specifying which JSON datatype is going to be used for the prepared statement insert? (PostgreSQL has json and jsonb and I have needed json to keep things exact and retain key ordering, but then other times needing jsonb so that indexing could be used for efficient, fast, jsonpath queries later on.)
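Question 2 above amounts to varying the cast on the prepared-statement placeholder, since PostgreSQL accepts `?::json` and `?::jsonb` in an INSERT. A rough sketch, using a hypothetical `jsonType` parameter name (not an existing DB2Rest option):

```java
public class JsonTypeSql {
    // Hypothetical helper: the caller picks "json" (keeps key ordering exact)
    // or "jsonb" (binary form, indexable for fast jsonpath queries) and the
    // placeholder cast varies accordingly. Table and column names must come
    // from validated schema metadata, never raw user input, to avoid SQL
    // injection via identifiers.
    static String insertSql(String table, String column, String jsonType) {
        if (!"json".equals(jsonType) && !"jsonb".equals(jsonType)) {
            throw new IllegalArgumentException("jsonType must be 'json' or 'jsonb'");
        }
        return "INSERT INTO " + table + " (" + column + ") VALUES (?::" + jsonType + ")";
    }

    public static void main(String[] args) {
        System.out.println(insertSql("docs", "payload", "jsonb"));
        // INSERT INTO docs (payload) VALUES (?::jsonb)
    }
}
```

The same prepared statement then works for both column types; only the cast in the generated SQL differs per request.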

@Hackeemlunar
Collaborator

@kdhrubo I get it now.

@thadguidry The file upload insertion is already a type of bulk insertion. Bulk insertion currently happens on **/db/{tablename}/bulk, so what about a **/db/{tablename}/file endpoint for the file processing?

@Hackeemlunar self-assigned this Oct 6, 2024
@kdhrubo
Collaborator Author

kdhrubo commented Oct 6, 2024

Instead of file, make it upload please.

kdhrubo added a commit that referenced this issue Oct 9, 2024
@kdhrubo added this to the Oct2024 milestone Oct 9, 2024
@kdhrubo closed this as completed Oct 9, 2024