[FEA] Need Deletion Vector read support for Databricks 14.3 #12042

Open
razajafri opened this issue Jan 29, 2025 · 0 comments
Labels: feature request

Comments

@razajafri
Collaborator

Is your feature request related to a problem? Please describe.
Databricks 14.3 uses deletion vectors, and we currently do not support them, which leads to errors while reading a Delta table.
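
For reference, deletion vectors come into play on any Delta table that has them enabled and has had rows deleted. A minimal sketch of such a table (table name and values here are illustrative only, not taken from the failing test):

```python
# Minimal PySpark sketch (illustrative): build a Delta table with deletion
# vectors enabled, then delete rows so a deletion vector is written.
spark.sql("""
    CREATE TABLE dv_demo (id BIGINT) USING DELTA
    TBLPROPERTIES ('delta.enableDeletionVectors' = 'true')
""")
spark.range(0, 1000).write.format("delta").mode("append").saveAsTable("dv_demo")

# This DELETE records the removed rows in a deletion vector instead of
# rewriting the underlying Parquet file.
spark.sql("DELETE FROM dv_demo WHERE id % 2 = 0")

# Reading the table back now requires applying the deletion vector (the
# _databricks_internal_edge_computed_column_skip_row filter in the plan
# below), which is where the plugin currently fails.
spark.table("dv_demo").count()
```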

The following error can be reproduced by running:

```
TEST_MODE=DELTA_LAKE_ONLY WITH_DEFAULT_UPSTREAM_SHIM=0 TEST_PARALLEL=0 TESTS=delta_lake_auto_compact_test.py::test_auto_compact_min_num_files ./jenkins/databricks/test.sh
```

```
E                   Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: Part of the plan is not columnar class com.nvidia.spark.rapids.delta.RapidsDeltaWriteExec
E                   RapidsDeltaWrite
E                   +- Coalesce 1
E                      +- Project [id#3157L]
E                         +- Filter if (isnotnull(_databricks_internal_edge_computed_column_skip_row#3180)) (_databricks_internal_edge_computed_column_skip_row#3180 = false) else isnotnull(raise_error(DELTA_SKIP_ROW_COLUMN_NOT_FILLED, map(keys: [], values: []), NullType))
E                            +- ColumnarToRow
E                               +- FileScan parquet [id#3157L,_databricks_internal_edge_computed_column_skip_row#3180] Batched: true, DataFilters: [], Format: Parquet, Location: TahoeBatchFileIndex(1 paths)[file:/tmp/pyspark_tests/0905-174917-z50tzgqt-10-59-227-36-master-822..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<id:bigint,_databricks_internal_edge_computed_column_skip_row:boolean>
E
```

Describe the solution you'd like
We need to add support for Deletion Vectors. To do this, it may make sense to break the functionality down into smaller parts and handle the read and write paths separately; this issue covers the read side. A sketch of what a read-side test could look like follows.
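
A rough sketch of a read-side integration test, reusing the existing integration test helpers (`assert_gpu_and_cpu_are_equal_collect`, `with_cpu_session`, the `delta_lake`/`ignore_order` marks, and the `spark_tmp_path` fixture). The test name and table setup below are illustrative only, and the helper imports may need adjusting to the current test framework:

```python
# Illustrative sketch only: verify that GPU reads of a deletion-vector table
# match the CPU results. Helper names follow the integration_tests
# conventions and may need adjusting.
from asserts import assert_gpu_and_cpu_are_equal_collect
from marks import delta_lake, ignore_order
from spark_session import with_cpu_session

@delta_lake
@ignore_order(local=True)
def test_delta_deletion_vector_read(spark_tmp_path):
    table_path = spark_tmp_path + "/DV_READ_TABLE"

    def setup(spark):
        # Write the table on the CPU, enable deletion vectors, then delete
        # rows so deletion vectors exist for the read path to honor.
        spark.range(0, 1024).write.format("delta").save(table_path)
        spark.sql("ALTER TABLE delta.`{}` SET TBLPROPERTIES "
                  "('delta.enableDeletionVectors' = 'true')".format(table_path))
        spark.sql("DELETE FROM delta.`{}` WHERE id % 3 = 0".format(table_path))

    with_cpu_session(setup)

    # The GPU read must return the same rows as the CPU read.
    assert_gpu_and_cpu_are_equal_collect(
        lambda spark: spark.read.format("delta").load(table_path))
```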

Describe alternatives you've considered
None

Additional context
#8654
#11541

@razajafri added the "? - Needs Triage" and "feature request" labels on Jan 29, 2025
@mattahrens removed the "? - Needs Triage" label on Jan 29, 2025