
VARINT values over 38 digits from Cassandra implicitly convert to null #184

@MCarter-Ulu

Description


Describe the bug
During the initial Cassandra read, Spark implicitly converts the varint column to decimal(38,0). If a value exceeds 38 digits, it is silently converted to null before the replicator's own conversion runs.

To Reproduce
Steps to reproduce the behavior:
Attempt to load a table containing a long varint value, e.g. 1595542979496957281074989646537940572646 (40 digits). The value is loaded as null.
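A minimal sketch of why this happens, assuming the cause is Spark SQL's DecimalType, whose maximum precision is 38 digits; any integer with more digits cannot be represented as decimal(38,0) and comes back as null. The helper name `fits_spark_decimal` is hypothetical, for illustration only:

```python
# Spark SQL's DecimalType caps precision at 38 digits, so a Cassandra
# varint wider than that cannot be represented by decimal(38,0).
SPARK_MAX_DECIMAL_PRECISION = 38  # DecimalType.MAX_PRECISION in Spark SQL

def fits_spark_decimal(value: int) -> bool:
    """Return True if the integer fits in Spark's decimal(38,0) (hypothetical helper)."""
    return len(str(abs(value))) <= SPARK_MAX_DECIMAL_PRECISION

# The 40-digit value from this report exceeds the cap:
v = 1595542979496957281074989646537940572646
print(fits_spark_decimal(v))        # False -> read back as null
print(fits_spark_decimal(10**37))   # True  -> 38 digits, still representable
```

A common workaround under this assumption is to read the varint column as text (e.g. cast it to a string on read) so the full digit sequence survives the transfer.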

Expected behavior
The varint should be read in full and transferred to the target, preserving the source value's length and content.

Labels: enhancement (New feature or request)
