There appears to be no way to define a data class whose encoder produces a Spark schema with fields of type Decimal(38, 0). The natural approach would be to define a data class with a field of type BigInteger, but this is unsupported by the data class encoder.
This can be seen with the following code:

```kotlin
import java.math.BigInteger
import org.jetbrains.kotlinx.spark.api.*

data class A(val value: BigInteger)

fun main() = withSpark {
    val ds = dsOf(1, 2)
    val df = ds.`as`<A>()
    println(df.schema())
}
```
which throws `java.lang.IllegalArgumentException: java.math.BigInteger is unsupported`.
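As a possible workaround (a sketch, not a confirmed fix for the encoder itself), the field can be declared as `BigDecimal`, which Spark's encoders do support (by default mapping to `Decimal(38, 18)` rather than the desired `Decimal(38, 0)`), converting to and from `BigInteger` at the boundary. The class and helper names below are illustrative:

```kotlin
import java.math.BigDecimal
import java.math.BigInteger

// Workaround sketch: hold the value as BigDecimal (supported by Spark's
// encoders) and convert to/from BigInteger when needed. Note the resulting
// schema field will be Decimal(38, 18), not Decimal(38, 0).
data class A(val value: BigDecimal) {
    // Throws ArithmeticException if the decimal has a fractional part.
    fun asBigInteger(): BigInteger = value.toBigIntegerExact()

    companion object {
        fun of(v: BigInteger): A = A(BigDecimal(v))
    }
}

fun main() {
    val a = A.of(BigInteger("12345678901234567890"))
    println(a.asBigInteger())
}
```

This only sidesteps the encoder limitation; it does not yield a `Decimal(38, 0)` schema, so a cast in Spark SQL would still be needed for that exact type.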