
[BUG] the 2 parameter versions of pow and log have too large of an error #89

Open
revans2 opened this issue Jun 2, 2020 · 0 comments
Labels
bug (Something isn't working) · P1 (Nice to have for release) · SQL (part of the SQL/Dataframe plugin)

Comments

revans2 (Collaborator) commented Jun 2, 2020

Describe the bug
The underlying cudf implementations of POW and LOG both show larger-than-desired floating-point differences from the Spark CPU versions when the optional second parameter is very large or very small. cudf most likely uses a different algorithm to compute these values, so when overflow occurs the results diverge.
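As a rough, hypothetical illustration (not taken from cudf or Spark source) of how two algebraically equivalent formulations of pow can disagree once the operands get extreme, the sketch below compares a direct pow call with an exp/log reformulation; the specific values and the reformulation are assumptions chosen only to show the effect:

```python
import math

# Hypothetical example: pow computed directly vs. via exp(y * log(x)).
# Near the overflow/underflow boundary the two routes can round
# differently, which is the kind of mismatch described above.
x, y = 1.0000000000000002, 1e15

direct = math.pow(x, y)                   # library pow
via_exp_log = math.exp(y * math.log(x))   # exp/log reformulation

# On many platforms these differ by a few ULPs; with larger exponents one
# side may overflow to inf while the other stays finite.
print(direct, via_exp_log, abs(direct - via_exp_log))
```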

Steps/Code to reproduce bug
The Python integration tests that exercise these operators are both marked as xfail because of this.
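For context, a minimal sketch of the shape such a test might take. The real spark-rapids integration tests compare full CPU and GPU query results; the test name, tolerance, and stand-in computations below are assumptions for illustration only:

```python
import math
import pytest

# Hypothetical sketch: the extreme-operand pow/log cases carry an xfail
# marker because the GPU result drifts outside the allowed tolerance.
@pytest.mark.xfail(reason="https://github.com/NVIDIA/spark-rapids/issues/89")
def test_pow_with_extreme_second_parameter():
    x, y = 1.0000000000000002, 1e15
    cpu_result = math.pow(x, y)             # stand-in for the Spark CPU value
    gpu_result = math.exp(y * math.log(x))  # stand-in for the cudf value
    # A tight relative tolerance mimics the bound the plugin tests enforce.
    assert gpu_result == pytest.approx(cpu_result, rel=1e-15)
```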

Expected behavior
The floating-point error is within the expected bounds.

revans2 added the bug (Something isn't working), ? - Needs Triage (Need team to review and classify), and SQL (part of the SQL/Dataframe plugin) labels and removed the ? - Needs Triage label on Jun 2, 2020
sameerz added the P1 (Nice to have for release) label on Aug 26, 2020
tgravescs pushed a commit to tgravescs/spark-rapids that referenced this issue Nov 30, 2023
Signed-off-by: spark-rapids automation <[email protected]>