[Feat] Provide derivatives for pow #246

Merged: 3 commits into develop on Nov 26, 2024
Conversation

fjosw (Owner) commented Nov 20, 2024

I explicitly added the derivatives for `__pow__` and `__rpow__`; previously these were estimated via autograd. On my machine this results in a ~4x speedup for pow operations, and `obs ** 2` is now slightly faster than `obs * obs`.

I removed the Obs case from `__rpow__`, as it should already be covered by `__pow__`, and none of our tests fail 🤞
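The explicit derivatives in question are standard calculus identities. As a minimal standalone sketch (the helper `pow_derivative` is hypothetical and not pyerrors' actual API, which propagates these gradients through its `Obs` machinery), the two partial derivatives of `base ** exponent` can be written and checked against a finite difference like this:

```python
import math

def pow_derivative(base, exponent):
    """Return (d/d_base, d/d_exponent) of base ** exponent.

    Hypothetical helper for illustration; not code from this PR.
    """
    # d/dx x**n = n * x**(n - 1)
    d_base = exponent * base ** (exponent - 1)
    # d/dn x**n = x**n * ln(x)  (requires base > 0)
    d_exponent = base ** exponent * math.log(base)
    return d_base, d_exponent

def finite_diff(f, x, h=1e-6):
    """Central finite difference, the kind of numeric estimate autograd avoids here."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Quick sanity check at base = 3, exponent = 2.
d_base, d_exp = pow_derivative(3.0, 2.0)
assert abs(d_base - finite_diff(lambda x: x ** 2.0, 3.0)) < 1e-5
assert abs(d_exp - finite_diff(lambda n: 3.0 ** n, 2.0)) < 1e-5
```

Hard-coding these two expressions avoids tracing the operation through autograd on every call, which is where the reported speedup would come from.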

fjosw (Owner, Author) commented Nov 20, 2024

We are still missing explicit derivatives for

  • abs (I did not want to deal with the special cases)
  • arcsin, arccos, arctan, arcsinh, arccosh, arctanh (probably hardly used at all)
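For reference, the analytic derivatives that would be needed for the inverse functions listed above are standard identities (this is not code from the PR, just the textbook formulas, checked against a central finite difference):

```python
import math

# Textbook derivatives of the inverse trig / hyperbolic functions.
INVERSE_DERIVATIVES = {
    "arcsin":  lambda x: 1.0 / math.sqrt(1.0 - x * x),   # |x| < 1
    "arccos":  lambda x: -1.0 / math.sqrt(1.0 - x * x),  # |x| < 1
    "arctan":  lambda x: 1.0 / (1.0 + x * x),
    "arcsinh": lambda x: 1.0 / math.sqrt(1.0 + x * x),
    "arccosh": lambda x: 1.0 / math.sqrt(x * x - 1.0),   # x > 1
    "arctanh": lambda x: 1.0 / (1.0 - x * x),            # |x| < 1
}

FUNCS = {
    "arcsin": math.asin, "arccos": math.acos, "arctan": math.atan,
    "arcsinh": math.asinh, "arccosh": math.acosh, "arctanh": math.atanh,
}

# Verify each identity numerically (arccosh needs an evaluation point > 1).
for name, deriv in INVERSE_DERIVATIVES.items():
    x = 1.5 if name == "arccosh" else 0.5
    f = FUNCS[name]
    fd = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
    assert abs(deriv(x) - fd) < 1e-5, name
```

These are cheap closed forms, so the main cost of adding them would be handling domain edge cases, not the derivatives themselves.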

s-kuberski (Collaborator) commented Nov 20, 2024

Great, thanks! I also think that it is not necessarily worth implementing the derivatives for the remaining functions. For `__pow__`, however, there might be a speedup in quite a number of real-world examples.

fjosw merged commit 30bfb55 into develop on Nov 26, 2024
9 checks passed
fjosw deleted the feat/pow_man_grad branch on November 26, 2024 at 16:52