First, some background:
I have implemented fine grained caching for Haskell projects where project dependencies are cached individually and it works really well for small projects.
The action is here:
https://github.com/action-works/cabal-cache
For an example of a build using this see:
https://github.com/haskell-works/hw-prim/actions/runs/651341909
The problem arises when I try to use it on a large project with hundreds of dependencies:
https://github.com/input-output-hk/cardano-node/runs/2112862696?check_suite_focus=true
In build(8.10.3, windows-latest), under the step Cache cabal store, things look like they are going fine until at some point I start getting lots of these errors:
Warning: Cache service responded with 429
Looks like there is some kind of throttling.
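For illustration, here is a minimal sketch (my own, not part of the cache action or of cabal-cache) of how a client could react to 429 responses with exponential backoff; the server simulation, `mkFlakyServer`, and `retryWithBackoff` are all hypothetical names:

```typescript
// Hypothetical sketch: retry a cache request with exponential backoff when
// the service responds with 429.  The "server" is simulated so the logic
// runs standalone; a real client would issue HTTP requests and sleep.
type CacheResponse = "throttled" | "ok";

// Simulated cache service that throttles (429s) the first `throttleFirst` calls.
function mkFlakyServer(throttleFirst: number): () => CacheResponse {
  let calls = 0;
  return () => (calls++ < throttleFirst ? "throttled" : "ok");
}

// Retry up to maxAttempts, doubling the delay after each 429.
// Returns the delays (ms) used, or null if every attempt was throttled.
function retryWithBackoff(
  maxAttempts: number,
  baseDelayMs: number,
  request: () => CacheResponse
): number[] | null {
  const delays: number[] = [];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (request() === "ok") return delays;
    const delayMs = baseDelayMs * 2 ** attempt;
    // A real client would sleep here before retrying.
    delays.push(delayMs);
  }
  return null;
}

const server = mkFlakyServer(3); // throttle the first 3 requests
console.log(retryWithBackoff(5, 100, server)); // [ 100, 200, 400 ]
```

Backoff spreads a burst of hundreds of uploads out over time instead of failing outright, but it only masks the symptom; the limit itself still has to be addressed on the service side.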
Describe the enhancement
Please provide fine-grained caching of files without throttling, or at least raise the throttling limit, perhaps by an order of magnitude.
Note: I have tested this against S3, and it is very fast, but unfortunately using S3 in a secure and convenient way is not fork-friendly.
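As a client-side stopgap (a sketch of my own, not an existing feature of the cache action), the request count itself could be reduced by batching many per-dependency entries into fewer, larger uploads; `batchKeys` and the key names are hypothetical:

```typescript
// Hypothetical sketch: group per-dependency cache keys into batches so that
// m entries cost ceil(m / n) upload requests instead of m, staying under
// whatever rate limit the cache service enforces.
function batchKeys(n: number, keys: string[]): string[][] {
  const batches: string[][] = [];
  for (let i = 0; i < keys.length; i += n) {
    batches.push(keys.slice(i, i + n));
  }
  return batches;
}

// Example: 10 dependencies, at most 4 per request -> 3 requests.
const deps = Array.from({ length: 10 }, (_, i) => `dep-${i + 1}`);
console.log(batchKeys(4, deps).map((b) => b.length)); // [ 4, 4, 2 ]
```

The trade-off is coarser granularity: a batch is invalidated as a unit, which is exactly what fine-grained caching was trying to avoid, so this softens the problem rather than solving it.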
Code Snippet
N/A
Additional information
N/A