Is your feature request related to a problem? Please describe.
When searching for rows in a spreadsheet, it's not uncommon to encounter rate-limiting errors. The Google Sheets API rate limit is quite low: at the time of writing it's 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user (source). As such, I would guess that most developers using this for anything substantial either turn away from this approach or implement a caching mechanism.
Describe the solution you'd like
I'm not sure what the best solution would be. Roughly speaking, it would be nice for the caching mechanism to work on a per-worksheet basis. It would also be nice for it to support different backends (perhaps an existing library could optionally be pulled in for this, e.g. pip install gspread[cache]). The mechanism would need a TTL for each cached worksheet. One difficulty with this approach is that, because of the TTL, a high-traffic web application could run into a cache stampede. Even so, a built-in caching mechanism would help users scale beyond the performance penalty that vanilla API interactions impose.
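To make the idea concrete, here is a minimal sketch of the lazy, per-worksheet TTL cache I have in mind. It only relies on standard gspread calls (service_account, open, worksheet, get_all_records); the WorksheetCache class, its ttl parameter, and find_rows are made-up names for illustration, not existing gspread API:

```python
import time

import gspread


# Hypothetical helper, not part of gspread: a lazy, per-worksheet cache with a TTL.
class WorksheetCache:
    def __init__(self, worksheet, ttl=60):
        self._ws = worksheet        # a gspread Worksheet instance
        self._ttl = ttl             # seconds before the cached copy is considered stale
        self._records = None
        self._fetched_at = 0.0

    def get_all_records(self, force_refresh=False):
        # Refresh from the API only when the cached copy is missing or older than the TTL.
        stale = time.monotonic() - self._fetched_at > self._ttl
        if force_refresh or self._records is None or stale:
            self._records = self._ws.get_all_records()  # one Sheets API read
            self._fetched_at = time.monotonic()
        return self._records

    def find_rows(self, **criteria):
        # Row searches run against the cached copy instead of issuing API calls.
        return [
            row for row in self.get_all_records()
            if all(row.get(key) == value for key, value in criteria.items())
        ]


gc = gspread.service_account()
ws = gc.open("my-spreadsheet").worksheet("Sheet1")

cached = WorksheetCache(ws, ttl=120)
matches = cached.find_rows(status="open")  # no further API calls until the TTL expires
```

With something along these lines, repeated row searches are served from memory and only the periodic refresh counts against the quota.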
Describe alternatives you've considered
An extension package for gspread similar to gspread-pandas
Hints in the documentation as to what you might need to do to alleviate rate-limiting issues.
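For the documentation-hints alternative, even a small retry-with-backoff snippet would help. A rough sketch of the kind of hint I mean (the with_backoff helper and its parameters are made up; APIError and get_all_records are real gspread API):

```python
import time

import gspread
from gspread.exceptions import APIError


def with_backoff(fn, retries=5, base_delay=1.0):
    # Retry a Sheets call with exponential backoff when the API rejects it,
    # which is typically what happens once the per-100-seconds quota is exceeded.
    for attempt in range(retries):
        try:
            return fn()
        except APIError:
            time.sleep(base_delay * (2 ** attempt))
    return fn()  # final attempt; let any error propagate to the caller


gc = gspread.service_account()
ws = gc.open("my-spreadsheet").worksheet("Sheet1")
rows = with_backoff(lambda: ws.get_all_records())
```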
Additional context
I also understand that adding this would be a big maintenance overhead. And honestly, it may not even be the best approach; it may simply be best to recommend that users be prepared to handle caching on their own, and to make some suggestions about ways of doing that.
It sounds like you are looking for a service-worker-like (PWA) solution: a middleware that caches and controls requests over the network. Do you mean one that serves the cached copy until it looks stale, or one that makes a request every 10 minutes to stay fresh? And would it run on the local machine?
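For the second option, something along these lines, running locally? (a rough sketch only; the FreshWorksheet class and its names are invented for illustration, get_all_records is real gspread API):

```python
import threading

import gspread


# Rough sketch of the "request every 10 minutes, stay fresh" option:
# a background timer re-reads the worksheet on a fixed interval so callers
# never trigger a Sheets API request themselves.
class FreshWorksheet:
    def __init__(self, worksheet, interval=600):
        self._ws = worksheet
        self._interval = interval                 # refresh every 10 minutes by default
        self._lock = threading.Lock()
        self._records = self._ws.get_all_records()
        self._schedule()

    def _schedule(self):
        timer = threading.Timer(self._interval, self._refresh)
        timer.daemon = True
        timer.start()

    def _refresh(self):
        records = self._ws.get_all_records()      # the only place API calls happen
        with self._lock:
            self._records = records
        self._schedule()

    def records(self):
        with self._lock:
            return list(self._records)


gc = gspread.service_account()
fresh = FreshWorksheet(gc.open("my-spreadsheet").worksheet("Sheet1"))
rows = fresh.records()  # always served from memory
```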