
Slack integration - Hitting Slack rate limit when using APIs to create alerts gives wrong error message #80443

Open
Angelodaniel opened this issue Nov 8, 2024 · 10 comments
Labels
Product Area: Settings - Integrations Sync: Jira Apply to auto-create a Jira shadow ticket

Comments

@Angelodaniel
Member

Angelodaniel commented Nov 8, 2024

Environment

SaaS (https://sentry.io/)

Steps to Reproduce

  1. Create/edit multiple alerts through https://sentry.io/api/0/organizations//alert-rules//
  2. Make sure you set a Slack action step targeting a channel (see the hedged example below)
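
For reference, a minimal sketch of the kind of request that reproduces this. The token, organization slug, rule ID, integration ID, and the exact payload fields are illustrative assumptions, not taken from the affected account; the real payload should follow the alert-rules API schema.

```python
import requests

SENTRY_TOKEN = "***"        # assumed: an auth token with alert-rule write access
ORG_SLUG = "my-org"         # illustrative organization slug
ALERT_RULE_ID = "123456"    # illustrative alert rule ID

# Hypothetical metric alert rule payload with a Slack action step targeting a channel.
payload = {
    "name": "transactionsVolume",
    "aggregate": "count()",
    "timeWindow": 60,
    "triggers": [
        {
            "label": "critical",
            "alertThreshold": 1000,
            "actions": [
                {
                    "type": "slack",
                    "targetType": "specific",
                    "targetIdentifier": "#alerts",  # channel name; resolving it can hit the Slack API
                    "integrationId": 42,            # illustrative Slack integration ID
                }
            ],
        }
    ],
}

resp = requests.put(
    f"https://sentry.io/api/0/organizations/{ORG_SLUG}/alert-rules/{ALERT_RULE_ID}/",
    json=payload,
    headers={"Authorization": f"Bearer {SENTRY_TOKEN}"},
)
# When Slack rate limits the channel lookup, this returns 400 with
# {"nonFieldErrors": ["Could not retrieve Slack channel information."]}
print(resp.status_code, resp.text)
```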

Expected Result

When Slack is rate limiting the requests, it should be clear to Sentry users why the operation has been failing.

Actual Result

Error message is: (ERROR): failed to update transactionsVolume, 400 given. Received: {"nonFieldErrors":["Could not retrieve Slack channel information."]}

While internally we get "The Slack API responded with {'ok': False, 'error': 'ratelimited'}"

We likely will need to add a rate limit check here.
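
A minimal sketch of what such a check could look like, assuming the code that inspects the Slack reply has the parsed response body available (this is not Sentry's actual code; all names are hypothetical):

```python
class SlackRateLimitedError(Exception):
    """Raised when Slack reports 'ratelimited' during channel lookup."""


def check_slack_channel_response(slack_response: dict) -> None:
    # slack_response is the parsed body of the Slack API reply,
    # e.g. {'ok': False, 'error': 'ratelimited'}
    if slack_response.get("ok"):
        return
    if slack_response.get("error") == "ratelimited":
        # Surface the real cause so the 400 returned to the user mentions rate limiting
        # instead of the generic "Could not retrieve Slack channel information."
        raise SlackRateLimitedError(
            "Slack rate limit hit while looking up the channel; please retry later."
        )
    raise ValueError("Could not retrieve Slack channel information.")
```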

Product Area

Settings - Integrations

Link

No response

DSN

No response

Version

No response


@Angelodaniel Angelodaniel added Product Area: Settings - Integrations Sync: Jira Apply to auto-create a Jira shadow ticket labels Nov 8, 2024
@getsantry
Contributor

getsantry bot commented Nov 8, 2024

Routing to @getsentry/product-owners-settings-integrations for triage ⏲️

@getsantry
Contributor

getsantry bot commented Nov 8, 2024

Auto-routing to @getsentry/product-owners-settings-integrations for triage ⏲️

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues with 👀 3 Nov 8, 2024
@sentaur-athena
Member

Yeah, it's very reasonable. Thanks for the feedback. Added it to the team backlog.

@Angelodaniel
Member Author

Hey @sentaur-athena, one thing I wanted to know more about is the usage of the Channel ID.
As far as I know, using the Channel ID along with the Channel Name would prevent the rate limit from kicking in, but it kicked in anyway. cc @keradus
Did something change on the Slack side, perhaps?

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues with 👀 3 Nov 25, 2024
@sentaur-athena
Member

As far as I know, using the Channel ID along with the Channel Name would prevent the rate limit from kicking in

@Angelodaniel It may or may not help. You might be reaching the workspace-wide limit. From Slack's docs:

Rate limiting conditions are unique for methods with this tier. For example, [chat.postMessage](https://api.slack.com/methods/chat.postMessage) generally allows posting one message per second per channel, while also maintaining a workspace-wide limit. Consult the method's documentation to better understand its rate limiting conditions.

@keradus
Contributor

keradus commented Jan 4, 2025

Extra details: when creating multiple alerts that post to the same channel ID, the first alerts do not hit the rate limit, while the later ones do. Since all the alerts use the same Slack channel ID, maybe Sentry could cache the channel lookup internally for the Slack integration?

I'm facing this issue every week, and adding extra sleeps to avoid triggering the rate limit, then waiting 1-2 hours to update all alerts, is far from optimal.

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues with 👀 3 Jan 4, 2025
@sentaur-athena
Member

@keradus what do you mean by the last alert? Is it always the same alert that rate limits? Is it the last in order of time or creation?

@sentaur-athena
Member

@keradus the rate limiting for the creation API is 50+ per minute, so I don't think there's any way to hit it manually. We will add more logging to investigate this further.

@keradus
Contributor

keradus commented Jan 7, 2025

what do you mean by the last alert? Is it always the same alert that rate limits? Is it the last in order of time or creation?

Imagine that you have an automation to maintain 200 alerts for your 100 projects (real numbers for my Sentry contract), and each of those alerts is configured with an action that sends the alert message to the very same Slack channel: that's 200 API calls.
The initial API calls work flawlessly, but after some number of calls the API starts failing with Slack rate limiting.
Ideally, since it is the same Slack channel every time, Sentry wouldn't need to check its details again and again and hit Slack's limits.

In the meantime, thanks for the info about the current rate limit being 50 requests per minute. It will help us tune the pace of API requests in our automation (a simple throttling sketch follows below).
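
A small client-side throttling sketch for such an automation, assuming the 50 requests/minute figure mentioned above (the exact limit is an assumption and may change):

```python
import time

REQUESTS_PER_MINUTE = 50                  # figure mentioned above; treat as an assumption
MIN_INTERVAL = 60.0 / REQUESTS_PER_MINUTE


def update_all(rules, update_one):
    """Apply update_one(rule) to every rule, pacing calls to stay under the limit."""
    last = 0.0
    for rule in rules:
        wait = MIN_INTERVAL - (time.monotonic() - last)
        if wait > 0:
            time.sleep(wait)
        last = time.monotonic()
        update_one(rule)  # e.g. a function that PUTs the alert rule to the Sentry API
```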

@getsantry getsantry bot moved this to Waiting for: Product Owner in GitHub Issues with 👀 3 Jan 7, 2025
@sentaur-athena
Member

Got it. Yeah, it makes sense to add some caching and read the response instead of calling the API again (a rough sketch of that idea is below). I'll add this improvement to our backlog and post an update when I have a timeline.
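
A rough sketch of that caching idea, not Sentry's actual implementation: in production this would presumably use a shared cache rather than an in-process one, and the lookup helper below is a hypothetical placeholder for the existing Slack call.

```python
from functools import lru_cache


@lru_cache(maxsize=1024)
def resolve_slack_channel_id(integration_id: int, channel_name: str) -> str:
    """Resolve a channel name to a Slack channel ID, calling Slack only once per pair."""
    return _lookup_channel_via_slack(integration_id, channel_name)


def _lookup_channel_via_slack(integration_id: int, channel_name: str) -> str:
    # Hypothetical helper representing the Slack API lookup that gets rate limited today.
    raise NotImplementedError
```

With something like this, 200 alert rules that all point at the same channel would trigger a single Slack lookup instead of 200.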
