EIP-Bot ratelimited on large PR #77
Hi, I believe that the risk is imminent. Please see the notes below, quoted from GitHub's rate limit documentation ("Server-to-server rate limits for GitHub Enterprise Cloud"): "If you hit a rate limit, it's expected that you back off from making requests and try again later when you're permitted to do so. Failure to do so may result in the banning of your app. You can always check your rate limit status at any time. Checking your rate limit incurs no cost against your rate limit."
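To make the "back off and try again later" guidance concrete, here is a minimal sketch (not actual EIP-Bot code; the function name is illustrative) of deciding a wait time from GitHub's documented `x-ratelimit-remaining` and `x-ratelimit-reset` response headers:

```python
# Hypothetical helper, not part of the EIP-Bot: compute how long to wait
# before the next API request, based on GitHub's rate-limit headers.
def backoff_seconds(headers: dict, now: float) -> float:
    """Return seconds to sleep before retrying.

    GitHub reports the remaining quota in `x-ratelimit-remaining` and the
    UTC epoch second when the window resets in `x-ratelimit-reset`.
    """
    remaining = int(headers.get("x-ratelimit-remaining", "1"))
    if remaining > 0:
        return 0.0  # quota left; no need to wait
    reset = float(headers.get("x-ratelimit-reset", now))
    return max(0.0, reset - now)

# Example: quota exhausted, window resets 30 seconds from "now".
print(backoff_seconds({"x-ratelimit-remaining": "0",
                       "x-ratelimit-reset": "1030"}, now=1000.0))  # 30.0
```

Checking the headers (or `GET /rate_limit`) costs nothing against the quota, so the bot can poll them freely.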
The chances that someone will abuse this is negligible, however.
I'll change it to high priority, as, on reflection, it does pose a risk. The EIP-Bot is not necessary for the functioning of the EIP repo, however (as there are repo admins). |
Totally agreed. Any potential risk must be addressed as high priority. |
Which rate limit is being hit? Just CPU time spent by the bot? If the rate limit is doing its job successfully, I don't think we need to do anything additional. |
It's the GitHub API rate limit - a lot scarier. The bot could get temporarily blacklisted from accessing PR files.
And the App could be halted or banned. Not sure if this would be temporary or permanent. |
I'm not terribly worried about it then. I suspect GitHub will associate repeated over-running bots with the account that triggered them, rather than the bot itself (though, perhaps a bit of both depending on context). I also suspect we'll get warnings long before any sort of permanent action is taken. We should still address it, no point in wasting resources if we know it is going to fail. I just don't think it is critical priority. |
Uh, let me rephrase this: A malicious actor could submit a PR with 500 files, and submit trivial commits over and over. This would effectively halt the EIP process, because the only way PRs could be merged is manually with admin rights. |
I believe GitHub automatically cancels an existing CI run if a new run for the same PR starts, so (if true) the attacker would need to commit just before the run finishes each time to lock things up. Also, I think GitHub parallelizes up to a point, so we could still get other PRs through. Most importantly though, someone could do this attack today to any repository with CI. I believe GitHub has things in place to protect against it, and I think rate limiting is part of that. |
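For reference, a workflow opts into that cancellation behavior with a `concurrency` block; a sketch of the relevant GitHub Actions configuration (group key is illustrative):

```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true
```

With `cancel-in-progress: true`, a new push to the same PR cancels the run already in flight rather than queueing a duplicate, which also caps how much API quota a rapid-fire commit attack can burn.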
So do we agree that the risk, if any, is mitigated/controlled by GitHub's rate limiting?
I still think there is a small chance that this would temporarily get the bot banned (especially if GitHub uses Cloudflare). However, I doubt anyone will do it. Incidentally, if you're reading this and you're thinking "hey, I'd like to try to shut down the EIP repo for fun," please don't. Moving to medium priority. |
I will write a PR setting max_num_files_allowed_to_change = 100. If you decide to increase it to 1000 later, that will be easy to do, but I believe the ability to limit it must be in place.
It should be no more than 400. It was already getting rate-limited at that point. Personally, I would have it output a fail that changes to a pass when there are multiple editor approvals (similar to modifying EIP-1). This would still allow the auto merging of large PRs (such as mine). |
Ok, so it would fail if (max_num_files_allowed_to_change > 400 && editors < 2) |
100 files is fine. Really probably 20 is fine. Re: number of editors, I wish we had a good mechanism for variable number of editor approvals. Sometimes, getting 2 editors to review is hard. Sometimes we have many active editors and 2 is too few. For now though, 2 is better than nothing. |
Ok, so: fail if (max_num_files_allowed_to_change > 100 && editors < 2) |
It is straightforward to get the file count for the PR, but concerning the editors: (a) the bot could just ask for at least one editor to review; or (b) the reviewer count could be used to get editors_number (are reviewers assumed to be just editors? question). But reading the EIP PRs, I can see that there is either no reviewer or just one (editor) reviewing. Is it possible to just verify the total file number and fail if > 100?
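A minimal sketch of option (b) above, assuming the editor logins are known ahead of time (the names here are placeholders, not the real editor list). Review data would come from `GET /repos/{owner}/{repo}/pulls/{number}/reviews`; the reviews are passed in as `(login, state)` pairs so the logic stays self-contained:

```python
# Placeholder editor logins for illustration only.
EDITORS = {"alice", "bob"}

def count_editor_approvals(reviews, editors=EDITORS):
    """Count distinct editors with an APPROVED review.

    `reviews` is an iterable of (login, state) pairs, mirroring the
    `user.login` and `state` fields of the GitHub reviews API response.
    """
    approved = {login for login, state in reviews
                if state == "APPROVED" and login in editors}
    return len(approved)

# Only alice is an editor, so mallory's approval doesn't count:
reviews = [("alice", "APPROVED"), ("mallory", "APPROVED")]
print(count_editor_approvals(reviews))  # 1
```

The combined rule discussed above would then be: fail if the file count exceeds the threshold and `count_editor_approvals(...) < 2`.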
Lets start out by just having the bot short circuit (fail early) if there are more than 100 files. We can manually merge in those rare situations. We can do the "or 2 editors" bit later. |
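The short-circuit above can be sketched as follows (names are illustrative, not the actual EIP-Bot code). The changed-file count is available on the pull request object itself (the `changed_files` field), so this check costs one API request instead of one per file:

```python
# Assumed threshold from the discussion above.
MAX_FILES = 100

def should_short_circuit(changed_files: int, max_files: int = MAX_FILES) -> bool:
    """Return True if the bot should fail early instead of reviewing.

    `changed_files` mirrors the field of the same name on the GitHub
    pull request object (GET /repos/{owner}/{repo}/pulls/{number}).
    """
    return changed_files > max_files

# A 500-file PR (the attack scenario above) fails immediately:
print(should_short_circuit(500))  # True
print(should_short_circuit(42))   # False
```

Failing before any per-file requests are made means an oversized PR never touches the rate limit, regardless of how often the attacker pushes new commits.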
done (for approval) |
See: ethereum/EIPs#5055 (comment)
That PR changes over 400 files. When the EIP bot tried to review it, it exceeded its rate limit.
I'm putting it on medium priority as it might have DoS potential, but the PR will have to be manually merged anyway (good luck getting 399 authors to approve it!)