[Feature Request] Automatic Retries for Throttling by S3 During File Upload #409
Comments
Thank you! I noticed that earlier, which is why I'm considering adding support for retries. If you believe this would be beneficial, I can spend some time on this.
@ViswanathaReddyGajjala The simplest way would be retrying uploadFile n times with a circuit breaker. It'd be appreciated if you could create a PR.
I've tested it with
I have a follow-up question. For testing purposes, I've uploaded over 800 files. These are stored in the
Will these files be automatically deleted after a certain period, or is there a specific process for this? Additionally, would adding appropriate tags to the bucket name be beneficial in this scenario? It would make it easy to find the correct bucket.
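The "retry uploadFile n times" suggestion could be sketched as a small wrapper with exponential backoff and jitter. This is a minimal illustration, not the project's actual code: `upload_file` is a placeholder for whatever upload callable the project uses, the 503 check assumes the raised exception exposes a `status_code` attribute, and the circuit-breaker part is omitted for brevity.

```python
import random
import time


def upload_with_retry(upload_file, path, max_attempts=5, base_delay=0.5):
    """Retry a throttled upload with exponential backoff and full jitter.

    `upload_file` is a hypothetical callable that raises on failure;
    only 503 (throttling) errors are retried, everything else re-raises.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return upload_file(path)
        except Exception as exc:
            status = getattr(exc, "status_code", None)
            if status != 503 or attempt == max_attempts:
                raise
            # Sleep a random amount in [0, base_delay * 2^(attempt-1)]
            # so parallel uploads don't all retry at the same instant.
            time.sleep(random.uniform(0, base_delay * 2 ** (attempt - 1)))
```

A real implementation would likely cap the maximum delay and count consecutive failures for the circuit breaker, but the retry loop itself stays this simple.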
The latter one. These files are deleted on each event. source
What do you mean by "tags" exactly? The doc says that we can add a tag on the object, but not on the bucket. Another question: if added, how can we access the correct bucket easily?
Can I reopen this issue? I'm curious why you closed it.
Describe the solution you'd like
When multiple files are uploaded to create a new bot, the request often fails with:
Error Message: Request failed with status code 503
I want to implement automatic retries for these failed uploads, and I need to understand whether there are any considerations I should take into account before implementing this feature.
Why the solution needed
For every throttled request, we currently have to manually delete each file. This process can be quite frustrating for our customers. To improve the customer experience, we could implement automatic retries for these failed uploads.
Additional context
I attempted to upload more than 200 files and experienced over 140 throttled uploads. To avoid throttling, I had to upload the files in batches, which was a frustrating process.
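The manual batching described above can also be done in code. The sketch below is an illustrative workaround, not the project's implementation: `upload_file` is again a placeholder callable, and the batch size and pause duration are arbitrary values you would tune against the actual throttling limit.

```python
import time


def upload_in_batches(paths, upload_file, batch_size=20, pause=1.0):
    """Upload files in fixed-size batches, pausing between batches
    to stay under a (hypothetical) request-rate limit."""
    results = []
    for i in range(0, len(paths), batch_size):
        for path in paths[i:i + batch_size]:
            results.append(upload_file(path))
        # Pause only between batches, not after the last one.
        if i + batch_size < len(paths):
            time.sleep(pause)
    return results
```

Batching alone only reduces the chance of a 503; combining it with per-file retries is what removes the manual cleanup step.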
Implementation feasibility
Are you willing to discuss the solution with us, decide on the approach, and assist with the implementation?