wiki should handle multiple requests concurrently #16
Comments
Does this issue still exist? I think Nginx takes care of the scaling part, doesn't it?
I guess this can take care of it: https://www.mediawiki.org/wiki/Manual:$wgRunJobsAsync
Ping @amrav
Not sure why this isn't on the wiki repo instead. I don't think async jobs are the issue here. The way to fix this is to first reproduce the issue, either by running locally or, preferably, by adding an integration test; then we can measure the effect of possible fixes. An integration test for this is tricky to add, but it would look like: make two requests, have the first block indefinitely somehow (maybe via a dummy extension, or by requesting a page with a MediaWiki Commons image and faking the upstream so it stalls), and assert that the second request still succeeds. If we go down this route, we can discuss more on Slack. Doing it this way ensures we don't regress in the future.
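A minimal sketch of the integration test described above, using a local stub server in place of the real wiki (the `/slow` and `/fast` paths and the stub itself are assumptions for illustration). The stalled request runs in a background thread, and the test asserts the second request completes quickly while the first is still blocked:

```python
# Sketch of the proposed integration test. The stub server below stands in
# for the wiki; against a single-threaded server the /fast request would
# time out instead of returning quickly.
import threading
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

class StubWiki(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/slow":
            time.sleep(2)  # simulate the indefinitely stalled request
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

def second_request_completes_while_first_blocks(base):
    # Fire the slow request in the background...
    t = threading.Thread(target=lambda: urlopen(base + "/slow").read())
    t.start()
    time.sleep(0.2)  # give the slow request time to start
    # ...then time the fast request while the slow one is still blocked.
    start = time.monotonic()
    body = urlopen(base + "/fast", timeout=1).read()
    elapsed = time.monotonic() - start
    t.join()
    return body == b"ok" and elapsed < 1.0

server = ThreadingHTTPServer(("127.0.0.1", 0), StubWiki)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]
result = second_request_completes_while_first_blocks(base)
print(result)  # True: the stub server handles requests concurrently
server.shutdown()
```

In a real test the stub would be replaced by the deployed wiki (or a local instance with the stalling extension), but the two-request shape of the assertion stays the same.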
Problem: the wiki only handles one request at a time, which means a slow request can block other users, leading to a bad experience.
How to reproduce: Make an expensive request (like saving a large page), and try to load a different page simultaneously. The second request won't complete until the first one does.
I'm pretty sure this isn't the standard behaviour, and is possibly related to faulty configuration at our end.
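The reproduction steps above can be sketched with two concurrent `curl` calls; the wiki URL and page titles here are placeholders and would need adjusting to the real deployment:

```shell
# Hedged repro sketch: fire an expensive request, then time a cheap one.
WIKI="http://localhost:8080"   # assumption: wiki served locally

# Expensive request in the background (e.g. saving a large page).
curl -s -o /dev/null "$WIKI/index.php?title=Big_Page&action=submit" &

# Cheap request: if the wiki serializes requests, this wall time will
# roughly match the expensive request's instead of returning quickly.
time curl -s -o /dev/null "$WIKI/index.php?title=Main_Page"
wait
```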