Regression in 2.3.2: Long download closed after StreamTimeout even if data still moving #140
OK, upon further investigation, this is not a regression. The bug has been there from the start (the read deadline is not refreshed when there is activity), but it went unnoticed by luck: streamBufferedPipe called time.AfterFunc on every loop to schedule a wake-up only a very short time before the deadline actually passed, so StreamTimeout was malfunctioning and never really being applied. See Cloak/internal/multiplex/streamBufferedPipe.go, lines 64 to 92 at f1c6567.
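To make the mechanism concrete, here is a minimal, self-contained sketch (not the actual Cloak source) of the pattern described: Read blocks on a condition variable until data arrives or the read deadline passes, a time.AfterFunc timer wakes the waiter when the deadline is due, and, crucially, Write does not refresh the deadline. All names (bufferedPipe, errTimeout, etc.) are illustrative, not Cloak identifiers.

```go
package main

import (
	"bytes"
	"errors"
	"fmt"
	"io"
	"sync"
	"time"
)

var errTimeout = errors.New("read timeout")

// bufferedPipe mimics the pattern described above: Read waits on a
// condition variable until data arrives or the read deadline passes.
type bufferedPipe struct {
	mu        sync.Mutex
	rwCond    *sync.Cond
	buf       bytes.Buffer
	closed    bool
	rDeadline time.Time
}

func newBufferedPipe() *bufferedPipe {
	p := &bufferedPipe{}
	p.rwCond = sync.NewCond(&p.mu)
	return p
}

func (p *bufferedPipe) SetReadDeadline(t time.Time) {
	p.mu.Lock()
	p.rDeadline = t
	p.mu.Unlock()
	p.rwCond.Broadcast()
}

func (p *bufferedPipe) Read(target []byte) (int, error) {
	p.mu.Lock()
	defer p.mu.Unlock()
	for p.buf.Len() == 0 {
		if p.closed {
			return 0, io.EOF
		}
		if !p.rDeadline.IsZero() {
			remaining := time.Until(p.rDeadline)
			if remaining <= 0 {
				// Deadline passed with nothing buffered: time out.
				return 0, errTimeout
			}
			// Wake the cond.Wait below once the deadline is due.
			time.AfterFunc(remaining, p.rwCond.Broadcast)
		}
		p.rwCond.Wait()
	}
	n, err := p.buf.Read(target)
	p.rwCond.Broadcast()
	return n, err
}

func (p *bufferedPipe) Write(input []byte) (int, error) {
	p.mu.Lock()
	defer p.mu.Unlock()
	n, err := p.buf.Write(input)
	// The bug described above: arriving data does NOT push rDeadline
	// forward here, so a long transfer can still hit the original
	// deadline even though traffic is flowing.
	p.rwCond.Broadcast()
	return n, err
}

func main() {
	p := newBufferedPipe()
	p.SetReadDeadline(time.Now().Add(2 * time.Second))
	go p.Write([]byte("hello"))

	out := make([]byte, 16)
	n, err := p.Read(out)
	fmt.Println(n, err, string(out[:n]))
}
```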
In 2.3.2, StreamTimeout is now enforced correctly. However, a new behaviour is needed: any activity on the other side should reset the timeout. This handles the HTTP download case, where the browser sends a short request, then sends nothing, and the rest of the time is spent downloading. Actually, even in 2.3.2 there is still a bug: if a browser uploads data continuously for longer than StreamTimeout, the connection will also be forcefully closed, because the deadline is only reset when the buffer runs out rather than on every loop.
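A hedged sketch of the fix being asked for, building on the illustrative bufferedPipe above and assuming a new `timeout time.Duration` field holding the configured StreamTimeout (again, not the actual Cloak identifiers): push the deadline forward whenever data moves through the pipe, on the write path and on the read path, so both the long-download case and the continuous-upload case are covered.

```go
// touch pushes the read deadline forward; call it whenever data moves.
func (p *bufferedPipe) touch() {
	if p.timeout > 0 {
		p.rDeadline = time.Now().Add(p.timeout)
	}
}

// Replacement for the Write method in the sketch above.
func (p *bufferedPipe) Write(input []byte) (int, error) {
	p.mu.Lock()
	defer p.mu.Unlock()
	n, err := p.buf.Write(input)
	// Data arrived from the peer: reset the timeout so a long HTTP
	// download is not cut off while traffic is still flowing.
	p.touch()
	p.rwCond.Broadcast()
	return n, err
}

// Read should likewise call p.touch() after a successful p.buf.Read, so a
// continuous upload (the other bug mentioned above) also keeps the
// deadline moving instead of hitting the original one.
```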
Hello @cbeuw
I think we accidentally introduced a regression in the latest pull request. If you set StreamTimeout to a low value (e.g. 5 seconds) and start a long download that takes more than StreamTimeout seconds, the session is forcefully closed even though data is still being downloaded. I will investigate.
The session needs to be kept alive as long as there is any traffic passing through it. A sketch of that idiom follows below.
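For illustration only (not Cloak's code): the usual way to express "keep the session alive while traffic is passing" is to refresh the idle deadline after every successful read in the relay loop, so the timeout only fires when the link is genuinely idle. The function name `relay`, its parameters, and the buffer size are all made up for this sketch.

```go
package relayexample

import (
	"io"
	"net"
	"time"
)

// relay copies src to dst, refreshing src's read deadline before every
// read so the connection only times out when no data has moved for
// `timeout`. Illustrative only; not the Cloak implementation.
func relay(dst io.Writer, src net.Conn, timeout time.Duration) error {
	buf := make([]byte, 32*1024)
	for {
		// Each pass pushes the deadline forward, so an active download
		// or upload never trips it, however long it runs.
		if err := src.SetReadDeadline(time.Now().Add(timeout)); err != nil {
			return err
		}
		n, err := src.Read(buf)
		if n > 0 {
			if _, werr := dst.Write(buf[:n]); werr != nil {
				return werr
			}
		}
		if err != nil {
			return err // io.EOF when done, or a timeout when truly idle
		}
	}
}
```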