Avoid memory with outputAsAttachment + instant download #46
Hello.
1 - I'm using 3.3.0. Did you get the chance to take a look at the ZipStreamer lib?
First, thank you @Ne-Lexa for this amazing library!
I'm trying to use your library on my Nextcloud server (running as a Docker container on my Synology NAS), but when I trigger it, it takes too long to actually start the download, and I end up with a 502 Bad Gateway (timeout) from nginx, especially with big files (>4 GB). In my case it's not possible to raise that timeout because of the Cloudflare proxy (free account).
I want to use your lib mostly because of the password feature, which is not available in Nextcloud's PHP ZipStreamer, but I'm trying to make the basics work first.
For context, Nextcloud uses this lib and all the main calls come from here.
Somehow, the zip download starts instantly with ZipStreamer.
I managed to set everything up correctly using the `addFromStream` method, and this is what I have tried already (code grouped here). I also made sure that the stream for each file is not being closed (the opposite of ZipStreamer).
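For reference, a minimal sketch of that kind of setup with PhpZip (the path, entry name, and password below are placeholders for illustration, not the actual code from this report):

```php
// Hypothetical sketch of an addFromStream setup with a password.
$zipFile = new \PhpZip\ZipFile();

// Keep the stream open; the library reads from it when the archive is output.
$handle = fopen('/path/to/file.bin', 'rb');
$zipFile->addFromStream($handle, 'file.bin');

$zipFile->setPassword('secret'); // the password feature missing from ZipStreamer
$zipFile->outputAsAttachment('archive.zip', 'application/zip'); // second argument sets the MIME type
```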
I tried to change `php://temp` to `php://output` inside the `outputAsAttachment` method, but it was not enough; it seems the MIME type was missing.

Is it possible to somehow avoid the memory usage and download/stream the zip instantly in blocks, as I think ZipStreamer does?
Log:
Please let me know if it's clear enough or if you need more info. Thanks!