
Zip/unzip large dataset (7k/22k files) causing out of memory (with reasonable memory limit of 64MB) #83

Open
marek-hanzal opened this issue Oct 15, 2021 · 1 comment


Library version(s) affected: 3.3.3
PHP version(s): 7.2
OS (with bit depth): Alpine Docker

Description
When an archive contains a lot of entries (for example 22k+), the library always loads all of them into an in-memory array, which leads to out-of-memory errors (adding more RAM is not a solution). The same applies to zipping a large number of files (around 7k is enough): entries are collected into an internal array.

Maybe this is simply a limitation of the library, and zipping/unzipping a relatively small number of files is all it's intended for. My original use case is building deployment packages with ZIP (both zipping and unzipping) in a fairly locked-down environment, so a large number of files has to be processed under tight resource limits.

How to reproduce
Set the PHP memory limit to, for example, 32 MB and try to zip or unzip a folder with 10k files; the script will die on memory usage.
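
A minimal reproduction sketch, assuming the PhpZip\ZipFile API from nelexa/zip 3.x (addDirRecursive(), saveAsFile(), openFile(), and extractTo() are existing methods in that version; the paths and file counts are illustrative):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

ini_set('memory_limit', '32M');

// Zipping: every file in the tree becomes an entry object in an internal
// array, so memory grows with the number of files, not their total size.
$zipFile = new \PhpZip\ZipFile();
$zipFile->addDirRecursive('/path/to/dir-with-10k-files');
$zipFile->saveAsFile('/tmp/big.zip'); // dies here with "Allowed memory size ..."
$zipFile->close();

// Unzipping: openFile() parses the whole central directory into memory
// first, so an archive with 22k+ entries fails the same way.
$zipFile = new \PhpZip\ZipFile();
$zipFile->openFile('/tmp/big.zip');
$zipFile->extractTo('/tmp/extracted');
$zipFile->close();
```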

Possible Solution
This could be hard to do, since the internal array of entries is central to the design. A solution could be zipping on the fly as each entry is added, and reading entries lazily via, for example, a stream/iterator. Functions like hasEntry() would perform poorly on a huge entry set, as they would have to scan all entries in the stream; see the sketch below.
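
A sketch of what iterator-based reading could look like, built on PHP's bundled ZipArchive purely to illustrate the idea; entryNames() and hasEntry() are hypothetical helpers, not part of this library:

```php
<?php
// Hypothetical iterator over entry names: entries are yielded one at a
// time instead of being collected into a PHP array up front.
function entryNames(string $zipPath): \Generator
{
    $zip = new \ZipArchive();
    if ($zip->open($zipPath) !== true) {
        throw new \RuntimeException("Cannot open $zipPath");
    }
    try {
        for ($i = 0; $i < $zip->numFiles; $i++) {
            // statIndex() returns metadata for a single entry on demand.
            $stat = $zip->statIndex($i);
            yield $stat['name'];
        }
    } finally {
        $zip->close();
    }
}

// hasEntry() as a linear scan over the iterator -- O(n) on a huge
// entry set, which is exactly the performance trade-off noted above.
function hasEntry(string $zipPath, string $name): bool
{
    foreach (entryNames($zipPath) as $entryName) {
        if ($entryName === $name) {
            return true;
        }
    }
    return false;
}
```

ZipArchive keeps the central directory in native libzip memory rather than as PHP objects, which is why it copes far better with large entry counts than a PHP-level entry array.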

Additional context
Nothing to say here.

fakhamatia commented Jan 26, 2022

I have this problem too.
I want to zip nested folders with thousands of small photos; together they add up to gigabytes, but I get "Allowed memory size of....." errors.
The server doesn't have enough RAM, so ini_set('memory_limit', '-1') is not an option.
Maybe the files could be zipped to disk step by step, not all at once? A sketch of that idea follows.
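
For what it's worth, a workaround sketch of that step-by-step idea using PHP's bundled ZipArchive: files are added in batches and the archive is closed after each batch, so finished entries live on disk instead of in PHP memory. zipInBatches(), the paths, and the batch size are illustrative:

```php
<?php
function zipInBatches(string $sourceDir, string $zipPath, int $batchSize = 500): void
{
    // Walk the tree lazily; assumes $sourceDir has no trailing slash.
    $files = new \RecursiveIteratorIterator(
        new \RecursiveDirectoryIterator($sourceDir, \FilesystemIterator::SKIP_DOTS)
    );

    $zip = null;
    $inBatch = 0;
    foreach ($files as $file) {
        if (!$file->isFile()) {
            continue;
        }
        if ($zip === null) {
            $zip = new \ZipArchive();
            // CREATE makes the archive on the first batch, reopens it after.
            if ($zip->open($zipPath, \ZipArchive::CREATE) !== true) {
                throw new \RuntimeException("Cannot open $zipPath");
            }
        }
        $localName = substr($file->getPathname(), strlen($sourceDir) + 1);
        $zip->addFile($file->getPathname(), $localName);

        if (++$inBatch >= $batchSize) {
            $zip->close(); // flush this batch to disk, freeing memory
            $zip = null;
            $inBatch = 0;
        }
    }
    if ($zip !== null) {
        $zip->close();
    }
}
```

Each close()/reopen pass rewrites the central directory, so this trades extra I/O for bounded memory.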
