
Large downloads can take up all memory #668

Open
cosimoc opened this issue Mar 28, 2017 · 3 comments
@cosimoc (Contributor) commented Mar 28, 2017

When downloading an asset through the "extra data" mechanism, its contents are downloaded entirely into memory (see flatpak_load_http_uri() in common/flatpak-utils.c). This is problematic for large downloads (Unity3D is > 2 GB).

It is also problematic for file sources, since there may be two full copies of the contents in memory at once, one of them base64-encoded. See:

if (is_local)
  {
    g_autofree char *data = NULL;
    g_autofree char *base64 = NULL;
    gsize len;

    if (!g_file_load_contents (src, NULL, &data, &len, NULL, error))
      return FALSE;

    base64 = g_base64_encode ((const guchar *) data, len);
    g_free (self->url);
    self->url = g_strdup_printf ("data:text/plain;charset=utf8;base64,%s", base64);

    if (self->dest_filename == NULL || *self->dest_filename == 0)
      {
        g_free (self->dest_filename);
        self->dest_filename = g_file_get_basename (src);
      }
  }

@alexlarsson (Member)

I've recently added flatpak_download_http_uri() which is used in some places to download directly to disk. Moving extra data to use that seems like a good idea.

As for the file sources, maybe we should have a size limit for what files we include in the resolved manifest, because that file is going to be in the final image, so it shouldn't be too large.

@alexlarsson (Member)

Hmm, we actually need the data in memory because we construct a GVariant with it, so that the extra data becomes part of the commit in the local flatpak ostree repo for use at deploy time. That will be hard to change.

@tinywrkb
What's the status of this?

Should we avoid using large extra-data sources? If so, what should the upper size limit be?
It is possible that we download a large extra-data source but the resulting ostree commit is small. Do we only care about committed data?
With large proprietary applications (over 10 GB is not uncommon), should we prefer packaging installation wrappers that install into XDG_DATA_HOME after Flatpak deployment, instead of running the application's installer during apply_extra and installing into /app/extra?

It would be nice to have extra-data guidelines documented somewhere.
