Possible to load model from memory? #44

Closed
lotsofone opened this issue Jan 28, 2021 · 3 comments

@lotsofone

The tfgo.LoadModel() method requires the path to the model file on disk.
Suppose I have a []byte that contains a model file's content, just downloaded from the internet. Is it possible to load the model directly from this []byte?
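
For context, a minimal sketch of the disk-based loading this question refers to (the import alias and the `serve` tag follow the tfgo README; the model directory is just a placeholder):

```go
package main

import (
	tg "github.com/galeone/tfgo"
)

func main() {
	// LoadModel wants a directory on disk containing the SavedModel
	// (saved_model.pb plus the variables/ folder) and its export tags.
	model := tg.LoadModel("path/to/saved_model_dir", []string{"serve"}, nil)
	_ = model // run inference with model.Op(...) / model.Exec(...) as usual
}
```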

@galeone
Owner

galeone commented Jan 28, 2021

LoadModel accepts a path because the SavedModel serialization serializes and stores the model as a folder.

Support for frozen models (i.e. all-in-one models, with variables + structure converted to a single optimized .pb) has been deprecated in TensorFlow 2.0 and is going to be removed soon - that's why I only support loading from disk.

But maybe I misunderstood your question: do you have a SavedModel loaded into a []byte? If yes, how did you load it?

If instead, as I thought, you have a .pb file read into a []byte, this is not supported by tfgo.

It was supported in the past, however: you can go back to a specific commit (0663583) and load your frozen model using that version of tfgo (you also need an old TensorFlow C runtime installed, like 1.x).
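
For reference, the raw TensorFlow Go bindings can still import a frozen GraphDef directly from a []byte via (*tf.Graph).Import, which is roughly what that old code path relied on. This only works for frozen graphs, not for the SavedModel format. A minimal sketch, assuming a TF 1.x-style frozen .pb and the upstream Go bindings (the file name is a placeholder):

```go
package main

import (
	"log"
	"os"

	tf "github.com/tensorflow/tensorflow/tensorflow/go"
)

func main() {
	// The serialized GraphDef can come from anywhere: disk, network, embed, ...
	def, err := os.ReadFile("frozen_model.pb")
	if err != nil {
		log.Fatal(err)
	}

	graph := tf.NewGraph()
	// Import takes the GraphDef as a []byte; no file path is needed here.
	if err := graph.Import(def, ""); err != nil {
		log.Fatal(err)
	}

	sess, err := tf.NewSession(graph, nil)
	if err != nil {
		log.Fatal(err)
	}
	defer sess.Close()
	// sess.Run(...) with the graph's input/output operations as usual.
}
```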

galeone closed this as completed Jan 29, 2021
@jeremyevith

Loading models from a []byte has a few advantages:

  1. Being able to ship a single binary with Go's new embed feature (see the sketch below)
  2. Being able to apply some kind of encryption/decryption of models in memory

Unfortunately, it looks like the current solution of going back to older TensorFlow versions isn't really viable.
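
For point 1, a rough sketch of the embed-based workaround, assuming Go 1.16+, the tfgo import path from the README, and a SavedModel directory named saved_model next to the source file. Since the TensorFlow C API only loads SavedModels from disk, the embedded files are written to a temporary directory first:

```go
package main

import (
	"embed"
	"io/fs"
	"log"
	"os"
	"path/filepath"

	tg "github.com/galeone/tfgo"
)

// saved_model/ must contain saved_model.pb and the variables/ folder.
//go:embed saved_model
var modelFS embed.FS

func main() {
	// Materialize the embedded SavedModel into a temporary directory,
	// because the TensorFlow C API expects an on-disk location.
	tmpDir, err := os.MkdirTemp("", "savedmodel")
	if err != nil {
		log.Fatal(err)
	}
	defer os.RemoveAll(tmpDir)

	err = fs.WalkDir(modelFS, "saved_model", func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}
		target := filepath.Join(tmpDir, filepath.FromSlash(path))
		if d.IsDir() {
			return os.MkdirAll(target, 0o755)
		}
		data, err := modelFS.ReadFile(path)
		if err != nil {
			return err
		}
		return os.WriteFile(target, data, 0o644)
	})
	if err != nil {
		log.Fatal(err)
	}

	model := tg.LoadModel(filepath.Join(tmpDir, "saved_model"), []string{"serve"}, nil)
	_ = model // run inference as usual
}
```

Point 2 could follow the same pattern (decrypt in memory, then write to the temporary directory), though the decrypted model would then briefly exist on disk.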

@galeone
Owner

galeone commented May 16, 2021

Unfortunately, the SavedModel serialization format is a folder - maybe some kind of abstraction can be designed, in order to take a byte array and let tfgo interpret it as a path (since tfgo just invokes the standard TensorFlow C API for loading saved models, and this API wants a location on disk). But honestly, I don't know whether this is feasible or how complex it would become to design and implement.
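
One way such an abstraction could look, purely as a sketch under stated assumptions: treat the downloaded bytes as a tar archive of the SavedModel folder, unpack it to a temporary directory, and hand that path to the regular loader. The archive format, the package and helper names, and the archive layout are all assumptions, not tfgo API:

```go
package loader

import (
	"archive/tar"
	"bytes"
	"io"
	"os"
	"path/filepath"

	tg "github.com/galeone/tfgo"
)

// loadModelFromTar is a hypothetical helper: it unpacks a tar archive that
// contains saved_model.pb and variables/ at its root into a temporary
// directory, then loads it with the regular path-based tfgo API.
// The caller should os.RemoveAll the returned directory when done.
func loadModelFromTar(archive []byte, tags []string) (*tg.Model, string, error) {
	tmpDir, err := os.MkdirTemp("", "savedmodel")
	if err != nil {
		return nil, "", err
	}

	tr := tar.NewReader(bytes.NewReader(archive))
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			return nil, "", err
		}
		target := filepath.Join(tmpDir, filepath.FromSlash(hdr.Name))
		switch hdr.Typeflag {
		case tar.TypeDir:
			if err := os.MkdirAll(target, 0o755); err != nil {
				return nil, "", err
			}
		case tar.TypeReg:
			if err := os.MkdirAll(filepath.Dir(target), 0o755); err != nil {
				return nil, "", err
			}
			data, err := io.ReadAll(tr)
			if err != nil {
				return nil, "", err
			}
			if err := os.WriteFile(target, data, 0o644); err != nil {
				return nil, "", err
			}
		}
	}

	// tg.LoadModel handles load failures internally (it does not return an error).
	return tg.LoadModel(tmpDir, tags, nil), tmpDir, nil
}
```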
