
Merged all support functions into read_graph; returning atoms #4

Merged 1 commit into master on May 17, 2018

Conversation

anshuman23 (Owner)

@josevalim, as discussed in #3, I merged all graph reading capabilities into the read_graph function. It returns a reference to a TF_Graph on successfully loading a graph, and an {:error, "Unable to import graph"} tuple on an unsuccessful attempt.

Example is as follows (classify_image_graph_def.pb is from Google's Imagenet model):

iex(1)> graph = Tensorflex.read_graph("classify_image_graph_def.pb")
2018-05-17 23:36:16.488469: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2018-05-17 23:36:16.774442: W tensorflow/core/framework/op_def_util.cc:334] Op BatchNormWithGlobalNormalization is deprecated. It will cease to work in GraphDef version 9. Use tf.nn.batch_normalization().
Successfully imported graph
#Reference<0.1610607974.1988231169.250293>

iex(2)> graph2 = Tensorflex.read_graph("Makefile")                   
{:error, 'Unable to import graph'}

I have an idea for the "story" around the graph loading we had talked about, so I'll be merging this PR as already discussed and adding the details of that in the next PR for your review.

@anshuman23 anshuman23 merged commit ef4efe3 into master May 17, 2018
@josevalim (Contributor)

I love the return types. Great job!

Note, though, that you are returning a charlist (single-quoted) in the error message, while a string/binary (double-quoted) would be preferred:

iex(2)> graph2 = Tensorflex.read_graph("Makefile")                   
{:error, "Unable to import graph"}

if (TF_GetCode(status) != TF_OK) {
  fprintf(stderr, "ERROR: Unable to import graph %s", TF_Message(status));
  return enif_make_tuple2(env, enif_make_atom(env, "error"),
                          enif_make_string(env, "Unable to import graph", ERL_NIF_LATIN1));
}
@josevalim (Contributor) commented on May 18, 2018

Just to clarify, an Erlang string is not the same as an Elixir string. So we need to use the binary API here.

Also, note that we can return a better error, so let's do that. Here are all of the possible error values from Tensorflow. The idea is to convert each of them into an atom. So if we get TF_NOT_FOUND, we should return {:error, :not_found}. This will probably be very common, so we can encapsulate it in a C procedure.

@anshuman23 (Owner, Author)
Noted. I completely agree; I didn't realize I was returning charlists. I've now shifted to enif_make_binary() wherever required, and added this along with the TF error codes to the latest PR.
