
Issue when using Keras NetVLAD with functional api #1

Open

fede-vaccaro opened this issue Nov 11, 2019 · 2 comments

Comments

@fede-vaccaro commented Nov 11, 2019

Hi,
First of all, thanks for this repository.
I'm trying to stack your Keras NetVLAD implementation on top of a ResNet50, but I'm having some trouble doing it.

This is my code:

# imports added for completeness (NetVLAD comes from loupe_keras.py)
from keras.applications.resnet50 import ResNet50
from keras.layers import Input, Permute, Reshape
from loupe_keras import NetVLAD

input_shape = (224, 224, 3)

# instantiating the ResNet50
resnet = ResNet50(weights='imagenet', include_top=False, pooling=None,
                  input_shape=input_shape)
# deleting the default Input layer
resnet.layers.pop(0)
input_q = Input(shape=(224, 224, 3))
# stacking the ResNet on a new input (this seems redundant, but I need it to have multiple inputs)
resnet_q = resnet(input_q)
# permuting the tensor from the ResNet last-layer shape (None, 7, 7, 2048) to (None, 2048, 7, 7)
transpose = Permute((3, 1, 2), input_shape=(7, 7, 2048))(resnet_q)
# reshaping from (2048, 7, 7) to (2048, 7*7), i.e. the (max_samples, feature_size) form NetVLAD expects
reshape = Reshape((2048, 7*7))(transpose)
# stacking NetVLAD
netvlad = NetVLAD(feature_size=7*7, max_samples=2048, cluster_size=64, output_dim=1024)(reshape)
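As a quick sanity check of the Permute/Reshape step, here is a minimal numpy sketch of the same axis shuffle (not part of the model code, just confirming the shapes in the comments above):

import numpy as np

x = np.zeros((1, 7, 7, 2048))        # dummy ResNet50 output: (batch, 7, 7, 2048)
x = np.transpose(x, (0, 3, 1, 2))    # same as Permute((3, 1, 2)) -> (1, 2048, 7, 7)
x = x.reshape((1, 2048, 7 * 7))      # same as Reshape((2048, 7*7)) -> (1, 2048, 49)
print(x.shape)                       # (1, 2048, 49), i.e. (batch, max_samples, feature_size)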

But I'm getting this error:

 ---------------------------------------------------------------------------
 
 ValueError                                Traceback (most recent call last)
 
 <ipython-input-20-7cb139e75b78> in <module>
      19 transpose = Permute((3,1,2), input_shape=(7,7,2048))(resnet_p)
      20 reshape = Reshape((2048, 7*7))(transpose)
 ---> 21 netvlad = NetVLAD(feature_size=7*7, max_samples=2048, cluster_size=64, output_dim=1024)(reshape)
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py in symbolic_fn_wrapper(*args, **kwargs)
      73         if _SYMBOLIC_SCOPE.value:
      74             with get_graph().as_default():
 ---> 75                 return func(*args, **kwargs)
      76         else:
      77             return func(*args, **kwargs)
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/keras/engine/base_layer.py in __call__(self, inputs, **kwargs)
     461                                          'You can build it manually via: '
     462                                          '`layer.build(batch_input_shape)`')
 --> 463                 self.build(unpack_singleton(input_shapes))
     464                 self.built = True
     465 
 
 ~/Documenti/netvlad/loupe_keras.py in build(self, input_shape)
      81         self.cluster_weights = self.add_weight(name='kernel_W1',
      82                                       shape=(self.feature_size, self.cluster_size),
 ---> 83                                       initializer=tf.random_normal_initializer(stddev=1 / math.sqrt(self.feature_size)),
      84                                       trainable=True)
      85         self.cluster_biases = self.add_weight(name='kernel_B1',
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/keras/engine/base_layer.py in add_weight(self, name, shape, dtype, initializer, regularizer, trainable, constraint)
     280                             dtype=dtype,
     281                             name=name,
 --> 282                             constraint=constraint)
     283         if regularizer is not None:
     284             with K.name_scope('weight_regularizer'):
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py in variable(value, dtype, name, constraint)
     618     """
     619     v = tf_keras_backend.variable(
 --> 620         value, dtype=dtype, name=name, constraint=constraint)
     621     if hasattr(value, 'tocoo'):
     622         v._keras_shape = value.tocoo().shape
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/keras/backend.py in variable(value, dtype, name, constraint)
     809       dtype=dtypes_module.as_dtype(dtype),
     810       name=name,
 --> 811       constraint=constraint)
     812   if isinstance(value, np.ndarray):
     813     v._keras_shape = value.shape
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/variables.py in __call__(cls, *args, **kwargs)
     258       return cls._variable_v1_call(*args, **kwargs)
     259     elif cls is Variable:
 --> 260       return cls._variable_v2_call(*args, **kwargs)
     261     else:
     262       return super(VariableMetaclass, cls).__call__(*args, **kwargs)
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/variables.py in _variable_v2_call(cls, initial_value, trainable, validate_shape, caching_device, name, variable_def, dtype, import_scope, constraint, synchronization, aggregation, shape)
     252         synchronization=synchronization,
     253         aggregation=aggregation,
 --> 254         shape=shape)
     255 
     256   def __call__(cls, *args, **kwargs):
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/variables.py in <lambda>(**kws)
     233                         shape=None):
     234     """Call on Variable class. Useful to force the signature."""
 --> 235     previous_getter = lambda **kws: default_variable_creator_v2(None, **kws)
     236     for _, getter in ops.get_default_graph()._variable_creator_stack:  # pylint: disable=protected-access
     237       previous_getter = _make_getter(getter, previous_getter)
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/variable_scope.py in default_variable_creator_v2(next_creator, **kwargs)
    2554       synchronization=synchronization,
    2555       aggregation=aggregation,
 -> 2556       shape=shape)
    2557 
    2558 
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/variables.py in __call__(cls, *args, **kwargs)
     260       return cls._variable_v2_call(*args, **kwargs)
     261     else:
 --> 262       return super(VariableMetaclass, cls).__call__(*args, **kwargs)
     263 
     264 
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/resource_variable_ops.py in __init__(self, initial_value, trainable, collections, validate_shape, caching_device, name, dtype, variable_def, import_scope, constraint, distribute_strategy, synchronization, aggregation, shape)
    1404           aggregation=aggregation,
    1405           shape=shape,
 -> 1406           distribute_strategy=distribute_strategy)
    1407 
    1408   def _init_from_args(self,
 
 ~/Documenti/netvlad/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/resource_variable_ops.py in _init_from_args(self, initial_value, trainable, collections, caching_device, name, dtype, constraint, synchronization, aggregation, distribute_strategy, shape)
    1487     if isinstance(initial_value, ops.Tensor) and hasattr(
    1488         initial_value, "graph") and initial_value.graph.building_function:
 -> 1489       raise ValueError("Tensor-typed variable initializers must either be "
    1490                        "wrapped in an init_scope or callable "
    1491                        "(e.g., `tf.Variable(lambda : "
 
 ValueError: Tensor-typed variable initializers must either be wrapped in an init_scope or callable (e.g., `tf.Variable(lambda : tf.truncated_normal([10, 40]))`) when building functions. Please file a feature request if this restriction inconveniences you.

Do you have any idea what the issue could be here? I really can't figure it out; I looked at your code and tried some variations to spot the bug, but nothing changed.
Thank you!

@fede-vaccaro
Author

I think I've solved it:
the TensorFlow weight initializer doesn't work with the latest versions of Keras/TensorFlow (keras==2.3.1 / tensorflow-gpu==2.0.0).
I changed your initializer to:

initializer=keras.initializers.random_normal(stddev=1 / math.sqrt(self.feature_size)),
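For reference, a sketch of how the affected add_weight call in loupe_keras.py (lines 81–84 in the traceback above) might look after that change; the rest of build() is assumed unchanged, and it assumes an import keras at the top of the file (math is already used there):

self.cluster_weights = self.add_weight(name='kernel_W1',
                                       shape=(self.feature_size, self.cluster_size),
                                       # swapped-in Keras initializer; with this change the
                                       # add_weight call no longer raises the "Tensor-typed
                                       # variable initializers" ValueError under
                                       # keras==2.3.1 / tensorflow-gpu==2.0.0
                                       initializer=keras.initializers.random_normal(
                                           stddev=1 / math.sqrt(self.feature_size)),
                                       trainable=True)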

@shamangary
Owner

Hello, sorry for the inconvenience, but I'm in the middle of something right now.
