how to visualize memory transformation matrix W. #2

Open · jiangds518 opened this issue Sep 17, 2020 · 3 comments

jiangds518 commented Sep 17, 2020

Hello, I am training RethinkNet on an image dataset, and I want to know how to visualize the memory transformation matrix W.

yangarbiter (Owner) commented

Find the RNN unit in the Keras model (https://github.com/yangarbiter/multilabel-learn/blob/master/mlearn/models/rethinknet/rethinkNet.py#L126).

When the RNN unit is SimpleRNN, you can simply plot its recurrent kernel; that is the memory transformation matrix W (https://www.tensorflow.org/api_docs/python/tf/keras/layers/SimpleRNN).
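
For later readers, here is a minimal sketch (not part of the repo) of pulling W out of a trained model and plotting it. It assumes the model was built with tf.keras, the RNN unit is SimpleRNN with its default bias, and the layer sits directly in model.layers:

    import matplotlib.pyplot as plt
    from tensorflow.keras.layers import SimpleRNN

    def plot_memory_matrix(model):
        # Locate the SimpleRNN layer inside the model.
        rnn = next(l for l in model.layers if isinstance(l, SimpleRNN))
        # SimpleRNN stores [input kernel, recurrent kernel, bias]; the
        # recurrent kernel has shape (units, units) and is W.
        _, W, _ = rnn.get_weights()
        plt.imshow(W, cmap='viridis')
        plt.colorbar(label='weight')
        plt.title('Memory transformation matrix W')
        plt.show()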

jiangds518 (Author) commented

Thanks for your reply :) I am writing a paper and would like to cite yours. I have two other questions:
1) Why is the number of units in the RNN set to 128?
2) If my dataset has 6 labels and I want to visualize the memory transformation matrix W, should I set the number of units to 6?
Thanks again!!

yangarbiter (Owner) commented

Hi,

Previously we found that adding another layer between the RNN and the label output improves performance, so the number of units in the RNN is set to 128.
If you want to plot the memory transformation matrix at the size of the number of labels, you will need to tweak the architecture a bit.
Changing the number of units in the RNN from 128 to 6 and removing the dense layer right after it should work in your case:

    # Pass n_labels (here 6) as the unit count; the recurrent kernel
    # inside the RNN unit is then n_labels x n_labels.
    x = get_rnn_unit(rnn_unit, n_labels, x, activation='sigmoid', l2w=regularizer,
                     recurrent_dropout=0.25)
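
A hedged follow-up sketch of what the plot looks like after this tweak: with 6 units the recurrent kernel comes out as a 6×6 array, so each axis gets one tick per label. The label names below are placeholders, and the random array merely stands in for the trained recurrent kernel (rnn.get_weights()[1] from the sketch above):

    import matplotlib.pyplot as plt
    import numpy as np

    labels = ['label0', 'label1', 'label2', 'label3', 'label4', 'label5']  # placeholders
    W = np.random.randn(6, 6)  # stand-in for the trained 6x6 recurrent kernel
    plt.imshow(W, cmap='coolwarm')
    plt.xticks(range(6), labels, rotation=45)
    plt.yticks(range(6), labels)
    plt.colorbar(label='weight')
    plt.title('Memory transformation W (one row/column per label)')
    plt.tight_layout()
    plt.show()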

Let me know if you have any questions.
Thanks.
