
Problem with the code in seq2seq attention #2

Open
wmathor opened this issue Jul 1, 2020 · 0 comments

Comments


wmathor commented Jul 1, 2020

Could you take a look at the following line of code?

embedded = self.embedding(x).view(self.sentence_length, 1, -1) # seq_len * batch_size * word_size

Isn't the comment wrong? I think the resulting shape should be seq_len * 1 * word_size.
Also, with the tensor viewed this way, can the model be trained on samples with batch_size > 1?
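To make the shape question concrete, here is a minimal runnable sketch (not the repository's actual model; the seq_len, vocab_size, and word_size values are made up for illustration). It shows that the view above pins the batch dimension to 1, and that passing a (seq_len, batch_size) index tensor to nn.Embedding directly gives the batched shape, since nn.Embedding preserves all leading dimensions of its input:

```python
import torch
import torch.nn as nn

# Illustrative sizes only (assumptions, not from the repo)
seq_len, vocab_size, word_size = 5, 10, 8
embedding = nn.Embedding(vocab_size, word_size)

# Single sample: x has shape (seq_len,)
x = torch.randint(vocab_size, (seq_len,))
embedded = embedding(x).view(seq_len, 1, -1)
print(embedded.shape)  # torch.Size([5, 1, 8]) -> seq_len * 1 * word_size

# For batch_size > 1, embed a (seq_len, batch_size) tensor directly;
# nn.Embedding appends word_size as a trailing dimension:
batch_size = 4
x_batch = torch.randint(vocab_size, (seq_len, batch_size))
embedded_batch = embedding(x_batch)
print(embedded_batch.shape)  # torch.Size([5, 4, 8]) -> seq_len * batch_size * word_size
```

So with the hard-coded `.view(self.sentence_length, 1, -1)`, the second dimension is always 1, which is why the comment's `batch_size` label only holds in the batch_size == 1 case.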
