wordVectors in seq2seq #40

Open
guoxuxu opened this issue Nov 27, 2018 · 0 comments
guoxuxu commented Nov 27, 2018

Hi, I have a question about how wordVectors is used in seq2seq.py, since I need to adapt the code to a dataset of very long sentences (I saw that sentence length is capped at 15 in the createTrainingMatrices function in seq2seq.py, but my samples are almost short paragraphs ^_^). The createTrainingMatrices function builds training matrices of word indices for every sentence, but why not use the pre-trained embeddingMatrix.npy produced by word2vec.py? seq2seq.py contains the line `wordVectors = np.load('models/embeddingMatrix.npy')`, but wordVectors doesn't actually seem to be used anywhere after that?
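For reference, here is a minimal sketch of how the loaded matrix could be wired in. The shapes and variable values below are illustrative placeholders, not the repo's actual data; only the lookup pattern is the point:

```python
import numpy as np

# Hypothetical 5-word vocabulary with 4-dimensional pre-trained embeddings,
# standing in for the matrix seq2seq.py loads via
# np.load('models/embeddingMatrix.npy').
wordVectors = np.arange(20, dtype=np.float32).reshape(5, 4)

# createTrainingMatrices-style output: a sentence encoded as word indices
# (in seq2seq.py such rows are padded/truncated to a fixed max length).
sentence_indices = np.array([2, 0, 4])

# To actually use the pre-trained embeddings, look the indices up in the
# matrix: each word index selects its embedding row.
sentence_vectors = wordVectors[sentence_indices]
print(sentence_vectors.shape)  # (3, 4)
```

The same idea applies inside the model: instead of learning embeddings from scratch, the index matrices from createTrainingMatrices can be mapped through the pre-trained wordVectors (e.g. as the initial value of an embedding layer).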
