This repository has been archived by the owner on Dec 11, 2023. It is now read-only.

How to use BERT embedding? #486

Open
nashid opened this issue Feb 23, 2022 · 3 comments

Comments


nashid commented Feb 23, 2022

Is it possible to use BERT word embeddings along with this NMT implementation?

The goal is to use a pre-trained BERT language model so the contextualized embedding could be leveraged.

I am wondering whether anyone has implemented or been able to run this model with any other contextualized embedding such as ELMo or BERT.
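Not a maintainer answer, but one common pattern for this is to freeze a pre-trained BERT encoder, run the source sentence through it, and feed the resulting contextual vectors into the NMT encoder in place of the learned embedding lookup, via a small projection layer. Below is a minimal NumPy sketch of just the wiring: the BERT forward pass is mocked with random vectors (in practice you would take the last hidden states from a pre-trained model, e.g. via HuggingFace `transformers`), and `BERT_DIM`/`ENC_DIM` are assumed sizes, not values from this repository.

```python
import numpy as np

BERT_DIM = 768   # hidden size of bert-base (assumption)
ENC_DIM = 512    # input size the NMT encoder expects (assumption)

rng = np.random.default_rng(0)

def mock_bert_encode(tokens):
    """Stand-in for a frozen BERT forward pass.
    A real setup would call a pre-trained model and return
    its last hidden states, one 768-dim vector per token."""
    return rng.standard_normal((len(tokens), BERT_DIM))

# Learned projection mapping BERT's hidden size onto the encoder
# dimension -- this replaces the trainable embedding table.
W_proj = rng.standard_normal((BERT_DIM, ENC_DIM)) * 0.01

def contextual_embed(tokens):
    hidden = mock_bert_encode(tokens)   # (seq_len, BERT_DIM)
    return hidden @ W_proj              # (seq_len, ENC_DIM)

src = ["the", "cat", "sat"]
emb = contextual_embed(src)
print(emb.shape)  # (3, 512)
```

The rest of the NMT model consumes `emb` exactly as it would consume the output of its own embedding layer; whether BERT is kept frozen or fine-tuned end-to-end is a separate design choice.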


nashid commented Feb 23, 2022

@mommi84 any idea?

@namavar-marjane

That would be a very useful feature.


nashid commented Mar 23, 2022

This would be very useful, as contextual embeddings have become the norm nowadays. Has anyone implemented or been able to run this model with any contextualized embedding?
