This repository has been archived by the owner on Dec 11, 2023. It is now read-only.
This would be very useful, as contextual embeddings have become the norm nowadays. I am wondering whether anyone has implemented this, or has been able to run this model with any contextualized embeddings.
Is it possible to use BERT word embeddings with this NMT implementation?
The goal is to use a pre-trained BERT language model so that its contextualized embeddings can be leveraged.
I am wondering whether anyone has implemented this, or has been able to run this model with other contextualized embeddings such as ELMo or BERT.
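Not an answer from the maintainers, but a minimal sketch of what the swap involves. Unlike a static lookup table, a contextual embedder (BERT, ELMo) must see the whole sentence before producing a vector per token, so the NMT encoder's input interface changes from "embed each token id" to "embed this sequence". The class and function names below (`StaticEmbedder`, `ContextualEmbedder`, `encode`) are hypothetical toy stand-ins, not part of this repository or of BERT:

```python
# Hypothetical sketch: contrasting a static embedding lookup with a
# contextual one. The "contextual" model here is a crude toy, standing in
# for BERT/ELMo purely to show the interface difference.
from typing import Dict, List


class StaticEmbedder:
    """Classic lookup table: one fixed vector per token id."""

    def __init__(self, vocab: Dict[str, int], dim: int = 4):
        # Deterministic toy vectors derived from the token id.
        self.table = {tid: [float(tid + i) for i in range(dim)]
                      for tid in vocab.values()}

    def embed(self, token_ids: List[int]) -> List[List[float]]:
        return [self.table[t] for t in token_ids]


class ContextualEmbedder:
    """Toy stand-in for BERT/ELMo: each token's vector depends on the
    whole sentence, so the same id can map to different vectors."""

    def __init__(self, dim: int = 4):
        self.dim = dim

    def embed(self, token_ids: List[int]) -> List[List[float]]:
        ctx = sum(token_ids)  # crude whole-sentence signal, for illustration
        return [[float(t * ctx + i) for i in range(self.dim)]
                for t in token_ids]


def encode(embedder, token_ids: List[int]) -> List[List[float]]:
    """An NMT encoder only needs a sequence of vectors per sentence, so
    swapping embedders leaves the rest of the pipeline unchanged."""
    return embedder.embed(token_ids)


vocab = {"the": 1, "cat": 2, "sat": 3}
static = StaticEmbedder(vocab)
contextual = ContextualEmbedder()

# Same token id 2 ("cat") in two different sentences:
a = encode(contextual, [1, 2])  # "the cat"
b = encode(contextual, [2, 3])  # "cat sat"
print(a[1] != b[0])  # contextual: vectors for "cat" differ -> True
print(encode(static, [1, 2])[1] == encode(static, [2, 3])[0])  # static -> True
```

In practice the contextual vectors would come from a pre-trained model (e.g. BERT's last hidden states), either frozen as input features to the NMT encoder or fine-tuned jointly; the key point is that the lookup must run per sentence, not per token id.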