Neural Machine Translation with Attention (Dynet)
Updated Feb 26, 2017 - Python
Support material and source code for the model described in: "A Recurrent Encoder-Decoder Approach With Skip-Filtering Connections For Monaural Singing Voice Separation"
Learning the basics of modern AI
Source Code Generation Based On User Intention Using LSTM Networks
Sequence to sequence model for Language translation from English to French
Design and build a chatbot using data from the Cornell Movie Dialogues corpus, using Keras
Classic spy Encoder-Decoder console game made with Python
Text summariser based on RNN-LSTM and TensorFlow
Tamil to English translation using Neural Networks
LSTM autoencoder model for query-by-example spoken word detection
A Keras implementation of a word-level encoder-decoder architecture with teacher forcing
A sequential Encoder-Decoder implementation of Neural Machine Translation using Keras
Noise removal from images using a convolutional autoencoder
Implementation of a Neural Machine Translation system using an Encoder-Decoder architecture that translates input German phrases to English
Image Captioning
This is an implementation of the paper "Show and Tell: A Neural Image Caption Generator".
📺 An Encoder-Decoder Model for Sequence-to-Sequence learning: Video to Text
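The encoder-decoder pattern shared by most of the projects above can be sketched in a few lines of plain NumPy: an encoder folds a variable-length source sequence into a fixed-size context vector, and a teacher-forced decoder conditions each output step on that context and on the gold previous target token. Everything below (vocabulary sizes, hidden width, token ids) is an illustrative assumption, not taken from any listed repository.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative assumptions, not from any repo above)
SRC_VOCAB, TGT_VOCAB, HIDDEN = 10, 12, 8

# Randomly initialised parameters for a minimal RNN encoder and decoder
W_enc = rng.normal(scale=0.1, size=(SRC_VOCAB, HIDDEN))   # source embeddings
U_enc = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))      # encoder recurrence
W_dec = rng.normal(scale=0.1, size=(TGT_VOCAB, HIDDEN))   # target embeddings
U_dec = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))      # decoder recurrence
V_out = rng.normal(scale=0.1, size=(HIDDEN, TGT_VOCAB))   # output projection

def encode(src_ids):
    """Fold the whole source sequence into one fixed context vector."""
    h = np.zeros(HIDDEN)
    for tok in src_ids:
        h = np.tanh(W_enc[tok] + U_enc @ h)
    return h

def decode(context, tgt_ids):
    """Teacher forcing: each step sees the gold previous target token."""
    h = context
    logits = []
    for tok in tgt_ids:
        h = np.tanh(W_dec[tok] + U_dec @ h)
        logits.append(h @ V_out)   # unnormalised scores over target vocab
    return np.stack(logits)

src = [3, 1, 4, 1, 5]   # hypothetical source token ids
tgt = [0, 2, 7, 9]      # 0 plays the role of a start-of-sequence token
ctx = encode(src)
scores = decode(ctx, tgt)
print(ctx.shape, scores.shape)   # (8,) (4, 12)
```

Training such a model would add a softmax cross-entropy loss over `scores` and backpropagation; an attention variant, as in the Dynet project at the top of this list, would replace the single context vector with a weighted sum over all encoder states at every decoder step.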