sooftware / attentions (Python, 496 stars, updated Mar 4, 2022)
PyTorch implementations of several attention mechanisms for deep learning researchers.
Topics: pytorch, attention, multi-head-attention, location-sensitive-attension, dot-product-attention, location-aware-attention, additive-attention, relative-positional-encoding, relative-multi-head-attention
gazelle93 / Transformer-Various-Positional-Encoding (Python, 18 stars, updated Nov 14, 2022)
Implements Transformer encoder blocks using various positional encoding methods.
Topics: nlp, natural-language-processing, pytorch, spacy, transformer, nltk, gensim, wordembeddings, transformer-encoder, t5, relative-positional-encoding, relative-positional-representation
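One of the positional encoding methods such repositories typically implement is the fixed sinusoidal encoding from the original Transformer paper. A minimal PyTorch sketch (an illustration, not code from the repository above):

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Fixed sinusoidal encoding: PE[pos, 2i] = sin(pos / 10000^(2i/d)),
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    position = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)   # (seq_len, 1)
    div_term = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float32)
        * (-math.log(10000.0) / d_model)
    )                                                                    # (d_model/2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)   # odd dimensions
    return pe
```

The encoding is added to the token embeddings before the first encoder block; because it is fixed, it needs no training and extrapolates to any sequence length, which is the property relative-position methods (as in the t5 and relative-positional-representation topics) trade away for translation invariance.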
gazelle93 / Attention-Various-Positional-Encoding (Python, 3 stars, updated Jun 27, 2022)
Implements the scaled dot-product attention layer and the multi-head attention layer using various positional encoding methods.
Topics: nlp, natural-language-processing, pytorch, spacy, nltk, gensim, attention-mechanism, wordembeddings, multi-head-attention, t5, relative-positional-encoding, scaled-dot-product, relative-positional-representation
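The scaled dot-product attention that these repositories build on can be sketched in a few lines of PyTorch (a generic illustration of the standard formula, not code from any repository listed above):

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k, v: tensors of shape (..., seq_len, d_k); mask (optional):
    broadcastable to (..., seq_len, seq_len), 0 where attention is blocked.
    """
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)      # (..., seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                    # rows sum to 1
    return weights @ v, weights
```

Multi-head attention runs this in parallel over several learned projections of Q, K, and V and concatenates the results; relative-positional variants modify the `scores` term with position-dependent biases or embeddings rather than adding encodings to the inputs.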