
Attention Mechanism

A simple, pure-Python, from-scratch, zero-dependency implementation of the original attention mechanism from Vaswani et al.

The attention mechanism was proposed in the paper "Attention Is All You Need" by Vaswani et al. and has since become a key component of many state-of-the-art language models.

This is a minimal implementation meant for understanding the core concepts behind the attention mechanism.

The code is heavily commented to explain each section and matches the formulas from the paper.
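For orientation, here is a minimal sketch (not the repository's actual code; the function names are illustrative) of the core formula the README refers to, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, written in pure Python with no NumPy or PyTorch:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def matmul(a, b):
    # (n x m) @ (m x p) -> (n x p), using plain lists of lists.
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def scaled_dot_product_attention(Q, K, V):
    # scores = Q K^T / sqrt(d_k)
    d_k = len(K[0])
    scores = matmul(Q, transpose(K))
    scaled = [[s / math.sqrt(d_k) for s in row] for row in scores]
    # Row-wise softmax gives the attention weights, which then mix the values.
    weights = [softmax(row) for row in scaled]
    return matmul(weights, V)

# Hypothetical usage with tiny 2x2 matrices:
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(scaled_dot_product_attention(Q, K, V))
```

The actual implementation in this repository may structure things differently; see the source files for the commented version that follows the paper's notation.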

Let me know if any part needs more explanation or can be improved! Feedback is welcome.
