PicoGPT

You've seen openai/gpt-2.

You've seen karpathy/minGPT.

You've even seen karpathy/nanoGPT!

But have you seen picoGPT??!?

picoGPT is an unnecessarily tiny and minimal implementation of GPT-2 in plain NumPy. The entire forward pass is 40 lines of code.
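
To give a sense of what those 40 lines involve, here is a condensed sketch of a GPT-style forward pass in plain NumPy. This is not the actual gpt2.py: the parameter names (w_qkv, w_fc, wte, and so on) are made up for illustration, and biases are omitted for brevity.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, g, b, eps=1e-5):
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return g * (x - mean) / np.sqrt(var + eps) + b

def gelu(x):
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def causal_self_attention(x, w_qkv, w_out, n_head):
    seq_len = x.shape[0]
    q, k, v = np.split(x @ w_qkv, 3, axis=-1)          # each [seq_len, d_model]
    mask = (1 - np.tri(seq_len)) * -1e10               # block attention to future positions
    heads = []
    for qh, kh, vh in zip(np.split(q, n_head, axis=-1),
                          np.split(k, n_head, axis=-1),
                          np.split(v, n_head, axis=-1)):
        scores = qh @ kh.T / np.sqrt(qh.shape[-1]) + mask
        heads.append(softmax(scores) @ vh)
    return np.concatenate(heads, axis=-1) @ w_out

def transformer_block(x, p):
    # pre-norm multi-head attention with a residual connection
    x = x + causal_self_attention(layer_norm(x, p["ln1_g"], p["ln1_b"]),
                                  p["w_qkv"], p["w_out"], p["n_head"])
    # pre-norm position-wise feed-forward network with a residual connection
    h = layer_norm(x, p["ln2_g"], p["ln2_b"])
    return x + gelu(h @ p["w_fc"]) @ p["w_proj"]

def gpt_forward(token_ids, params):
    # token embeddings + learned position embeddings
    x = params["wte"][token_ids] + params["wpe"][np.arange(len(token_ids))]
    for block_params in params["blocks"]:
        x = transformer_block(x, block_params)
    x = layer_norm(x, params["lnf_g"], params["lnf_b"])
    return x @ params["wte"].T                          # [seq_len, vocab_size] logits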

So is picoGPT:

  • Fast? ❌ Nah, picoGPT is megaSLOW 🐌
  • Training code? ❌ Error, 4️⃣0️⃣4️⃣ not found
  • Batch inference? ❌ picoGPT is civilized, one at a time only, single file line
  • Support top-p? ❌ top-k? ❌ temperature? ❌ categorical sampling?! ❌ greedy? ✅ (see the sketch after this list)
  • Readable? 🤔 Well actually, it's not too horrible on the eyes!
  • Smol??? ✅✅✅✅✅✅ YESS!!! TEENIE TINY in fact 🤏
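
As advertised, the only sampling strategy is greedy decoding: at each step, take the argmax of the logits at the last position and append that token. A minimal sketch, assuming a forward pass like the gpt_forward sketch above (again, illustrative names rather than the actual gpt2.py):

def generate_greedy(token_ids, params, n_tokens_to_generate):
    for _ in range(n_tokens_to_generate):
        logits = gpt_forward(token_ids, params)   # [len(token_ids), vocab_size]
        next_id = int(np.argmax(logits[-1]))      # greedy: pick the single most likely next token
        token_ids = token_ids + [next_id]         # append it and feed the longer sequence back in
    return token_ids[-n_tokens_to_generate:]      # return only the newly generated ids

Because the whole sequence is re-run through the forward pass for every new token (no KV cache, no batching), generation is slow, which is part of why picoGPT is megaSLOW.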

Dependencies

pip install -r requirements.txt

If you're using an M1 MacBook, you'll need to replace tensorflow with tensorflow-macos.

Tested on Python 3.9.10.

Usage

python gpt2.py "Alan Turing theorized that computers would one day become"

Which generates

 the most powerful machines on the planet.

The computer is a machine that can perform complex calculations, and it can perform these calculations in a way that is very similar to the human brain.

You can also control the number of tokens to generate, the model size (one of ["124M", "355M", "774M", "1558M"]), and the directory to save the models:

python gpt2.py \
    "Alan Turing theorized that computers would one day become" \
    --n_tokens_to_generate 40 \
    --model_size "124M" \
    --models_dir "models"

gpt2_pico.py is the same as gpt2.py, but has even fewer lines of code (it removes comments and extra whitespace, and combines certain operations into a single line). Why? Because why not.
