
Commit

Merge pull request EleutherAI#698 from xerus/fix_link
Fix link to download 20B vocab
StellaAthena committed Oct 2, 2022
2 parents bab229d + b0a7095 commit 738b87e
Showing 1 changed file (README.md) with 1 addition and 1 deletion.
@@ -207,7 +207,7 @@ Next make sure to download the GPT2 tokenizer vocab, and merge files from the fo

Or use the 20B tokenizer (for which only a single Vocab file is needed):

-- Vocab: https://mystic.the-eye.eu/public/AI/models/GPT-NeoX-20B/slim_weights/20B_tokenizer.json
+- Vocab: https://the-eye.eu/public/AI/models/GPT-NeoX-20B/slim_weights/20B_tokenizer.json

(alternatively, you can provide any tokenizer file that can be loaded by Huggingface's tokenizers library with the `Tokenizer.from_pretrained()` command)

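For context, a minimal sketch (not part of this commit) of fetching the 20B vocab file from the corrected URL and loading it with Huggingface's tokenizers library, as the README excerpt above describes; the local filename `20B_tokenizer.json` is only illustrative:

```python
# Illustrative sketch only: download the 20B tokenizer vocab from the
# corrected URL and load it with Huggingface's tokenizers library.
import urllib.request

from tokenizers import Tokenizer

url = "https://the-eye.eu/public/AI/models/GPT-NeoX-20B/slim_weights/20B_tokenizer.json"
urllib.request.urlretrieve(url, "20B_tokenizer.json")  # local filename is arbitrary

# Tokenizer.from_file loads a tokenizer serialized as a single JSON file.
tokenizer = Tokenizer.from_file("20B_tokenizer.json")
print(tokenizer.encode("Hello GPT-NeoX").tokens)
```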
