
Note: This repo is for GPT-2/GPT-3 only (they share the same tokenizer). Here's some code that should work for gpt-3.5-turbo and gpt-4:

```js
import { AutoTokenizer } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers';
let tokenizer = await AutoTokenizer.from_pretrained("Xenova/gpt-4"); // gpt-3.5-turbo uses the same tokenizer as gpt-4 IIUC
let tokens = tokenizer.encode("hello world");
```
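
If you just need a token count for those models (e.g. to stay under a context limit), `tokens` above is a plain array of ids, so something like the following sketch should work (it assumes the `tokenizer` and `tokens` variables from the snippet above, and uses the transformers.js `decode` method for the round trip):

```js
console.log(tokens.length);            // number of tokens in "hello world"
console.log(tokenizer.decode(tokens)); // round-trips back to "hello world"
```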

# GPT-2/3 Tokenizer

GPT-2/3 byte pair encoder/decoder/tokenizer based on @latitudegames/GPT-3-Encoder that works in the browser and Deno.

See also: JS byte pair encoder for OpenAI's CLIP model.

```js
import { encode, decode } from "https://deno.land/x/gpt_2_3_tokenizer/mod.js";
let text = "hello world";
console.log(encode(text)); // [258, 18798, 995]
console.log(decode(encode(text))); // "hello world"
```
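
A common use for the encoder is checking how many tokens a prompt will consume before sending it to the API. Here's a minimal sketch of that (the `fitsInContext` helper and the 4096-token budget are just illustrative, not part of this library):

```js
import { encode } from "https://deno.land/x/gpt_2_3_tokenizer/mod.js";

// Returns true if the prompt fits within a given token budget.
function fitsInContext(prompt, maxTokens = 4096) {
  return encode(prompt).length <= maxTokens;
}

console.log(fitsInContext("hello world")); // true
```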

or:

```js
let mod = await import("https://deno.land/x/gpt_2_3_tokenizer/mod.js");
mod.encode("hello world"); // [258, 18798, 995]
```

or to include it as a global variable (as if you were importing it with the old script tag style):

```html
<script type="module">
  import tokenizer from "https://deno.land/x/gpt_2_3_tokenizer/mod.js";
  window.tokenizer = tokenizer;
</script>
```
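
With the tokenizer attached to `window`, a later module script (which runs after the one above, since module scripts execute in document order) can use it as a global. For example, assuming the default export exposes the same `encode`/`decode` functions as the named imports above:

```html
<script type="module">
  // Sketch: assumes the script tag above has already set window.tokenizer.
  console.log(window.tokenizer.encode("hello world"));
</script>
```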

## License

The original code is MIT-licensed, and so are any changes made in this repo.
