
How to find the token count of a prompt using meta/llama2-70b model #274

Open
pradeepdev-1995 opened this issue Mar 29, 2024 · 1 comment

Comments

@pradeepdev-1995

Does tiktoken support meta/llama2-70b?
I want to find the token count of a prompt before passing it to a meta/llama2-70b model.
How can I do this with tiktoken?

import tiktoken
enc = tiktoken.get_encoding("meta/llama2-70b")

Is this possible?

@jmehnle

jmehnle commented Jun 25, 2024

Meta's models are tokenized differently from OpenAI's. tiktoken doesn't support non-OpenAI models.
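As a rough sketch of an alternative (not using tiktoken): the Llama 2 tokenizer can be loaded through the Hugging Face transformers library, assuming transformers is installed and you have access to the gated meta-llama/Llama-2-70b-hf repository on the Hub:

# Sketch: count prompt tokens with the Hugging Face tokenizer instead of tiktoken.
# Assumes `pip install transformers` and access to the gated meta-llama repo.
from transformers import AutoTokenizer

# Loads Llama 2's SentencePiece-based tokenizer (shared across Llama 2 sizes).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-70b-hf")

prompt = "How many tokens does this prompt use?"

# encode() returns the token IDs; its length is the prompt's token count.
# By default this includes the BOS special token; pass add_special_tokens=False to exclude it.
token_ids = tokenizer.encode(prompt)
print(len(token_ids))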
