
model size info #1774

Open
MatPoliquin opened this issue Jul 19, 2023 · 2 comments

Comments

@MatPoliquin

Is there a way to get the number of parameters of each model for each test?

for example:

python run.py llama -d cuda -t eval

How can I tell which llama model variant it is (7B, 13B, 33B, or 65B parameters)?
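
For reference, one way to check this yourself is to load the benchmark model and count its parameters directly. A minimal sketch, assuming the usual torchbenchmark pattern of instantiating the model class with a test and device and calling get_module() (treat the exact constructor arguments as an assumption, they can differ per model):

# Minimal sketch, assuming torchbenchmark's Model(test=..., device=...)
# constructor and get_module() accessor.
from torchbenchmark.models.llama import Model

benchmark = Model(test="eval", device="cuda")
module, _example_inputs = benchmark.get_module()

# Plain PyTorch parameter count over the underlying nn.Module.
num_params = sum(p.numel() for p in module.parameters())
print(f"llama parameters: {num_params / 1e9:.2f}B")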

@xuzhao9
Contributor

xuzhao9 commented Jul 20, 2023

cc @msaroufim Do you think we can add this information to the metadata.yaml?
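
As a rough illustration of the idea (the num_params field and the path below are hypothetical, not something metadata.yaml carries today), a tool could surface the value like this:

# Hypothetical sketch: read a proposed "num_params" field from a model's
# metadata.yaml. Both the path and the field name are assumptions.
import yaml

with open("torchbenchmark/models/llama/metadata.yaml") as f:
    meta = yaml.safe_load(f)

print(meta.get("num_params", "parameter count not recorded"))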

@msaroufim
Member

Sorry, just saw this. I think for llama it might be best to rename the model.

For example, for llama 2 it's llama_v2_7b_16h, to be very specific that it's the version with 16 heads and 7B parameters.
