Load LLMs in FP16 for Faster Inference #10

Open · wants to merge 10 commits into master
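The PR title refers to loading model weights in half precision. As a rough sketch for context (not taken from this diff), FP16 loading with Hugging Face transformers typically looks like the following; the model id below is a placeholder, not something named in this PR:

    # Rough sketch of FP16 loading with Hugging Face transformers.
    # The model id is a placeholder, not taken from this PR.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "EleutherAI/pythia-6.9b"  # placeholder id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # load weights in half precision
    )
    model.eval()  # inference only; no gradients needed

Halving the weight precision roughly halves GPU memory use, which is what makes the larger per-model batch sizes in this PR feasible.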
Fix batch size issue
Kyle1668 committed Aug 18, 2023
commit 96053f753cdfb070d26e6ec1d9af54b708f0fd90
inference.py: 3 changes (1 addition & 2 deletions)
@@ -137,8 +137,7 @@ def get_batch_size(model_name: str) -> int:
         "6.9b": 64,
         "12b": 64,
     }
-    model_size = ".".join(model_name.split(".")[1:])
-    return size_batch_map[model_size]
+    return size_batch_map[model_name]
 
 
 def get_dataset(dataset_name: str, split_name: str, sample: int = None) -> pd.DataFrame:
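For context, here is a minimal sketch of get_batch_size as it reads after this commit. Only the last two size_batch_map entries are visible in the diff, so the rest of the map is elided. The removed lines extracted the size suffix from a dotted model name (e.g. turning "pythia.6.9b" into "6.9b"), so the direct lookup appears to assume callers now pass the size key itself; that reading is an inference from the diff, not stated in the PR:

    # Sketch of get_batch_size after this commit. Only the last two map
    # entries appear in the diff; earlier entries are elided here.
    def get_batch_size(model_name: str) -> int:
        size_batch_map = {
            # ... entries for smaller models, not shown in the diff ...
            "6.9b": 64,
            "12b": 64,
        }
        # Direct lookup: assumes model_name is the size key itself,
        # e.g. "12b", rather than a dotted name like "pythia.12b".
        return size_batch_map[model_name]

With this change, get_batch_size("12b") returns 64, while a dotted name such as "pythia.12b" would now raise KeyError; the deleted lines handled that case by splitting on "." first.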