This repository has been archived by the owner on Mar 21, 2024. It is now read-only.

Allow batch sizes > 1 for classification model inference. #453

Open
Shruthi42 opened this issue May 4, 2021 · 1 comment

Shruthi42 commented May 4, 2021

At the moment, we set the batch size to 1 in the dataloaders when running inference for a classification model.

AB#3998
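A minimal sketch of what lifting this restriction could look like, assuming a PyTorch-style setup (the `run_inference` helper and its names are hypothetical, not the repository's actual API): the dataloader takes a configurable `batch_size`, and the per-batch logits are concatenated so callers see the same output they would get with a batch size of 1.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def run_inference(model, dataset, batch_size=16):
    """Run classification inference with a configurable batch size.

    Hypothetical helper: illustrates the idea only, not the repo's API.
    """
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=False)
    model.eval()
    outputs = []
    with torch.no_grad():
        for (batch,) in loader:
            outputs.append(model(batch))
    # Concatenate per-batch results so callers see one tensor,
    # regardless of the batch size used internally.
    return torch.cat(outputs, dim=0)

# Toy example: a linear "classifier" over 4-dimensional inputs.
model = torch.nn.Linear(4, 2)
data = TensorDataset(torch.randn(10, 4))
preds = run_inference(model, data, batch_size=4)
```

Since inference is deterministic under `torch.no_grad()`, results with `batch_size=4` should match a `batch_size=1` run, which makes the change easy to verify with an equivalence test.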


hxri commented Aug 25, 2021


Hi, can I work on this?
