
How to run gpt-neox with two GTX 1080s? #964

Closed
therealjr opened this issue Jun 3, 2023 · 1 comment

@therealjr

I have followed the instructions exactly, but I run into an out-of-memory error every time, no matter what I change in the config file. Can anyone explain how to set the batch size to 1? Would that work?

Thanks
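
A minimal sketch of the change being asked about, assuming the per-GPU batch size is governed by the DeepSpeed-style `train_micro_batch_size_per_gpu` key (with `gradient_accumulation_steps` compensating for the smaller micro-batch) as in EleutherAI's example configs; the key names and file path below are assumptions, not something confirmed in this thread:

```python
# Sketch: lower the per-GPU batch size in a gpt-neox YAML config.
# The path and key names are assumptions based on the example configs.
import yaml

config_path = "configs/my_model.yml"  # hypothetical config file

with open(config_path) as f:
    cfg = yaml.safe_load(f)

cfg["train_micro_batch_size_per_gpu"] = 1  # smallest possible micro-batch
cfg["gradient_accumulation_steps"] = 8     # keep the effective batch size up

with open(config_path, "w") as f:
    yaml.safe_dump(cfg, f)
```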

@StellaAthena
Member

You should not use this library. This library is specifically designed for training very large models, far larger than what will fit on your GPUs. Given your computing resources, using another library like Hugging Face’s transformers or Meta’s MetaSeq is probably more appropriate.
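
A minimal sketch of the suggested route, assuming a small EleutherAI checkpoint run through Hugging Face transformers; the model name and generation settings are illustrative examples, not recommendations made in this comment:

```python
# Sketch: run a small EleutherAI model with Hugging Face transformers.
# "EleutherAI/gpt-neo-125M" is an assumed example checkpoint that fits
# comfortably on an 8 GB GTX 1080; it is not named in this thread.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neo-125M"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

inputs = tokenizer("GPT-NeoX is", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```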
