Issues: EleutherAI/gpt-neox
#1254 · How to Load Model from pytorch_model.bin into Trained Model for Text Generation? [feature request] · opened Jul 15, 2024 by lieh1203
#1253 · what's the biggest dataset you've tried? [bug] · opened Jul 15, 2024 by exnx
#1251 · Assertion Error when Setting pipe_parallel_size or model_parallel_size in GPT-NeoX [bug] · opened Jul 10, 2024 by lieh1203
#1250 · For nucleus sampling, top-p sampling appears to happen on the softmax-normalized top-k logits [bug] · opened Jul 3, 2024 by j-frei
#1248 · batch_input and elapsed time per iteration suddenly slow down during model training [bug] · opened Jun 29, 2024 by Yuhanleeee
#1235 · Add Transformer Engine's version of RMSNorm and LayerNorm [draft PR] · opened Jun 11, 2024 by lintangsutawika
#1230 · How to set the ffn hidden size parameter in gpt neox [feature request] · opened May 28, 2024 by IronMan-WangJinxi
#1203 · My servers used for multi-node training do not have ssh. How can I launch multi-node training using the torchrun command? [feature request] · opened Apr 23, 2024 by dingning97
#1194 · Added infinite lr schedules [PR, merge-queue: next to merge] · opened Mar 25, 2024 by kshitijkg
#1160 · PyTorch Lightning Fused optimizer step [feature request] · opened Feb 29, 2024 by jahatef
#1132 · Tests fail when run with pytest --forked [bug] · opened Jan 25, 2024 by segyges