
20B Release #533

Merged
merged 7 commits into main from release on Feb 10, 2022

Conversation

@sdtblck sdtblck (Contributor) commented Feb 9, 2022

don't merge yet, still waiting to fill in a few links :)

@sdtblck sdtblck requested a review from a team as a code owner February 9, 2022 23:50
@StellaAthena StellaAthena (Member) left a comment

Fuck yeah!

@sdtblck sdtblck merged commit c560814 into main Feb 10, 2022
@sdtblck sdtblck deleted the release branch February 10, 2022 03:38

**Diverse Modeling Options:** We provide a wide collection of options for constructing your model.
A 20 Billion Parameter autoregressive language model trained on the pile. For technical details about the model, see our paper [here](http:https://eaidata.bmk.sh/data/GPT_NeoX_20B.pdf).

A Contributor left a review comment on the lines above:

Suggested change
- A 20 Billion Parameter autoregressive language model trained on the pile. For technical details about the model, see our paper [here](http:https://eaidata.bmk.sh/data/GPT_NeoX_20B.pdf).
+ A 20 billion parameter autoregressive language model trained on the pile. For technical details about the model, see our paper [here](http:https://eaidata.bmk.sh/data/GPT_NeoX_20B.pdf).
