Wikipedia-RNN-and-GPT-2

In this repository, we generate Wikipedia-like text with two different models: GPT-2 and an LSTM. We train both models on Wikipedia datasets and then have them generate Wikipedia-like text. We can adjust the temperature to vary the writing style, from more conventional to more creative. Because of RAM limitations, we only fit around 10,000 to 20,000 articles into the models, which may limit their ability to generate human-like text. Training on more raw data should improve the results.

Inspired by: Train a GPT-2 Text-Generating Model w/ GPU and text_generation_wikipedia_rnn

Credits: https://zh.wikipedia.org/wiki/File:Wikipedia-logo_(inverse).png

Datasets

  • RNN: TF Hub — Text classification with TensorFlow Hub: Movie reviews
  • BERT: BERT Hub

About

Generate Wikipedia-like text by different NLP models.


Languages

  • Jupyter Notebook 100.0%