In this repository, we generate Wikipedia-like text with two different models: GPT-2 and an LSTM. Both models are trained on Wikipedia datasets and then generate Wikipedia-style text. We can adjust the sampling temperature to change the writing style: lower temperatures produce more conventional text, higher temperatures more creative text. Because of RAM limitations, we only fit around 10,000 to 20,000 articles into the models, which may limit their ability to generate human-like text. With more raw data, the results should improve.
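How temperature shapes the output can be shown with a small, self-contained sketch (illustrative only, not the repo's actual code): the model's raw scores (logits) are divided by the temperature before being turned into probabilities, so low temperatures concentrate probability on the top token and high temperatures spread it out.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits scaled by temperature.

    Lower temperature -> sharper distribution (more conventional text);
    higher temperature -> flatter distribution (more creative text).
    """
    rng = rng or np.random.default_rng(0)
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()                    # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return int(rng.choice(len(probs), p=probs))

# Toy logits for a 3-token vocabulary; at a very low temperature the
# highest-scoring token (index 0) is chosen almost deterministically.
print(sample_with_temperature([2.0, 1.0, 0.1], temperature=0.05))
```

At temperature 1.0 the same call samples from the unmodified softmax distribution, so lower-scoring tokens appear more often and the text reads as more varied.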
Inspired by: "Train a GPT-2 Text-Generating Model w/ GPU" and text_generation_wikipedia_rnn.
Credits: https://zh.wikipedia.org/wiki/File:Wikipedia-logo_(inverse).png
Other references: RNN (TensorFlow), "Text classification with TensorFlow Hub: Movie reviews", BERT, BERT (TensorFlow Hub).