Load a GPT-2 checkpoint and generate text in PyTorch
Updated Feb 26, 2019 - Python
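A minimal sketch of this workflow, assuming the Hugging Face transformers package rather than this repository's own checkpoint loader:

    # Minimal sketch: load a pretrained GPT-2 checkpoint and sample text in PyTorch.
    # Assumes the Hugging Face `transformers` package, not this repository's loader.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The meaning of life is"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    with torch.no_grad():
        output_ids = model.generate(
            input_ids,
            max_length=50,                        # total length, prompt included
            do_sample=True,                       # sample instead of greedy decoding
            top_k=40,                             # restrict sampling to the 40 most likely tokens
            temperature=0.9,
            pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
        )

    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))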
Keras/TensorFlow implementation of the decoder from the Transformer, as described in the paper "Attention Is All You Need"
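A minimal sketch of one such decoder block in Keras, assuming a recent TensorFlow 2.x release where MultiHeadAttention supports use_causal_mask; the encoder-decoder cross-attention sub-layer from the paper is omitted, and the layer sizes are illustrative defaults rather than this repository's configuration:

    # Illustrative single Transformer decoder block: masked (causal) self-attention
    # followed by a position-wise feed-forward network, each with residual + layer norm.
    import tensorflow as tf

    def decoder_block(d_model=512, num_heads=8, d_ff=2048, dropout=0.1):
        inputs = tf.keras.Input(shape=(None, d_model))

        # Causal self-attention: each position attends only to earlier positions.
        attn_out = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=d_model // num_heads
        )(inputs, inputs, use_causal_mask=True)
        attn_out = tf.keras.layers.Dropout(dropout)(attn_out)
        x = tf.keras.layers.LayerNormalization(epsilon=1e-6)(inputs + attn_out)

        # Position-wise feed-forward network.
        ff_out = tf.keras.layers.Dense(d_ff, activation="relu")(x)
        ff_out = tf.keras.layers.Dense(d_model)(ff_out)
        ff_out = tf.keras.layers.Dropout(dropout)(ff_out)
        outputs = tf.keras.layers.LayerNormalization(epsilon=1e-6)(x + ff_out)

        return tf.keras.Model(inputs, outputs)

    block = decoder_block()
    y = block(tf.random.normal((2, 10, 512)))  # (batch, sequence, d_model)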
Generate lyrics with GPT-2
Load a GPT-2 checkpoint and generate text
GPT-2-based content generation for marketing, novelty, and other uses.
Powering academia with AI by generating creative, topic-related concepts
Generated Simpsons scripts using a neural network. See some of my favorites at https://twitter.com/deepsimpsons
Simple text generator built on a PyTorch implementation of OpenAI GPT-2
Code and UI for running a Magic card text generator API via GPT-2
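A hedged illustration of how a generation endpoint like this might be wired up; the route name, parameters, and use of Flask plus the Hugging Face transformers package are assumptions for the sketch, not this repository's actual API:

    # Hypothetical minimal Flask API serving GPT-2 text generation.
    from flask import Flask, jsonify, request
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    app = Flask(__name__)
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    @app.route("/generate", methods=["POST"])
    def generate():
        # Read the prompt from the JSON request body (illustrative default prompt).
        prompt = request.get_json(force=True).get("prompt", "Once upon a time")
        input_ids = tokenizer.encode(prompt, return_tensors="pt")
        output_ids = model.generate(
            input_ids,
            max_length=100,
            do_sample=True,
            top_p=0.95,
            pad_token_id=tokenizer.eos_token_id,
        )
        text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
        return jsonify({"text": text})

    if __name__ == "__main__":
        app.run(port=5000)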
GPT-2-style architecture for training task-specific language generators. [Production ready]
Resources for solving text generation tasks with the GPT-2 language model, including papers, code, demos, and hands-on tutorials.
A collection of PyTorch implementations of GPT-2 for generating stories interactively or autonomously, developed during my internship with the Computational Creativity Group at University College Dublin
Generator of Matteo Salvini tweets
Probability distribution checker for poems