Project Details

Getting Started with aitextgen

Perform text-based AI training and generation using OpenAI's GPT-2 and EleutherAI's GPT Neo, a GPT-3-style architecture.

By
The Gradient Team

Description

aitextgen is a text-generation library created by Max Woolf that uses the GPT-2 and GPT Neo (GPT-3-style) architectures in combination with PyTorch, Hugging Face Transformers, and PyTorch Lightning.
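For orientation, loading a pretrained model and generating text takes only a few lines. The sketch below is a minimal generation example that assumes the library's default small GPT-2 model; the prompt and generation parameters are illustrative, and the notebooks in this project cover the full workflows.

```python
from aitextgen import aitextgen

# With no arguments, aitextgen loads the default small (124M) GPT-2 model.
ai = aitextgen()

# Generate a single sample from a prompt; the prompt and length are illustrative.
ai.generate(n=1, prompt="The meaning of life is", max_length=100)
```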

This ML Showcase entry is a fork of the notebooks in the aitextgen repo. The contents are as follows (a minimal sketch of the train-and-generate flow appears after the list):

  • Train a Custom GPT-2 Model + Tokenizer w/ GPU
  • Train a GPT-2 (or GPT Neo) Text-Generating Model w/ GPU
  • Generation Hello World
  • Training Hello World
  • Hacker News Demo
  • Reddit Demo
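
As a rough sketch of what the training notebooks walk through, fine-tuning GPT-2 on a plain-text file and sampling from the result looks approximately like this; the file name, step count, and prompt below are placeholders rather than values taken from the notebooks.

```python
from aitextgen import aitextgen

# Load the 124M GPT-2 weights and move the model to the GPU.
ai = aitextgen(tf_gpt2="124M", to_gpu=True)

# Fine-tune on a local plain-text file; "input.txt" and num_steps are placeholders.
ai.train("input.txt", line_by_line=False, num_steps=2000)

# Sample from the fine-tuned model.
ai.generate(n=3, prompt="Hello world", max_length=100)
```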