openai/gpt-2

gpt-2

Code for the paper "Language Models are Unsupervised Multitask Learners"

Builder

OpenAI

OpenAI

openai • ai-lab

Stars

24,732

Using upstream star count

Forks

5,869

Using upstream fork count

Open Issues

0

Activity Score

0/100

0 commits in 30d

Created

Feb 11, 2019

Project creation date

README Summary

GPT-2 is a large-scale unsupervised language model that generates coherent paragraphs of text and performs many language tasks without task-specific training. This repository contains the code implementation for the paper 'Language Models are Unsupervised Multitask Learners' by OpenAI. It demonstrates how language models can achieve strong performance on downstream NLP tasks through unsupervised pre-training on diverse text data.
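
The summary describes autoregressive text generation; as a quick illustration, the snippet below is a minimal sketch that samples from a pretrained GPT-2 checkpoint using the Hugging Face transformers port of the weights rather than this repository's own TensorFlow code. The model name, prompt, and sampling parameters are illustrative assumptions, not taken from the repo's scripts.

# Minimal sketch: autoregressive sampling from a pretrained GPT-2 checkpoint.
# Uses the Hugging Face `transformers` port of the weights, not the TensorFlow
# code in this repository; model name, prompt, and sampling settings are
# illustrative assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # smallest (124M) checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Language models are unsupervised multitask learners because"
inputs = tokenizer(prompt, return_tensors="pt")

# The model generates one token at a time, conditioning on all tokens so far.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_k=40,        # top-k truncation: sample only from the 40 most likely tokens
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))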

AI Dev Skills

Unmapped

Transformer Architecture · Unsupervised Learning · Language Model Pretraining · Autoregressive Text Generation · Large Scale Neural Networks · Natural Language Processing · Deep Learning Model Implementation

Tags

Transformer Architecture · Unsupervised Learning · Language Model Pretraining · Autoregressive Text Generation · Large Scale Neural Networks · Natural Language Processing · Deep Learning Model Implementation · Text · Transfer Learning · Text Completion · Self-hosted · Research on Language Models · Creative Writing Assistance · Language Modeling · Neural Language Models · Foundation Models · Large Language Models · Pre-trained Models · Generative AI · On-premise · Attention Mechanisms · Cloud API · Text Generation · Python

Taxonomy

Recent Activity

Updated 1 year ago

7 Days

0

30 Days

0

90 Days

0

Quality

Quality
high
Maturity
research

Categories

Learning Resources (Primary) · Search & Knowledge · Other AI / ML · Generative Media · Foundation Models · Model Training · NLP & Text

PM Skills

Product Discovery

Languages

Python 100.0%

Timeline

Project created
Feb 11, 2019
Forked
Mar 14, 2026
Your last push
1 year ago
Upstream last push
1 year ago
Tracked since
Aug 14, 2024
