openai/gpt-2
gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
Builder

OpenAI
openai • ai-lab
Stars
24,732
Using upstream star count
Forks
5,869
Using upstream fork count
Open Issues
0
Activity Score
0/100
0 commits in 30d
Created
Feb 11, 2019
README Summary
GPT-2 is a large-scale unsupervised language model that generates coherent paragraphs of text and performs many language tasks without task-specific training. This repository contains the code implementation for the paper 'Language Models are Unsupervised Multitask Learners' by OpenAI. It demonstrates how language models can achieve strong performance on downstream NLP tasks through unsupervised pre-training on diverse text data.
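A minimal sketch of the sampling behavior the summary describes. The repository's own scripts (src/generate_unconditional_samples.py, src/interactive_conditional_samples.py) target TensorFlow 1.x; purely for illustration, this sketch instead uses the Hugging Face transformers port of GPT-2, which is not part of this repository, and assumes transformers and PyTorch are installed.

```python
# Illustrative only: uses the Hugging Face `transformers` port of GPT-2,
# not this repository's TensorFlow 1.x code.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Language models are unsupervised multitask learners because"
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k sampling; the original repo's sampling scripts expose a similar
# top_k flag.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```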
AI Dev Skills
Unmapped
Tags
Taxonomy
Deployment Context
Modalities
Skill Areas
Recent Activity
Updated 1 year ago
7 Days
0
30 Days
0
90 Days
0
Quality
- Quality: high
- Maturity: research
Categories
PM Skills
Languages
Timeline
- Project created: Feb 11, 2019
- Forked: Mar 14, 2026
- Your last push: 1 year ago
- Upstream last push: 1 year ago
- Tracked since: Aug 14, 2024
Similar Repos
Computed via pgvector cosine similarity · $0
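The similar-repos panel is labeled as using pgvector cosine similarity. A minimal sketch of how such a nearest-neighbor lookup might work, assuming a hypothetical repos(name, embedding) table, the psycopg driver, and a Postgres instance with the pgvector extension; the dashboard's actual schema and connection details are not shown.

```python
# Hypothetical schema and connection string; illustrative only.
import psycopg

REPO = "openai/gpt-2"

with psycopg.connect("postgresql://localhost/repo_tracker") as conn:
    rows = conn.execute(
        """
        SELECT name,
               embedding <=> (SELECT embedding FROM repos WHERE name = %s)
                   AS cosine_distance  -- pgvector's <=> is cosine distance
        FROM repos
        WHERE name <> %s
        ORDER BY cosine_distance
        LIMIT 5
        """,
        (REPO, REPO),
    ).fetchall()

for name, distance in rows:
    # cosine similarity = 1 - cosine distance
    print(f"{name}: similarity = {1 - distance:.3f}")
```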