EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
Builder

EleutherAI
EleutherAI • ai-lab
Stars
7,408
Using upstream star count
Forks
1,101
Using upstream fork count
Open Issues
0
Activity Score
0/100
0 commits in 30d
Created
Dec 22, 2020
Project creation date
README Summary
GPT-NeoX is an implementation of model-parallel autoregressive transformers on GPUs, built on the Megatron and DeepSpeed libraries. It provides a framework for training large-scale language models with efficient distributed training, enabling researchers and developers to train GPT-style models with billions of parameters across multiple GPUs.
AI Dev Skills
Unmapped
Tags
Taxonomy
Deployment Context
Modalities
Skill Areas
Recent Activity
Updated 2 months ago
7 Days
0
30 Days
0
90 Days
0
Quality
- Quality
- high
- Maturity
- production
Categories
PM Skills
Languages
Timeline
- Project created
- Dec 22, 2020
- Forked
- Mar 22, 2026
- Your last push
- 2 months ago
- Upstream last push
- 2 months ago
- Tracked since
- Feb 3, 2026
Similar Repos
Ranked by pgvector cosine similarity
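The similar-repo list is ranked by pgvector cosine similarity over repository embeddings. A minimal sketch of the metric itself, in pure Python; the embedding vectors below are made-up illustrative values, not real repo embeddings:

```python
import math

def cosine_similarity(a, b):
    # cosine similarity = dot(a, b) / (|a| * |b|), in [-1, 1].
    # Note: pgvector's <=> operator returns cosine *distance*,
    # i.e. 1 - cosine similarity, so smaller means more similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for two repos (illustrative only)
gpt_neox = [0.9, 0.1, 0.4, 0.2]
other_repo = [0.8, 0.2, 0.5, 0.1]
print(round(cosine_similarity(gpt_neox, other_repo), 3))
```

In a real pgvector-backed query, the equivalent ranking would use `ORDER BY embedding <=> $1` so that the smallest cosine distance (highest similarity) comes first.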