Forked from Library/mtp-lm

CelestialCreator/mtp-lm

mtp-lm

Source code to accompany a research paper on training multi-token prediction language models using self-distillation.

Builder

CelestialCreator

CelestialCreator

CelestialCreator • individual

Stars

4

Using upstream star count

Forks

2

Using upstream fork count

Open Issues

0

Activity Score

0/100

0 commits in 30d

Created

Mar 5, 2026

Project creation date

README Summary

This repository contains the source code implementation for a research paper focused on training multi-token prediction language models using self-distillation techniques. The codebase provides the experimental framework and algorithms needed to reproduce the research findings on improving language model training through multi-token prediction capabilities.
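To make the summary concrete, here is a minimal sketch of a multi-token prediction training objective with a self-distillation term. All names (`mtp_self_distillation_loss`, the `alpha` mixing weight, the per-position logit lists) are hypothetical illustrations, not the repository's actual API; the paper's method may combine the terms differently.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def mtp_self_distillation_loss(student_logits, teacher_logits, targets, alpha=0.5):
    """Toy multi-token prediction loss with self-distillation.

    For each of the k future positions, mix:
      - cross-entropy against the ground-truth token, and
      - KL(teacher || student), distilling a frozen next-token
        teacher's rolled-out distribution into the student's
        multi-token heads.

    student_logits: list of k logit vectors, one per predicted position
    teacher_logits: list of k logit vectors from the teacher rollout
    targets:        list of k ground-truth token ids
    alpha:          weight on the distillation term (hypothetical knob)
    """
    total = 0.0
    for s_log, t_log, y in zip(student_logits, teacher_logits, targets):
        p_s = softmax(s_log)
        p_t = softmax(t_log)
        ce = -math.log(p_s[y])  # cross-entropy on the gold token
        kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
        total += (1 - alpha) * ce + alpha * kl
    return total / len(student_logits)
```

When student and teacher agree and the student is confident in the gold token, the loss approaches zero; a uniform student over two tokens pays roughly `0.5 * ln 2` per position at `alpha=0.5`.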

AI Dev Skills

Unmapped

Multi-Token Prediction · Self-Distillation · Language Model Training · Transformer Architecture · Knowledge Distillation · Large Language Model Development · Deep Learning Research · Neural Network Optimization

Tags

Multi-Token Prediction · Self-Distillation · Language Model Training · Transformer Architecture · Knowledge Distillation · Large Language Model Development · Deep Learning Research · Neural Network Optimization · Self-hosted · Training Optimization · Model Distillation · Autoregressive Modeling · Efficient Text Generation · Efficient Language Models · Text · Research in Language Model Architecture · Accelerated Language Model Inference · Neural Language Modeling · Cloud · Python

Taxonomy

Recent Activity

Updated 1 month ago

7 Days

0

30 Days

0

90 Days

0

Quality

research
Quality: medium
Maturity: research

Categories

Learning Resources (Primary) · Inference & Serving · Search & Knowledge · Other AI / ML · Foundation Models · Model Training · Edge & Mobile AI

PM Skills

Product Discovery

Languages

Python 100.0%

Timeline

Project created
Mar 5, 2026
Forked
Mar 12, 2026
Your last push
1 month ago
Upstream last push
1 month ago
Tracked since
Mar 5, 2026
