
mistralai/mistral-inference

mistral-inference

Official inference library for Mistral models

Builder

Mistral AI

Mistral AI

mistralai • ai-lab

Stars

10,753

Using upstream star count

Forks

1,033

Using upstream fork count

Open Issues

0

Activity Score

0/100

0 commits in 30d

Created

Sep 27, 2023

Project creation date

README Summary

This is the official inference library for Mistral AI models, providing optimized tools and utilities for running inference on Mistral's language models. The library offers efficient implementations for model loading, tokenization, and text generation capabilities. It serves as the primary interface for developers to integrate and deploy Mistral models in production environments.
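The text-generation capability described above boils down to an autoregressive decoding loop: score candidate next tokens, pick one, append it, repeat until an end-of-sequence token. As a hedged illustration of that loop (not the library's actual API), here is a toy greedy decoder where a fixed bigram table stands in for the transformer forward pass:

```python
# Toy illustration of greedy autoregressive decoding, the core loop an
# inference library implements. The "model" is a stand-in: it scores the
# next token from a fixed bigram table instead of a transformer.

BIGRAM_SCORES = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "up": 0.1},
    "down": {"<eos>": 1.0},
}

def next_token_scores(context):
    """Stand-in for a model forward pass: score candidates given the last token."""
    return BIGRAM_SCORES.get(context[-1], {"<eos>": 1.0})

def generate(prompt, max_tokens=8, eos="<eos>"):
    tokens = list(prompt)
    for _ in range(max_tokens):
        scores = next_token_scores(tokens)
        best = max(scores, key=scores.get)  # greedy: take the argmax token
        if best == eos:
            break
        tokens.append(best)
    return tokens

print(generate(["the"]))  # → ['the', 'cat', 'sat', 'down']
```

A real inference stack replaces the table lookup with a batched GPU forward pass and adds sampling, temperature, and KV-caching, but the control flow is the same.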

AI Dev Skills

Unmapped

Large Language Model Inference · Transformer Architecture · Model Optimization · Function Calling · Text Generation · Model Serving · GPU Acceleration · Distributed Inference

Tags

Large Language Model Inference · Transformer Architecture · Model Optimization · Function Calling · Text Generation · Model Serving · GPU Acceleration · Distributed Inference · Open Source LLMs · GPU Servers · Text · Text Generation Services · Private Model Hosting · Self-hosted AI · Self-hosted · Function Calling Applications · On-premise · Batch Text Processing · Local LLM Deployment · Cloud Infrastructure · Custom AI Assistant Development · On-device AI · Jupyter Notebook

Taxonomy

Recent Activity

Updated 1 month ago

7 Days

0

30 Days

0

90 Days

0

Quality
high
Maturity
production

Categories

Learning Resources (Primary) · Inference & Serving · ML Platform & Infrastructure · Data Science & Analytics · Edge & Mobile AI · Other AI / ML · Foundation Models · AI Agents

PM Skills

Product Discovery

Languages

Jupyter Notebook 100.0%

Timeline

Project created
Sep 27, 2023
Forked
Mar 14, 2026
Your last push
1 month ago
Upstream last push
1 months ago
Tracked since
Feb 26, 2026
