cocktailpeanut/dalai
The simplest way to run LLaMA on your local machine
Builder

cocktailpeanut • individual
Stars
12,965
Using upstream star count
Forks
1,349
Using upstream fork count
Open Issues
0
Activity Score
0/100
0 commits in 30d
Created
Mar 12, 2023
Project creation date
README Summary
Dalai provides the simplest way to run Meta's LLaMA (Large Language Model Meta AI) on your local machine. It offers an easy installation process and web interface for interacting with LLaMA models locally without requiring cloud services or APIs.
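The local setup the summary describes is driven from the command line via npx. A minimal sketch of the documented quick-start, assuming Node.js is installed; the exact model names and the default port may differ by dalai version:

```shell
# Download and set up the 7B LLaMA weights locally
# (other documented sizes include 13B, 30B, and 65B)
npx dalai llama install 7B

# Start the local web UI for interacting with the model
# (by default it serves on http://localhost:3000)
npx dalai serve
```

Everything runs on the local machine, so no cloud API keys are involved; the main practical constraint is disk space and RAM for the chosen model size.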
AI Dev Skills
Unmapped
Large Language Model Deployment · Local Model Inference · LLaMA Architecture · Model Quantization · Hardware Optimization
Tags
Large Language Model Deployment · Local Model Inference · LLaMA Architecture · Model Quantization · Hardware Optimization · Offline Text Generation · Local AI Experimentation · Local AI · On-premise · On-device AI · Developer Tools · Local Machine · Text · Research Prototyping · Education · Educational AI Learning · Open Source Language Models · Privacy-Preserving Language Processing · Democratized AI Access · Self-hosted · Research · CSS
Taxonomy
Deployment Context
Industries
Modalities
Skill Areas
Recent Activity
Updated 1 year ago
7 Days
0
30 Days
0
90 Days
0
Quality
- Quality: medium
- Maturity: prototype
Categories
Foundation Models (Primary) · Inference & Serving · Edge & Mobile AI · Search & Knowledge · Dev Tools & Automation · Learning Resources · Other AI / ML
PM Skills
Developer Platform
Languages
CSS 100.0%
Timeline
- Project created
- Mar 12, 2023
- Forked
- Mar 23, 2026
- Your last push
- 1 year ago
- Upstream last push
- 1 year ago
- Tracked since
- Jun 18, 2024