LM Studio is a powerful free desktop application for running open-source language models locally on your own hardware. It features a modern GUI, built-in model discovery from HuggingFace, GPU acceleration (NVIDIA, AMD, Intel, Apple), and an OpenAI-compatible local API server. LM Studio is free for both personal and commercial use and provides offline operation with full privacy.
License: Free for personal and commercial use (Proprietary)
Latest Version: 0.4.6 (February 27, 2026)
Website: lmstudio.ai
GitHub: lmstudio-ai
- Modern Desktop UI - Clean, intuitive interface with split-view support
- Model Discovery - Built-in HuggingFace integration with search
- GPU Acceleration - NVIDIA CUDA, AMD ROCm, Intel Arc, Apple Metal/MLX
- OpenAI-Compatible API - Drop-in replacement for OpenAI API calls
- Document Attachments - RAG with local files (PDF, TXT, MD)
- Speculative Decoding - Faster inference with draft models
- Developer Mode - Advanced options and per-model settings
- MCP Support - Model Context Protocol integration
- LM Link - Connect to remote LM Studio instances (E2E encrypted)
- CLI Tool (lms) - Terminal-based model management and chat
- Python SDK - lmstudio package for Python integration
- TypeScript SDK - @lmstudio/sdk for JavaScript/TypeScript
- llmster - Headless daemon for server/cloud/CI deployments
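Because the local server speaks the OpenAI wire format, any HTTP client can talk to it. A minimal sketch using only the Python standard library, assuming the server is running on LM Studio's default port 1234 and that a model is already loaded (the model name `my-local-model` here is a placeholder, not a real identifier):

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server address


def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


def chat(model: str, prompt: str) -> str:
    """POST the payload to the local server and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

With the server started (via the GUI or `lms server start`), calling `chat("my-local-model", "Say hello")` returns the model's reply; existing OpenAI client libraries work the same way once pointed at the local base URL.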
- Local LLM Development - Test and experiment with models offline
- Privacy-Focused AI - Fully local operation, no data leaves your machine
- API Backend - Local OpenAI-compatible server for applications
- Document Analysis - RAG with local PDFs and documents
- Model Benchmarking - Test different models and configurations
- Team Collaboration - Share models via LM Link or Hub organization
- Enterprise Deployment - Internal AI tools with SSO (Enterprise plan)
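When using LM Studio as an API backend, applications can discover which models are available through the standard OpenAI-style model-listing endpoint. A sketch, again assuming the default port 1234 and that the response mirrors OpenAI's `/v1/models` shape:

```python
import json
import urllib.request


def extract_model_ids(models_response: dict) -> list:
    """Pull model identifiers out of an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in models_response.get("data", [])]


def list_models(base_url: str = "http://localhost:1234/v1") -> list:
    """Fetch the IDs of models the local server currently exposes."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        return extract_model_ids(json.load(resp))
```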
| Component | Technology |
|---|---|
| Backend | C++, llama.cpp |
| Frontend | Electron (TypeScript) |
| Inference Engine | llama.cpp, MLX (Apple) |
| API | OpenAI-compatible REST API |
| CLI | Node.js (lms) |
| SDKs | Python, TypeScript (MIT licensed) |
| Component | Minimum | Recommended |
|---|---|---|
| OS | Windows 10, macOS 11, Linux | Latest OS version |
| RAM | 16 GB | 32-64 GB |
| GPU VRAM | 4 GB | 12-24+ GB |
| Disk | 20 GB | 100+ GB (NVMe SSD) |
| GPU Support | NVIDIA, AMD, Intel, Apple | Latest GPU drivers |
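The VRAM figures above follow directly from model size: a quantized model needs roughly parameters × bits ÷ 8 bytes for its weights, plus headroom for the KV cache and activations. A back-of-the-envelope helper (the 20% overhead factor is an illustrative assumption, not an official LM Studio figure):

```python
def estimate_vram_gb(params_billion: float, quant_bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight bytes (params * bits / 8) times a fudge
    factor covering KV cache and activations. Returns decimal gigabytes."""
    weight_bytes = params_billion * 1e9 * quant_bits / 8
    return weight_bytes * overhead / 1e9


# A 7B model at 4-bit quantization already slightly exceeds the 4 GB minimum;
# a 13B model at 8-bit lands squarely in the 12-24 GB recommended tier.
print(round(estimate_vram_gb(7, 4), 1))   # 4.2
print(round(estimate_vram_gb(13, 8), 1))  # 15.6
```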
- ✅ Free for personal and commercial use (since July 2025)
- ✅ Desktop apps for Windows, macOS, Linux
- ✅ GPU acceleration (CUDA, ROCm, Metal, MLX)
- ✅ OpenAI-compatible API server
- ✅ Python and TypeScript SDKs (MIT licensed)
- ✅ CLI tool (lms) included
- ✅ llmster headless daemon available
- ✅ LM Link for remote connections
- ⚠️ Proprietary application (not open-source)
- ⚠️ Enterprise features require paid plan