LM Studio

Run open-source models locally with a beautiful interface. Full privacy, zero cloud dependency.

Why LM Studio

Model Discovery

Browse and download models directly from Hugging Face with one click.

OpenAI-Compatible API

Local server mode exposes an OpenAI-compatible API endpoint.

GPU Acceleration

Automatic GPU detection and optimization for faster inference.

Cross-Platform

Available for Mac, Windows, and Linux with native performance.

Best For

  • Privacy-sensitive work — Everything runs locally, ideal for confidential data
  • Model exploration — Easy to try different GGUF models from Hugging Face
  • Development and testing — Local API server for app development without cloud costs
  • Offline environments — Works without internet after models are downloaded

Using LM Studio with Evvl

Install LM Studio from lmstudio.ai, then download and load the models you want to use.

Desktop App: Start LM Studio's local server, then point Evvl at its API endpoint. No API key is needed.
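Because the local server speaks the OpenAI chat-completions dialect, any HTTP client can talk to it. The sketch below builds such a request with only the Python standard library; the base URL assumes LM Studio's default port (1234), and the model name is a placeholder, since the server answers with whatever model is currently loaded.

```python
import json
from urllib import request

# Assumption: LM Studio's local server at its default address.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> request.Request:
    """Build an OpenAI-style chat-completion request for the local server.

    The model name is illustrative; LM Studio serves the model
    that is loaded in the app regardless of this field.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},  # note: no API key header
        method="POST",
    )

# To actually send the request, the LM Studio server must be running:
# with request.urlopen(build_chat_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

This is the same request shape Evvl (or any OpenAI-compatible client) sends; only the base URL differs from the cloud case.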

Web App: LM Studio isn't available in the web app due to CORS restrictions. Use the desktop app for local models.

Compare local vs cloud models

See how local models compare to GPT-4o, Claude, and Gemini on your specific tasks.