
Ollama

Run AI models locally or in the cloud. Ollama provides an easy-to-use runtime for open-source LLMs including Llama, Mistral, Phi, Gemma, Qwen, DeepSeek, and more.
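Once Ollama is running locally, models are served over a REST API on port 11434. A minimal sketch of calling the `/api/chat` endpoint from Python, assuming the server is up and a model (here `llama3`, as an example) has already been pulled with `ollama pull`:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local endpoint

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of a token stream
    }

def chat(model: str, prompt: str) -> str:
    """Send a chat request to a locally running Ollama server."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    print(chat("llama3", "Why is the sky blue?"))
```

Setting `"stream": False` returns a single JSON object; with streaming enabled (the default), the server sends one JSON line per token chunk instead.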


Plans & Pricing

Local (Free)

$0
  • Run models locally on your machine
  • No API costs
  • Supports 100+ open models
  • GPU acceleration

Ollama Cloud

Free (beta)
  • Cloud-hosted inference
  • REST API
  • No setup required
  • Rate-limited on the free tier

Free Tier

Ollama is free to download and run locally. Ollama Cloud has a free tier with rate limits.

Models (4)

Model                        Capabilities      Input /1M tokens   Output /1M tokens
Llama 4 Scout (Ollama)       Chat, Streaming   Free               Free
Mistral Small 3.1 (Ollama)   Chat, Streaming   Free               Free
Qwen3 32B (Ollama)           Chat, Streaming   Free               Free
DeepSeek V3 (Ollama)         Chat, Streaming   Free               Free