DeepSeek

Available Models

DeepSeek R1 Open-source reasoning model with performance comparable to OpenAI o1 on math, code, and reasoning benchmarks. 671B/37B MoE released under the MIT license. Exposes its chain of thought as transparent reasoning tokens, useful for explainable AI applications.
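
A minimal sketch of reading R1's reasoning tokens, assuming DeepSeek's OpenAI-compatible endpoint (`https://api.deepseek.com`), the `deepseek-reasoner` model ID, and the separate `reasoning_content` field described in DeepSeek's public API docs; verify these against the current API reference:

```python
# Sketch: reading R1's transparent reasoning tokens via DeepSeek's
# OpenAI-compatible API. Endpoint, model ID, and the reasoning_content
# field follow DeepSeek's public docs; check your account's model list.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",  # placeholder
    base_url="https://api.deepseek.com",
)

response = client.chat.completions.create(
    model="deepseek-reasoner",  # R1-backed reasoning model
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
)

message = response.choices[0].message
# The chain of thought arrives in a separate field from the final answer.
print("Reasoning:", message.reasoning_content)
print("Answer:", message.content)
```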

DeepSeek V3.1 Enhanced V3 with improved tool use and code generation. Hybrid model supporting both thinking and non-thinking modes. Two-phase long-context extension training for better coherence over long inputs, on the same efficient 671B/37B MoE architecture as V3.
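
A hedged sketch of selecting between the two modes, assuming the convention from DeepSeek's API docs that `deepseek-chat` serves the non-thinking mode and `deepseek-reasoner` the thinking mode; which model version backs each ID changes over time, so confirm against the current model list:

```python
# Sketch: toggling thinking vs. non-thinking mode by model ID.
# The deepseek-chat / deepseek-reasoner mapping is an assumption based
# on DeepSeek's API docs, not a guarantee of which version serves it.
from openai import OpenAI

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY", base_url="https://api.deepseek.com")

def ask(prompt: str, thinking: bool = False) -> str:
    model = "deepseek-reasoner" if thinking else "deepseek-chat"
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(ask("Summarize MoE routing in one sentence."))            # fast, direct
print(ask("Prove that sqrt(2) is irrational.", thinking=True))  # deliberate
```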

DeepSeek V3 671B-parameter MoE model with 37B parameters active per token, trained with FP8 mixed precision. Strong reasoning, coding, and Chinese-language capabilities at extremely low cost.
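
To make the 671B-total / 37B-active distinction concrete, here is a toy sketch of top-k expert routing in an MoE layer; the layer sizes and k below are illustrative and are not DeepSeek V3's actual configuration:

```python
# Toy MoE routing: many experts hold the total parameter count, but each
# token runs through only the top-k experts, so the active parameter
# count per token is a small fraction of the total. Sizes are made up.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 16, 2

router_w = rng.normal(size=(d_model, n_experts))             # routing weights
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    logits = x @ router_w                                    # score every expert
    top = np.argsort(logits)[-top_k:]                        # keep the top-k
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen
    # Only the selected experts execute; the rest stay idle for this token.
    return sum(g * (experts[i] @ x) for g, i in zip(gates, top))

y = moe_layer(rng.normal(size=d_model))
print(f"{top_k / n_experts:.0%} of expert parameters active per token")
```

Because only the routed experts run, serving cost scales with the active parameters rather than the full model size, which is why a 671B model can be priced like a much smaller dense one.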