The Business & Technology Network
Helping Business Interpret and Use Technology
DeepSeek sharpens its math AI with MoE-powered Prover upgrade

DATE POSTED: April 30, 2025

DeepSeek, a Chinese AI lab, has upgraded Prover, its AI model for solving mathematical proofs and theorems, releasing version V2 on the AI development platform Hugging Face on Wednesday.

The latest version appears to be built on top of DeepSeek’s V3 model, which boasts 671 billion parameters and utilizes a mixture-of-experts (MoE) architecture. This architecture enables the model to break down complex tasks into subtasks and delegate them to specialized “expert” components.
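The routing idea behind MoE can be illustrated with a toy example. The sketch below is not DeepSeek's implementation (which is not public in this article); it is a minimal numpy illustration in which a learned router scores a set of experts per token, only the top-k experts actually run, and their outputs are blended by the renormalized router weights. All names (`MoELayer`, `top_k`, the matrix shapes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoELayer:
    """Toy mixture-of-experts layer (illustrative, not DeepSeek's code):
    a router scores experts per token, only the top-k experts run, and
    their outputs are combined using the renormalized router weights."""

    def __init__(self, d_model, n_experts, top_k=2):
        self.top_k = top_k
        self.router = rng.normal(size=(d_model, n_experts))  # gating weights
        # Each "expert" here is just a single weight matrix standing in
        # for a feed-forward sub-network.
        self.experts = [rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
                        for _ in range(n_experts)]

    def __call__(self, x):                                   # x: (tokens, d_model)
        scores = softmax(x @ self.router)                    # (tokens, n_experts)
        top = np.argsort(scores, axis=-1)[:, -self.top_k:]   # chosen expert indices
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            w = scores[t, top[t]]
            w = w / w.sum()                                  # renormalize over chosen experts
            for weight, e in zip(w, top[t]):
                # Only the top-k experts do any work for this token.
                out[t] += weight * (x[t] @ self.experts[e])
        return out

moe = MoELayer(d_model=8, n_experts=4, top_k=2)
tokens = rng.normal(size=(3, 8))
y = moe(tokens)
print(y.shape)  # (3, 8)
```

The point of the design is that compute per token scales with `top_k`, not with the total number of experts, which is how a 671-billion-parameter model can activate only a fraction of its parameters on each forward pass.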

In the context of AI models, parameters are a rough measure of a model’s problem-solving capabilities. DeepSeek last updated Prover in August, describing it as a custom model for formal theorem proving and mathematical reasoning.

The upgrade comes as DeepSeek continues to expand its AI offerings. In February, Reuters reported that the company was considering raising outside funding for the first time. Recently, DeepSeek released an upgraded version of its general-purpose V3 model and is expected to update its R1 “reasoning” model soon.
