According to 1M AI News, Rakuten Group has released Rakuten AI 3.0, an open-source model it calls "Japan's largest high-performance AI model." The model uses a Mixture-of-Experts (MoE) architecture with 671 billion total parameters, of which 37 billion are activated per token, and supports a 128K context window. It is optimized for Japanese-language use cases and outperforms GPT-4o on multiple Japanese benchmarks.

The model is one of the outcomes of the GENIAC program promoted by Japan's Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO), through which it received computing support. Rakuten did not disclose the base model, saying only that it was built on open-source community work. However, the community found that the config.json in the Hugging Face model files contains "deepseek_v3" and related architecture fields, and that the parameter count and context configuration match DeepSeek V3, indicating that the model may be DeepSeek V3 fine-tuned for Japanese.
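
For readers who want to check this kind of attribution themselves, below is a minimal sketch of how one might fetch only a model's config.json from the Hugging Face Hub and print the fields in question, without downloading the weights. The repository id `Rakuten/RakutenAI-3.0` is a hypothetical placeholder (the source does not name the actual repo); the field names listed are the ones that appear in DeepSeek V3's published config.

```python
import json
from huggingface_hub import hf_hub_download

# Hypothetical repo id -- substitute the actual Rakuten AI 3.0 repository name.
REPO_ID = "Rakuten/RakutenAI-3.0"

# Download only the config file, not the multi-hundred-gigabyte weights.
config_path = hf_hub_download(repo_id=REPO_ID, filename="config.json")

with open(config_path, encoding="utf-8") as f:
    config = json.load(f)

# Architecture fields of the kind the community compared against DeepSeek V3.
for key in (
    "model_type",               # "deepseek_v3" in DeepSeek V3's config
    "architectures",            # e.g. ["DeepseekV3ForCausalLM"]
    "num_hidden_layers",
    "n_routed_experts",         # MoE expert count
    "num_experts_per_tok",      # experts activated per token
    "max_position_embeddings",  # context window
):
    print(f"{key}: {config.get(key)}")
```

Running the same script against `deepseek-ai/DeepSeek-V3` prints `model_type: deepseek_v3`, which is the field the community reportedly matched.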