Moonshot AI

Category: AI Infrastructure

Moonshot AI (北京月之暗面科技有限公司) is a Beijing-based AI foundation-model company that develops advanced large language models, including the Kimi AI assistant series, with a specialization in ultra-long-context processing. Founded in 2023 and led by Yang Zhilin, the company employs roughly 200-300 people and has raised $1.77B in total funding, most recently a Series C round. Key investors include Alibaba Group, Tencent, HongShan (Sequoia China), ZhenFund, IDG Capital, and Gaorong Capital.

Founded
2023
Headquarters
Beijing, China
Team size
200-300
Total funding
$1.77B

Value proposition

Industry-leading long-context processing (up to 2M tokens), open-source foundation models (Kimi K2, 1T parameters), and cost-effective enterprise AI solutions, built on an MoE architecture for efficient inference.

Products and solutions

["Kimi AI Assistant", "Kimi K2 (1T-parameter open-source model)", "Kimi K2 Thinking (MoE architecture, 32B active parameters)", "Long-context processing APIs", "Enterprise AI solutions"]

Unique value

Pioneering 'lossless long-context' processing for AI models, enabling them to handle extensive text inputs without losing critical information. Focus on open-source leadership and scalable architectures for AGI (Artificial General Intelligence).

Target customer

Enterprises requiring AI-driven solutions for natural language processing, content generation, and industry-specific AI applications. Developers and researchers seeking open-source foundation models.

Industries served

["Technology", "Finance", "Healthcare", "E-commerce", "Media & Entertainment", "Research & Development"]

Technology advantage

Research lineage in transformer architectures (XLNet, Transformer-XL), ultra-long-context processing (up to 2M tokens), efficient MoE architecture deployment, and strategic partnerships with Alibaba and Tencent for cloud infrastructure.
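The "32B active parameters out of 1T total" figure cited for Kimi K2 Thinking comes from mixture-of-experts (MoE) routing: a gate scores all experts per token but only the top-k actually run. The sketch below is a generic illustration of that routing idea, not Kimi's actual architecture; the toy experts and gate weights are invented for the example.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k experts by gate score and mix their outputs.

    Only top_k experts execute per token, which is why an MoE model's
    'active parameter' count is far below its total parameter count.
    """
    # One gate logit per expert: dot product of the input with a gating vector.
    logits = [sum(wi * xi for wi, xi in zip(w, x)) for w in gate_weights]
    probs = softmax(logits)
    chosen = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in chosen)
    # Weighted combination of the selected experts' outputs only.
    out = [0.0] * len(x)
    for i in chosen:
        y = experts[i](x)
        for d in range(len(x)):
            out[d] += (probs[i] / norm) * y[d]
    return out, chosen

# Four toy "experts": each just scales the input by a different factor.
experts = [lambda x, s=s: [s * v for v in x] for s in (0.5, 1.0, 2.0, 3.0)]
gate_weights = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0]]
out, chosen = moe_forward([1.0, 2.0], experts, gate_weights, top_k=2)
print(chosen)  # indices of the two experts actually executed
```

With these toy weights, only two of the four experts run for the input, mirroring (at trivial scale) how an MoE model keeps inference cost proportional to active rather than total parameters.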

How they differentiate

Moonshot AI differentiates through its Kimi series' ultra-long-context processing (up to 2M tokens), open-source model accessibility (Kimi K2), cost-effective enterprise solutions, and strong performance in coding and reasoning benchmarks. Kimi K2 Thinking specifically outperforms rivals in complex reasoning tasks.

Main competitors

["OpenAI", "Anthropic", "DeepSeek", "Baichuan AI", "Zhipu AI", "Minimax"]

Key partnerships

["Alibaba","Tencent","Sequoia Capital (investor)","Tsinghua University (research collaborations)"]

Notable customers

["Enterprise AI solutions for Chinese clients", "Research institutions", "Developers using open-source models"]

Major milestones

[ "2023-03: Company founded in Beijing", "2023-10: Launched Kimi AI assistant with 2M token context window", "2024-02: Raised $1B Series B led by Alibaba Group", "2024-05: Reached $3B valuation", "2024-08: Raised $300M Series B extension led by Tencent", "2025-07: Launched Kimi K2, 1T-parameter open-source model", "2025-11: Launched Kimi K2 Thinking with MoE architecture" ]

Growth metrics

Valuation: $3B+ (2024). Team size: 200-300 employees. Kimi K2 achieved strong performance in coding and reasoning benchmarks, outperforming GPT-4.1 and other global competitors.

Market positioning

Positioned as a leader in China's generative AI market, competing globally with cost-effective models and enterprise-focused AI solutions. Counted among China's "Six Tigers" of AI startups, with a focus on AGI research and open-source accessibility.

Geographic focus

China (primary market), with expanding enterprise clients worldwide. Key competitors are concentrated in China's AI ecosystem and global foundation model market.

Patents and IP

Not publicly disclosed as of 2025.

About Yang Zhilin

PhD from Carnegie Mellon University; former researcher at Google Brain and Meta AI; leading expert in transformer architectures and natural language processing, co-author of influential papers including XLNet and Transformer-XL.

Official website: