The Future of the Global Open-Source AI Ecosystem: From DeepSeek to AI+



By Adina Yakefu and Irene Solaiman


This is the third and final blog in a three-part series on the Chinese open source community's advancements since January 2025's "DeepSeek Moment." The first blog, on strategic changes and open artifact growth, is available here, and the second blog, on architectural and hardware shifts, is available here.

In this third article, we examine the paths and trajectories of prominent Chinese AI organizations, and posit future directions for open source.

For AI researchers and developers contributing to and relying on the open source ecosystem, and for policymakers tracking this rapidly changing environment: thanks to intraorganizational and global community gains, open source is the dominant and preferred approach for Chinese AI organizations for the near future. Openly sharing artifacts, from models to papers to deployment infrastructure, maps to a strategy aimed at large-scale deployment and integration.



China’s Organic Open Source AI Ecosystem

Having examined strategic and architectural changes since DeepSeek's R1, we get a glimpse for the first time at how an organic open source AI ecosystem is taking shape in China. A combination of powerful players, some established in open source, some new entrants, and some changing course entirely to contribute to the new open culture, signals that the open, collaborative approach is mutually beneficial.

This collaboration reaches beyond national boundaries: the most-followed organization on Hugging Face is DeepSeek, and the fourth most followed is Qwen.


In addition to models, openly shared science and techniques have informed not only other AI organizations but also the entire open source community. The most popular papers on Hugging Face largely come from Chinese organizations, namely ByteDance, DeepSeek, Tencent, and Qwen.

source: https://huggingface.co/spaces/evijit/PaperVerse



The Established

Alibaba positioned open source as an ecosystem and infrastructure strategy. Qwen was not shaped as a single flagship model, but continually expanded into a family covering multiple sizes, tasks, and modalities, with frequent updates on Hugging Face and Alibaba's own platform, ModelScope. Its influence did not rest on any single version. Instead, Qwen was repeatedly reused as a component across different scenarios, gradually taking on the role of a general AI foundation. By mid-2025, Qwen became the model family with the most derivatives on Hugging Face, with over 113k models using Qwen as a base and over 200k model repositories tagging Qwen, far exceeding Meta's Llama at 27k or DeepSeek at 6k. Organization-wide, Alibaba boasts the most derivatives, almost as many as Google and Meta combined.

At the same time, Alibaba aligned model development with cloud and hardware infrastructure, integrating models, chips, platforms, and applications into a single engineering stack.

Tencent also made a significant move from borrowing to building. As one of the first major corporations to integrate DeepSeek into core consumer-facing products after R1's release, Tencent did not initially frame open source as a public narrative. Instead, it brought mature models in through plug-in style integration, ran large-scale internal validation, and only later began to release its own capabilities. From May 2025 onward, Tencent accelerated open releases in areas where it already had strengths, such as vision, video, and 3D, under its own brand, Tencent Hunyuan (now Tencent HY), and these models quickly gained adoption in the community.

ByteDance, following its "AI application factory" approach, has selectively open-sourced high-value components while keeping its competitive focus on product entry points and large-scale usage. In this context, the ByteDance Seed team has contributed several notable open-source artifacts, including UI-TARS-1.5 for multimodal UI understanding, Seed-Coder for data-centric code modeling, and the SuperGPQA dataset for systematic reasoning evaluation. Despite a comparatively low-profile open-source presence, ByteDance has achieved significant scale in China's AI market, with its AI application Doubao surpassing 100 million DAU in December 2025.

The most notable change came from Baidu, whose CEO had openly talked down open source: after years of prioritizing closed models, it re-entered the ecosystem through free access and open releases, such as the Ernie 4.5 series. This shift was accompanied by renewed investment in its open-source framework, PaddlePaddle, as well as its own AI chip Kunlunxin, which announced an IPO on January 1, 2026. By connecting models, chips, and PaddlePaddle within a more open system, Baidu can lower costs, attract developers, and influence standards, while maintaining strategic control under shared constraints of compute, cost, and regulation.



The Normalcy of “DeepSeek Moments”

Among startups, Moonshot, Z.ai, and MiniMax adjusted rapidly and brought new momentum to the open-source community within months of R1. Models such as Kimi K2, GLM-4.5, and MiniMax M2 all earned places on AI-World's open-model milestone rankings. At the end of 2025, Z.ai and MiniMax released their most advanced open-source models to date and subsequently announced their IPO plans in close succession.

The open-sourcing of Kimi K2 was widely described as "another DeepSeek moment" for the community. Although Moonshot has not announced an IPO, market reports indicate that the company raised roughly $500M in funding by the end of 2025, with AGI and agent-based systems positioned as its primary commercialization objectives.

Application-first companies such as Xiaohongshu, Bilibili, Xiaomi, and Meituan, previously focused only on the application layer, began training and releasing their own models. With their native advantage in real usage scenarios and domain data, once strong reasoning became available at low cost through open source, building in-house models became practical. They can tune AI around their specific businesses, rather than being constrained by the cost structures or limits of external providers.

If the business world seized the ROI-positive opportunity for growth, research institutions and the broader community welcomed the shift even more willingly. Organizations such as BAAI and Shanghai AI Lab redirected more effort toward toolchains, evaluation systems, data platforms, and deployment infrastructure, with projects like FlagOpen, OpenDataLab, and OpenCompass. These efforts did not chase single-model performance, but instead strengthened the long-term foundations of the ecosystem.



Foundations for the Future

The defining feature of the new ecosystem is not that there are more models, but that a complete chain has formed. Models can be open-sourced and extended; deployments can be reused and scaled; software and hardware can be coordinated and swapped; and governance capabilities can be embedded and audited. This is a shift from isolated breakthroughs to a system that can actually run in the real world.

This ecosystem did not appear overnight. It is built on years of accumulated infrastructure "tailwind" since 2017. Over the past several years, China has steadily invested in data centers and compute centers, gradually forming a nationwide, integrated compute layout centered on the "East Data, West Compute" strategy. The national plan established 8 major compute hubs and 10 data center clusters, guiding compute demand from the east toward the central and western regions.

Public information suggests China intends to keep investing in continual growth in energy capacity. China's total compute capacity stood at around 1,590 EFLOPS as of 2025, ranking among the highest globally. Sources in China assert that intelligent compute capacity, tailored for AI training and deployment, is expected to grow by roughly 43% year over year, far outpacing general-purpose compute. At the same time, average data center power usage effectiveness (PUE) fell to around 1.46, indicating higher efficiency and providing a solid hardware foundation for AI at scale. Energy is a clear key focus.

If the 2017 "New Generation AI Development Plan" was mainly about setting direction and building foundations, the August 2025 "AI+" action plan clearly shifted focus toward large-scale deployment and deep integration. This marks a directionally different pursuit from AGI. The emergence of R1 provided the missing "lift" at the engineering and ecosystem level: it was the catalyst that systematically activated compute, energy, and data infrastructure that had already been built.

Consequently, in the year following R1's release, China's AI development accelerated along two main paths. First, AI became more deeply embedded in industrial processes, moving beyond chatbots toward agents and workflows. Second, greater emphasis was placed on autonomous and controllable AI systems, reflected in more flexible training pathways and increasingly localized deployment strategies.

Looking back, the real turning point was not the growth in the number of models, but a fundamental change in how open-source models are used. Open source moved from an optional choice to a default assumption in system design. Models became reusable and composable components within larger engineering systems.



Looking Back to Look Forward

From DeepSeek to "AI+", China's path in 2025 was not about chasing peak performance. It was about building a practical path organized around open source, engineering efficiency, and scalable delivery, a path that has already begun to run on its own.

Resource constraints did not limit China's AI development; in some respects, they reshaped its trajectory. The release of DeepSeek R1 acted as a catalytic event, triggering a chain of responses across the domestic industry and accelerating the formation of a more organically structured ecosystem. At the same time, this shift created a critical window for continued domestic research and development. As this ecosystem matures, its longer-term impact, and how the global AI community may engage with an increasingly self-sustaining AI ecosystem in China, will become important questions for future discussion.


