US chip giant Nvidia holds a significant advantage with its CUDA computing platform, while OpenAI may be unwise to bet heavily on the “scaling law”, according to a top Chinese scientist addressing two critical issues affecting the future of artificial intelligence (AI) development and the US-China technology rivalry.
Li Guojie, a prominent computer scientist at the Chinese Academy of Sciences, said the country’s AI accelerator chips – including those from Huawei Technologies’ Ascend series, Hygon Information Technology’s deep-learning computing unit and Cambricon Technologies – were comparable to Nvidia’s offerings in terms of hardware capabilities.
However, Nvidia’s true core strength lay in its CUDA ecosystem – which engineers use to develop applications on the firm’s graphics processing units (GPUs) – so China must develop an alternative system to achieve self-sufficiency in AI, Li said.
“DeepSeek has made an impact on the CUDA ecosystem, but it has not completely bypassed CUDA, as barriers remain,” Li, 81, said in comments published last Thursday by Study Times, the weekly newspaper of the Communist Party’s senior cadre training school.
“In the long run, we need to establish a set of controllable AI software tool systems that surpass CUDA.”
Chinese start-up DeepSeek has upended assumptions about how many resources are needed to build advanced AI models. Photo: Xinhua
Li likened China’s efforts to replace Western hardware-software systems – such as Windows and Intel, or Android and Arm – with the need to build an AI software ecosystem. “It is an extremely arduous task that requires careful planning and long-term efforts,” he said.
The comments come as China moves closer to achieving a self-sufficient AI system despite US technology restrictions. Huawei founder Ren Zhengfei said at a recent symposium chaired by President Xi Jinping that concerns over chip shortages and operating systems had eased.
The success of DeepSeek, which has produced low-cost, high-performance models with limited resources, has bolstered China’s chip developers and AI adoption in various industries.
Li suggested that China could implement policies to promote AI applications in computers, smartphones and other devices to expand the use of home-grown AI models and domestically developed GPUs.
The scientist also questioned the wisdom of ChatGPT creator OpenAI’s reliance on the “scaling law”, which posits that increased spending on computing resources leads to improved intelligence.
OpenAI and others have poured major investments into improving their AI models. Photo: dpa
“In AI, the scaling law is viewed by some as an axiom … and companies like OpenAI and the US AI investment community have treated it like a winning formula,” Li said. “But the scaling law is not a scientifically verified principle like Newton’s laws; it is a generalisation based on the recent experiences of OpenAI and others in developing large language models.
“From a scientific research perspective, it is an educated guess; from an investment standpoint, it is a gamble on a specific technological pathway,” Li said, noting that the delayed launch of the GPT-5 model could signal diminishing returns from scaling.
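The scaling law Li describes is commonly summarised as a power-law relationship between compute and model loss. The sketch below is purely illustrative and not from the article: the function `loss` and its constants `a` and `alpha` are hypothetical values chosen to show the shape of the curve, in which each additional doubling of compute buys a smaller absolute improvement.

```python
import math


def loss(compute: float, a: float = 10.0, alpha: float = 0.05) -> float:
    """Hypothetical power-law scaling: predicted loss for a compute budget.

    Illustrative only; a and alpha are made-up constants, not fitted values
    from any published study.
    """
    return a * compute ** -alpha


# Each tenfold increase in compute lowers the predicted loss,
# but by progressively smaller absolute amounts (diminishing returns).
for c in (1e3, 1e4, 1e5, 1e6):
    print(f"compute={c:>9.0e}  predicted loss={loss(c):.3f}")
```

The shrinking gap between successive lines is the pattern behind Li's caution: under a power law, returns on extra compute taper off rather than disappearing outright, which is consistent with his refusal to call the approach a dead end.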
Li added that DeepSeek’s success illustrated an alternative approach to improving model performance: algorithm optimisation.
“The emergence of DeepSeek has forced the AI community to seriously reconsider whether to keep burning money and gamble, or to seek a new way to optimise algorithms,” he said. “DeepSeek’s achievements suggest that algorithmic and model architecture optimisation can also lead to miracles.”
Still, Li cautioned against labelling the scaling law as a dead end. “Whether increasing the amount of training data will yield returns corresponding to investment will depend on actual outcomes in the future.”
This article originally appeared in the South China Morning Post (SCMP), the most authoritative voice reporting on China and Asia for more than a century. For more SCMP stories, please explore the SCMP app or visit the SCMP’s Facebook and Twitter pages. Copyright © 2025 South China Morning Post Publishers Ltd. All rights reserved.