
Saturday, December 2, 2023

China to meet AI market demand with local chips - Asia Times

China is going to rely more on local artificial intelligence (AI) chips than on foreign ones, as it can no longer obtain Nvidia’s most advanced chips due to the United States’ chip export controls.

The country does not lack AI chipmakers, which include Huawei Technologies, Moore Threads, Cambricon and Stream Computing.

Their total shipments are expected to reach 1.34 million units in 2023, up 22.5% from a year earlier, according to the International Data Corporation (IDC), a market intelligence provider.

However, technology experts said much more has to be done before Chinese AI chips can gain wide adoption.

Lin Yonghua, vice president and chief engineer at the Beijing Academy of Artificial Intelligence, said China should develop technology frameworks that allow servers to run different kinds of AI chips at the same time.

Lin said such a move could help ease the nation’s AI chip shortage, and that there is reason to hope more and more of China’s data centers will shift to using local AI chips.

Robust market growth

China’s AI server market will grow 82.5% to US$9.1 billion this year, according to a report published by IDC and Inspur Electronic Information Industry Co Ltd during the annual AI Computing Conference (AICC) in Beijing on November 29.

The figure will increase to US$13.4 billion in 2027, up 47% from the 2023 level, the report said.

The compound annual growth rate of China’s AI computing power will be 33.9% from 2022 to 2027, compared with 16.6% annual growth for traditional computing power.
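As a quick sanity check of these projections, the short Python sketch below recomputes the 2023-to-2027 growth and the five-year multiples implied by the two compound annual growth rates. The inputs are the figures quoted above; the calculation is simply the standard compound-growth formula, offered as an illustration only and not part of the IDC report.

# Sanity-check the reported projections (inputs are the figures quoted above).

market_2023 = 9.1    # China's AI server market, US$ billion, 2023 (per IDC/Inspur)
market_2027 = 13.4   # projected market size, US$ billion, 2027

growth = market_2027 / market_2023 - 1
print(f"2023-2027 growth: {growth:.0%}")             # prints ~47%, matching the report

# Five-year multiples implied by the compound annual growth rates (2022-2027).
ai_multiple = (1 + 0.339) ** 5                       # AI computing power, 33.9% CAGR
traditional_multiple = (1 + 0.166) ** 5              # traditional computing power, 16.6% CAGR
print(f"AI compute: ~{ai_multiple:.1f}x, traditional: ~{traditional_multiple:.1f}x")

Run as written, the script shows that a 33.9% CAGR compounds to roughly a 4.3-fold increase over five years, against about 2.2-fold for traditional computing power.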

IDC said the global AI server market will grow from US$19.5 billion in 2022 to US$34.7 billion in 2026. Globally, servers that can create AI-generated content (AIGC), including text, images, songs and videos, will account for 31.7% of all servers in 2026, up from the current level of 11.9%.

ChatGPT, launched by the Microsoft-backed OpenAI in November 2022, is a form of generative AI. November 30 marked the one-year anniversary of the debut of ChatGPT.

After seeing the success of ChatGPT, many Chinese companies have increased their investments in chatbots over the past year. They include the companies collectively called BAT – Baidu, Alibaba and Tencent – as well as Huawei Technologies. 

Baidu is based in Beijing while Tencent and Huawei are in Shenzhen. Huawei and Suzhou Industrial Park (SIP), a China-Singapore cooperation area, co-founded the Huawei (Suzhou) Artificial Intelligence Innovation Center. Alibaba has its headquarters in Hangzhou. 

The top five cities in terms of AI investments are now Beijing, Hangzhou, Shenzhen, Shanghai and Suzhou. In 2018-2022, the top five were Beijing, Hangzhou, Shanghai, Shenzhen and Guangzhou. 

Zhou Zhengang, a vice president of IDC, said many Chinese companies are interested in investing in generative AI services, and two-thirds of these firms have already done so.

He said many Chinese internet firms and telecom operators have also started building their own AI computing centers. As of August this year, he said, China had built AI computing centers in 30 cities with a total investment of 20 billion yuan (US$2.82 billion).

Competing with humans

In August last year, the Biden administration ordered US chipmakers to stop exporting to China and Russia graphics processing units (GPUs) that operate at interconnect bandwidths of 600 gigabytes per second or above.

Due to this rule, Nvidia could not ship its A100 and H100 chips to China. It then unveiled the A800 and H800 processors, which run at 400 and 300 gigabytes per second respectively, for the Chinese market.

On October 17 this year, the US tightened its rules, barring Nvidia from shipping its A800, H800, L40, L40S and RTX 4090 chips to China.

Reuters reported on November 9 that Nvidia planned to release three AI chips, namely the H20, L20 and L2, for the Chinese market. On November 24, however, the company reportedly delayed the launch of the H20 to the first quarter of 2024.

Technology experts said the H100 is 6.68 times faster than the H20 overall, although the H20 is 20% faster than the H100 in large language model (LLM) inference.

LLMs are deep learning algorithms that can recognize, summarize, translate, predict and generate content using very large datasets, according to Nvidia’s website.

Nvidia Chief Executive Jensen Huang said at an event in the US on Wednesday that AI will be “fairly competitive” with humans in as few as five years. 

Huang said rising competition in the AI industry will lead to more off-the-shelf AI tools that companies in various industries can tune to their own needs, in fields ranging from chip design and software creation to drug discovery and radiology.

Fastest AI supercomputer

While it is uncertain whether Chinese firms will be able to import H20 chips next year, their US counterparts are already using Nvidia’s most cutting-edge chips.

On November 28, Amazon Web Services (AWS), a subsidiary of Amazon, became the first cloud service provider to use Nvidia’s GH200 NVL32 multi-node platform, which connects 32 Grace Hopper Superchips into one instance. 

AWS and Nvidia are partnering on Project Ceiba to design the world’s fastest GPU-powered AI supercomputer, which will use 16,384 GH200 Superchips and deliver 65 exaflops of AI processing. Nvidia will use the machine for its own generative AI innovation.

A GH200 NVL32 server can train a trillion-parameter model over 1.7 times faster than Nvidia’s HGX H100 server.

An unnamed chip industry expert told the China Securities Journal that Chinese AI chips lag behind foreign ones in performance. However, he said, given the limited supply of foreign chips, more and more Chinese firms are now willing to use domestic AI chips or build hybrid servers that mix different chips.

Liu Jun, senior vice president of Inspur Electronic Information, said Wednesday that the quality of China’s AI development will depend on the computing power of its AI servers. He said China needs to boost the computing power and processing efficiency of its AI servers to support more complex and larger-scale AI applications in the coming years.

Inspur is a state-owned AI server maker that relies heavily on Nvidia’s chips and holds a more-than-50% share of China’s AI server market. Its customers include Baidu, the developer of Ernie Bot.

The company said on November 27 that it will build its servers with chips made by a variety of companies. 

Read: Nvidia dumbs down AI chips for Chinese markets

Follow Jeff Pao on Twitter at @jeffpao3

