
Nvidia's AI GPUs Are Selling for up to $70,000 in China

Hamartia Antidote

ELITE MEMBER
Joined
Nov 17, 2013
Messages
35,188
Reaction score
30
Country
United States
Location
United States

(Image credit: Nvidia)

Demand for generative artificial intelligence (AI) services and fears that the U.S. government could restrict sales of GPUs for AI workloads continue to drive up prices in the People's Republic of China. In some cases, the price of Nvidia's H800 compute GPU can reach as high as ¥500,000, or about $70,000 per unit, MyDrivers reports. Even at that price, the cards are still hard to get.



Last month, the price of Nvidia's A800 GPU, which is used for AI and high-performance computing (HPC) applications, jumped 20% to ¥110,000 ($15,000) practically overnight after rumors emerged that the U.S. government could restrict exports of such products to China. Now, Nvidia's A800 and H800 compute GPUs reportedly sell for ¥120,000 ($16,800, presumably for the A800), ¥250,000 ($34,970), ¥300,000 ($41,970), and even ¥500,000 ($69,950, presumably for the H800).
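For reference, the dollar figures quoted above are consistent with an exchange rate of roughly ¥7.15 per U.S. dollar. The short Python sketch below is not from the report; it simply re-derives the conversions, with the exchange rate assumed from the article's own ¥500,000 to $69,950 pairing.

# Rough sketch: convert the quoted CNY prices to USD at an assumed exchange rate.
# CNY_PER_USD = 7.15 is an assumption inferred from the article's figures, not a quoted rate.
CNY_PER_USD = 7.15

for cny in (120_000, 250_000, 300_000, 500_000):
    usd = cny / CNY_PER_USD
    print(f"CNY {cny:,} is about USD {usd:,.0f}")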


But even buyers with the money to pay for the China-oriented A800 and H800 GPUs (cut-down versions of Nvidia's A100 and H100 compute GPUs with reduced performance and scalability) may find them nearly impossible to obtain through normal channels: instead of buying from a distributor or reseller, one may need to talk directly to Nvidia China or even Nvidia's corporate headquarters, the report says.

The ridiculously high prices should not come as a surprise. The vast majority of AI clusters are built on Nvidia's compute GPUs and run software written for Nvidia's CUDA layer, which exclusively supports the company's own processors. If owners of AI services and clusters cannot get enough compute GPUs to support the growing demand for their products, the quality of their services will degrade, and they risk losing their business over time.

Nvidia does not comment on pricing for its data center GPUs, so take the reported figures with a grain of salt. For comparison, an Nvidia H800 compute GPU in an add-in-board form factor costs $30,603 in the U.S. CDW, one of the prominent resellers of data center hardware, lists only one such card, with a 5-7 business day lead time, which may point to relatively short supply of these products.
 
Meanwhile, the gulf in AI between China and the West continues to widen. I have actually read some of the AI "research" papers China is churning out like a biscuit factory, and they are pure garbage.
They seem to believe quantity is better than quality.
 
