
The price has risen to $45,000 each! These chips have become “hard currency”

According to the latest news, GPU prices are still rising. As the “hard currency” of artificial intelligence infrastructure, GPUs are now even being used by overseas startups as collateral for financing. This emerging avenue of financing underscores the value of such hardware in the capital-intensive AI “arms race.”

A reporter learned from several industry insiders engaged in large-model development that artificial intelligence chips remain in persistent short supply. Some insiders said that not only small companies but even leading platforms in the AI industry are facing chip shortages, and they do not expect the AI chip shortage to improve significantly for at least a year.

In addition, a check of eBay listings found that the price of an H100 has reached as high as $45,000, an increase of more than 10% over the $40,000 price in April this year, while available supply has also decreased significantly compared with the first half of the year.

Microsoft’s recent earnings report highlighted signs of a possibly chronic shortage of artificial intelligence chips: for the first time, the availability of GPUs was identified as an investment risk factor. “We will continue to pursue opportunities to expand server computing power in our data centers, depending on the availability of land, energy, network supplies, and GPUs and other components,” the report read.

Recently, it was reported that the next-generation large model, GPT-5, will require 50,000 of Nvidia’s highest-configuration H100 chips. Global market demand for the H100 has reached 430,000 units, and Nvidia’s production capacity may not be able to meet such a large demand for computing power.

Since the launch of ChatGPT at the end of last year, a global wave of large-model development has continued for more than half a year. This huge demand for AI has exposed the limitations of the global supply chain for the chips used to develop and deploy AI models.

As the key hardware for training and deploying artificial intelligence algorithms, GPUs face a huge supply gap. At present, more than 95% of the world’s large models run on Nvidia’s GPU chips.

OpenAI CEO Sam Altman recently stated publicly: “We are very short of GPUs — the fewer people who use ChatGPT, the better, so that existing users have enough computing power.”

According to a research report released by Moody’s Investors Service in May this year, Nvidia will achieve “unparalleled” revenue growth over the next few quarters, with its data center business revenue exceeding that of rivals Intel and AMD combined.

Post time: Aug-09-2023