
According to the latest data from the market research and consulting firm Omdia, Microsoft has become the largest buyer of Nvidia's flagship Hopper chip, far outpacing its big-tech rivals.
Omdia analysts estimate that Microsoft purchased 485,000 Hopper chips this year, while the second-largest US customer, Meta Platforms, purchased 224,000, less than half of Microsoft's procurement volume.
Omdia says its estimates are based on companies' disclosed capital expenditures, server shipments, supply-chain intelligence, and other data. According to Omdia, ByteDance and Tencent each ordered about 230,000 Nvidia chips this year, more than Meta.
Although Amazon and Google are working to deploy their own custom alternatives, they still purchased 196,000 and 169,000 Hopper chips respectively. The data also shows that the combined total purchased by Tesla and Musk's xAI is slightly higher than Amazon's.
Last month, Nvidia CEO Jensen Huang said on the earnings call that although the next-generation Blackwell chips are scheduled to start shipping this quarter, the current Hopper chip remains very popular, thanks to the work of foundation model developers in pre-training, post-training, and inference.
Since the debut of the chatbot ChatGPT two years ago, large tech companies have invested billions of dollars in AI infrastructure, setting off an unprecedented investment boom that has made Nvidia's AI chips one of the hottest commodities in Silicon Valley.
Jensen Huang has repeatedly said that "demand for Nvidia products is very strong; everyone wants to be first in line, and everyone wants the most." Last month, Nvidia reported a 94% year-on-year increase in revenue and a 109% increase in net income for the third quarter.
Compared with other technology companies, Microsoft has arguably been the most aggressive in building infrastructure: it not only needs data centers to run its own AI services (such as Copilot), but also rents out computing power to cloud customers through its Azure division.
Omdia believes Microsoft will purchase three times as many Nvidia chips in 2024 as it did in 2023. Microsoft Azure executive Alistair Speirs told the media, "Good data center infrastructure is a very complex, capital-intensive project that requires years of planning."
Speirs added, "Therefore, it is important to forecast our growth and leave some room for maneuver." Omdia estimates that global tech companies will spend $229 billion on servers in 2024, led by Microsoft's $31 billion and Amazon's $26 billion.
Vlad Galabov, head of cloud computing and data center research at Omdia, said that approximately 43% of server spending in 2024 will go to Nvidia. "Nvidia GPUs account for a very high share, but we expect this to be close to its peak," he said.
Meanwhile, Nvidia's main competitor in the GPU field, AMD, is making progress. Omdia said Meta purchased 173,000 AMD MI300 chips this year, and Microsoft purchased 96,000.
At the same time, large technology companies have also increased their use of in-house chips. Google has been developing its Tensor Processing Unit (TPU) for the past decade, and Meta has launched its self-developed MTIA chip, with each company deploying approximately 1.5 million units.
Amazon is also investing in its Trainium and Inferentia processors, deploying approximately 1.3 million of these chips this year. Earlier this month, Amazon announced plans to build a new cluster for its partner Anthropic using its latest Trainium chips.
By comparison, Microsoft has been more conservative with its first self-developed chip, Maia, deploying only about 200,000 this year.