
Blue Whale News, November 26 (Reporter Zhu Junxi) Despite close scrutiny from regulators in multiple countries over large technology companies' investments in AI startups, Amazon announced last week that it would invest an additional $4 billion in AI unicorn Anthropic, becoming its "primary cloud and training partner".
Anthropic said in a recent blog post that it plans to use Amazon's AI chips to train and deploy its most advanced foundation models. Amazon's total investment in Anthropic now stands at $8 billion, though it still holds only a minority stake. Anthropic said that, like Amazon's previous $4 billion investment, the new deal will be paid out in stages in the form of convertible notes, with an initial tranche of $1.3 billion and the remainder to follow.
Entrenching monopoly, or countering it?
Anthropic, widely regarded as OpenAI's biggest competitor, was founded by siblings Dario Amodei and Daniela Amodei, both former OpenAI executives. Amazon first invested $1.25 billion in Anthropic in 2023 and added another $2.75 billion in March of this year, becoming Anthropic's primary cloud provider.
Another cloud giant, Google, also committed $2 billion to Anthropic last year. But because Google has its own AI models and its investment is smaller than Amazon's, its influence over Anthropic is comparatively limited.
These investment arrangements resemble the partnership model between Microsoft and OpenAI and are under scrutiny from antitrust regulators in many parts of the world. In September, the UK Competition and Markets Authority cleared Amazon's earlier investment in Anthropic, but it is still separately investigating Google's investment and Microsoft's deal with OpenAI. Following Google's defeat in its antitrust case, the US Department of Justice proposed specific remedies last week, including barring Google from acquiring or investing in any search competitors, query-based AI products, or advertising technology. If the judge ultimately approves these measures, Google would be forced to unwind its investment in Anthropic and end the partnership between the two companies.
Developing large models is an expensive game, and Anthropic faces its own problems of surging costs and development bottlenecks. Anthropic CEO Dario Amodei has said on a podcast that the cost of training AI models will rise to $10 billion or even $100 billion within the next three years; by comparison, training OpenAI's GPT-4o cost roughly $100 million. Despite these enormous outlays, the leading AI companies have recently seen diminishing returns, with new models failing to deliver the expected performance gains, which has led Anthropic to delay the release of its Claude 3.5 Opus model.
In the podcast, Dario Amodei said Anthropic still plans to release Claude 3.5 Opus but has not committed to a specific timeline. He said he is inclined to believe that scaling laws will continue to hold, though he is not certain. Many things could interrupt the path toward more powerful AI, including the possibility of running out of data, but he remains optimistic that AI companies will find ways around these obstacles.
Facing similar challenges, OpenAI and Google have chosen to keep updating existing models while shifting toward new paradigms. OpenAI launched its reasoning model o1 in September. Rather than pouring computing resources into the pre-training stage of large models, o1 emphasizes the inference stage, improving performance through more reinforcement learning and longer "thinking" time. Google is also reportedly developing reasoning models similar to o1. On the funding side, OpenAI closed a new $6.6 billion round last month, which will go toward continued investment in AI research and products, expanding infrastructure, and attracting talent.
According to insiders cited by media reports, Anthropic hopes to leverage Amazon's backing to raise more money and is still in talks with other investors. An earlier report by Silicon Valley tech outlet The Information said Anthropic is seeking financing at a valuation of $30 billion to $40 billion. OpenAI's latest valuation is as high as $157 billion, and its financials are also stronger, with expected annualized revenue of $4 billion. By contrast, Anthropic expects roughly $800 million in revenue this year, of which 25% to 50% will be paid out to cloud partners as revenue share, while OpenAI's corresponding payout ratio is much smaller.
In a fierce AI race, with strong rivals like OpenAI and encirclement by giants like Google, Anthropic's acceptance of a large investment from Amazon may not be a bad sign for preserving market competition.
Amazon's AI chip ambitions
As the world's largest cloud provider, Amazon has faced fierce competition from rivals on AI in recent years. Microsoft's cloud business, close behind it, seized the opportunity through its deep partnership with OpenAI, while Google offers cloud services built around its own models such as Gemini. For Amazon AWS, giving customers access to Claude models is a key competitive advantage, so it needs to deepen its cooperation with Anthropic to shore up its market position.
Amazon's ambitions do not stop there: it has also been trying to build its own large models and has been actively positioning itself in AI chips. When signing agreements with customers, Amazon suggests that they use its in-house chips so its team can identify areas that need improvement.
The Information reported earlier this month that during negotiations over Amazon's second investment in Anthropic, how many Amazon chips Anthropic would use became a key point of discussion, directly affecting the outcome of the talks and the size of Amazon's investment. Anthropic ultimately secured the investment, but it has not said how many Amazon chips it will use or whether it will continue to use Nvidia chips through Amazon AWS.
In today's AI chip market, Nvidia holds a dominant position, with a market share as high as 98%. Amazon, Microsoft, Google, and others are Nvidia's main customers, competing to buy its high-performance GPUs to train AI models or provide cloud computing services. But these cloud giants are also stepping up their own AI chip development to reduce their dependence on Nvidia.
Amazon acquired Israeli chip design firm Annapurna Labs for $350 million in 2015 and has developed chips through it since. Starting in 2018, Amazon has launched three generations of AI chips. According to recent media reports, Amazon has begun shipping its latest Trainium2 chips to data centers in some locations, with a goal of deploying clusters of 100,000 chips, and its R&D staff are racing to get the chip running smoothly in data centers before the end of the year.
For customers like Anthropic, however, Nvidia's advantage lies not only in the performance of its AI chips but also in its powerful CUDA software platform; developers have grown accustomed to building AI applications on Nvidia's tightly integrated software and hardware stack. Amazon's chip update cycle is also slower than that of chipmakers such as Nvidia and AMD: it plans to bring a new chip to market roughly every 18 months, while Nvidia has accelerated its AI chip roadmap from a two-year cadence to annual updates, with the new Blackwell generation now in full production and being delivered to major customers such as OpenAI and Microsoft.
On last week's Q3 earnings call, Nvidia founder and CEO Jensen Huang said that demand for Blackwell chips far exceeds supply and that every customer is racing to be first in line, eager to upgrade their data centers; Nvidia is accelerating its supply expansion to meet their requirements. Beyond Blackwell, demand for the previous-generation Hopper chips will also continue into the first few quarters of next year.
For cloud providers like Amazon, beyond competing for Nvidia's latest chips, the practical option is to accelerate their in-house AI chip development. Even if self-developed chips cannot fully replace Nvidia's, they can at least ease dependence on it and provide more leverage on cost and in bargaining.