
On April 23rd local time, Microsoft launched the open-source lightweight Phi-3 series of AI models, claiming it is currently the most capable and cost-effective family of small language models on the market. The smallest version, Phi-3-mini, has only 3.8 billion parameters yet performs on par with models more than twice its size, outperforming Meta's Llama 3 8B in multiple benchmark tests. The Phi-3-small and Phi-3-medium versions can even surpass GPT-3.5 Turbo. Even more noteworthy, Phi-3-mini has a small enough memory footprint to generate 12 tokens per second on the A16 Bionic chip in an iPhone 14, meaning the model can run directly on a phone without an internet connection. It is also reported that Phi-3 may cost only one tenth as much as models of equivalent performance.
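
Because the Phi-3 weights are released openly, a model such as Phi-3-mini can in principle be loaded and run locally. The sketch below is illustrative only and is not taken from the article; the Hugging Face repository name and generation settings are assumptions.

```python
# Illustrative sketch (assumptions noted): loading an open-weight Phi-3-mini
# checkpoint with Hugging Face transformers and generating a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 3.8B-parameter weights compact
    device_map="auto",          # needs the `accelerate` package; places weights automatically
    trust_remote_code=True,     # early Phi-3 releases shipped custom model code
)

prompt = "Summarize why small language models can run on a phone."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```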