
On July 23rd local time, Meta officially released Llama 3.1, the latest version of its large language model. The AI community sees this release as a forceful rebuttal of the idea that open source is falling behind, and Meta founder and CEO Mark Zuckerberg declared at the launch that "open source AI is the path to the future".
OpenAI has long been criticized for the closed nature of ChatGPT: although the company calls itself "Open", critics say it does "closed" things. At the same time, the strength of closed-source large models represented by GPT-4o has often left the industry discouraged, as if it were taken for granted that closed-source models must outperform open-source ones.
The release of Llama 3.1, however, seems to rewrite that pattern. Meta released three versions of Llama 3.1, namely 8B, 70B, and 405B, with 405B as the flagship. Meta claims its performance is comparable to the best closed-source models.
The Strongest Open Source Model
Why can Llama 3.1 405B compete with the best closed-source models? Alongside the release, Meta also published a paper titled 'The Llama 3 Herd of Models', which lays out the model's development in detail.
First, in terms of usage, Llama 3.1 supports 8 languages, and the context window of all three versions has been extended to 128K tokens, the same as GPT-4 Turbo. Meanwhile, Llama 3.1 405B has 405 billion parameters, was trained at roughly 50 times the scale of Llama 2, and adopts a dense Transformer architecture for more stable performance. With this context window, Llama can ingest up to roughly 96,000 words of text at once, handling both long and short inputs with ease.
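To make the context-window numbers concrete, the sketch below shows how such a model might be queried with a long document through the Hugging Face transformers library. It is only an illustration, not Meta's reference code: the model ID meta-llama/Meta-Llama-3.1-8B-Instruct, the file name report.txt, and the memory settings are assumptions (the 8B variant is used here because running the 405B model requires a multi-GPU cluster), and downloading the weights requires accepting Meta's license.

# Minimal sketch, assuming the Hugging Face transformers text-generation pipeline
# and an assumed Hub ID for the smallest Llama 3.1 instruct variant.
import torch
import transformers

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed; gated behind Meta's license

pipe = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

# 128K tokens correspond to roughly 96,000 English words (~0.75 words per token),
# so a book-length document can fit into a single prompt.
with open("report.txt", encoding="utf-8") as f:  # hypothetical input file
    long_document = f.read()

messages = [
    {"role": "system", "content": "Summarize the document in five bullet points."},
    {"role": "user", "content": long_document},
]

outputs = pipe(messages, max_new_tokens=512)
print(outputs[0]["generated_text"][-1]["content"])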
In the paper, Meta also published performance comparisons between Llama 3.1 405B and closed-source models such as GPT-4o and Claude 3.5 Sonnet. The results show Llama 3.1 405B leading in several areas, including general performance, long-text processing, and multilingual processing. In the ZeroSCROLLS benchmark, for example, Llama 3.1 405B scored 95.2, while the other two both scored 90.5.
Its strong results and large training base have earned Llama 3.1 the title of "strongest open-source large model". For now, however, Llama 3.1 remains primarily a language model and cannot process images, video, or speech, which means ChatGPT still holds the edge in multimodal tasks.
Open Source AI Is the Path to the Future
The actual user experience with Llama may not yet be flawless, but the release of Llama 3.1 405B matters greatly to AI practitioners around the world: it opens a new chapter in the contest between open-source and closed-source large models.
On Meta's official website, Zuckerberg published an open letter firmly proclaiming that "open-source AI is the path to the future". In it, he noted that although several companies are developing leading closed-source models, open source is rapidly closing the gap: last year Llama 2 could only compete with an older generation of frontier models, but this year Llama 3 is competitive with the most advanced models and even leads in some areas.
Zuckerberg therefore hopes to turn Llama into the Linux of the large-model era and make it the industry standard for open-source AI. As he put it in the letter: "In the early days of high-performance computing, the major technology companies each invested heavily in developing their own closed-source versions of Unix... Today, open-source Linux has become the industry-standard foundation both for cloud computing and for the operating systems that run most mobile devices, and I believe AI will develop in a similar way."