DeepSeek-V3 Open-Source AI Model With Mixture-of-Experts Architecture Released

The model features 671 billion parameters, far more than the Meta Llama 3.1 model's 405 billion.


Photo Credit: DeepSeek

The AI model adopts Multi-head Latent Attention (MLA) and DeepSeekMoE architectures

Highlights
  • DeepSeek-V3 was pre-trained on 14.8 trillion tokens
  • The AI model also comes with advanced reasoning capabilities
  • It scored 87.1 percent on the MMLU benchmark

DeepSeek, a Chinese artificial intelligence (AI) firm, released the DeepSeek-V3 AI model on Thursday. The new open-source large language model (LLM) features a massive 671 billion parameters, surpassing the Meta Llama 3.1 model's 405 billion parameters. Despite its size, the researchers claim the LLM is geared towards efficiency thanks to its mixture-of-experts (MoE) architecture, which activates only the parameters relevant to a given task, improving both efficiency and accuracy. Notably, it is a text-based model and does not have multimodal capabilities.

DeepSeek-V3 AI Model Released

The open-source DeepSeek-V3 AI model is currently hosted on Hugging Face. According to the listing, the LLM is geared towards efficient inference and cost-effective training. To achieve this, the researchers adopted the Multi-head Latent Attention (MLA) and DeepSeekMoE architectures.
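
For readers who want to try the open weights, a minimal loading sketch using Hugging Face's transformers library is shown below. The repository name, trust_remote_code flag, and generation settings are assumptions based on the listing described above, and running the full 671-billion-parameter model requires multi-GPU hardware far beyond a typical workstation.

```python
# Minimal sketch, assuming the model is published under the repository name
# below on Hugging Face; this is not an official quick-start from DeepSeek.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-V3"  # assumed Hugging Face repository name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # the listing ships custom MLA/DeepSeekMoE model code
    device_map="auto",        # shard across whatever accelerators are available
    torch_dtype="auto",
)

inputs = tokenizer("Explain mixture-of-experts in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```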

Essentially, the AI model activates only the parameters relevant to the topic of the prompt, ensuring faster processing and higher accuracy compared to typical dense models of this size. Pre-trained on 14.8 trillion tokens, DeepSeek-V3 uses techniques such as supervised fine-tuning and reinforcement learning to generate high-quality responses.
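
To illustrate the idea in the paragraph above, here is a simplified, hypothetical mixture-of-experts routing sketch in Python. It is not DeepSeek-V3's actual implementation; the expert count, top-k value, and dimensions are toy values chosen only to show how a router sends each token to a small number of experts while the rest of the parameters stay idle.

```python
import torch
import torch.nn.functional as F

# Toy configuration, not DeepSeek-V3's real settings.
num_experts, top_k, hidden_dim = 8, 2, 16

experts = torch.nn.ModuleList(
    [torch.nn.Linear(hidden_dim, hidden_dim) for _ in range(num_experts)]
)
router = torch.nn.Linear(hidden_dim, num_experts)  # scores each expert per token

def moe_forward(x):
    # x: (num_tokens, hidden_dim)
    scores = F.softmax(router(x), dim=-1)          # routing probabilities
    weights, chosen = scores.topk(top_k, dim=-1)   # keep only the top-k experts per token
    out = torch.zeros_like(x)
    for slot in range(top_k):
        for e in range(num_experts):
            mask = chosen[:, slot] == e            # tokens routed to expert e in this slot
            if mask.any():
                out[mask] += weights[mask, slot, None] * experts[e](x[mask])
    return out

tokens = torch.randn(4, hidden_dim)
print(moe_forward(tokens).shape)  # torch.Size([4, 16])
```

Because each token touches only top_k of the num_experts sub-networks, most expert parameters stay unused for any single input, which is the efficiency argument behind MoE designs.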

The Chinese firm claimed that despite its size, the AI model was fully trained in 2.788 million GPU hours on Nvidia H800 GPUs. DeepSeek-V3's architecture also includes a load-balancing technique to minimise performance degradation, an approach first used on its predecessor.

Coming to performance, the researchers shared evaluation results from internal testing and claimed that the model outperforms the Meta Llama 3.1 and Qwen 2.5 models on Big-Bench Hard (BBH), Massive Multitask Language Understanding (MMLU), HumanEval, MATH, and several other benchmarks. However, these results have not yet been verified by third-party researchers.

One of the main highlights of DeepSeek-V3 is its massive size of 671 billion parameters. While larger models exist, such as Gemini 1.5 Pro, which is reported to have more than one trillion parameters, a model of this scale is rare in the open-source space. Prior to this, the largest open-source AI model was Meta's Llama 3.1 with 405 billion parameters.

At present, DeepSeek-V3's code can be accessed via its Hugging Face listing under an MIT license for personal and commercial usage. The AI model can also be tested via the company's online chatbot platform, and those looking to build with it can access the API.
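
As a rough illustration of building against the API, the sketch below uses the OpenAI-compatible client interface that DeepSeek documents for its platform. The base URL and model identifier are assumptions and should be checked against DeepSeek's official API documentation before use.

```python
# Hedged sketch of calling the model through DeepSeek's API, assuming an
# OpenAI-compatible endpoint; values below are illustrative, not confirmed.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",      # issued from DeepSeek's developer platform
    base_url="https://api.deepseek.com",  # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",                # assumed identifier pointing at DeepSeek-V3
    messages=[{"role": "user", "content": "Summarise DeepSeek-V3 in two lines."}],
)
print(response.choices[0].message.content)
```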



Akash Dutta
Akash Dutta is a Senior Sub Editor at Gadgets 360. He is particularly interested in the social impact of technological developments and loves reading about emerging fields such as AI, metaverse, and fediverse. In his free time, he can be seen supporting his favourite football club - Chelsea, watching movies and anime, and sharing passionate opinions on food.
