
OpenAI Alleges Its AI Models Were Used to Build DeepSeek-R1: Report

OpenAI reportedly claimed that it had seen evidence of distillation of its AI models, which it suspected to be from DeepSeek.


Photo Credit: Reuters

AI model distillation is a technique used to transfer knowledge from a larger model to a smaller model

Highlights
  • OpenAI’s terms of service forbid using outputs to develop new AI models
  • DeepSeek-R1 is an open-source reasoning-focused AI model
  • The distillation was reportedly done using the OpenAI APIs

OpenAI has reportedly claimed that DeepSeek might have distilled its artificial intelligence (AI) models to build the R1 model. As per the report, the San Francisco-based AI firm said it had evidence that some users were harvesting its AI models' outputs on behalf of a competitor, suspected to be DeepSeek. Notably, the Chinese company released the open-source DeepSeek-R1 AI model last week and hosted it on GitHub and Hugging Face. The reasoning-focused model surpassed the ChatGPT-maker's o1 AI model on several benchmarks.

OpenAI Says It Has Evidence of Foul Play

According to a Financial Times report, OpenAI claimed that its proprietary AI models were used to train DeepSeek's models. The company told the publication that it had seen evidence of distillation from several accounts using the OpenAI application programming interface (API). The AI firm and its cloud partner Microsoft investigated the issue and blocked those accounts' access.

In a statement to the Financial Times, OpenAI said, “We know [China]-based companies — and others — are constantly trying to distil the models of leading US AI companies.” The ChatGPT-maker also highlighted that it is working closely with the US government to protect its frontier models from competitors and adversaries.

Notably, AI model distillation is a technique used to transfer knowledge from a large model to a smaller, more efficient one. The goal is to bring the smaller model on par with, or ahead of, the larger model while sharply reducing its computational requirements. For scale, OpenAI's GPT-4 is reported to have roughly 1.8 trillion parameters, whereas the smallest distilled variant of DeepSeek-R1 has just 1.5 billion, the kind of gap distillation is designed to bridge.
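To make the technique concrete, below is a minimal, hypothetical sketch of classic knowledge distillation in PyTorch, in which a small "student" network is trained to match the temperature-softened output distribution of a larger "teacher". The toy models, sizes and data are illustrative only and do not describe OpenAI's or DeepSeek's actual systems.

```python
# Minimal knowledge-distillation sketch (illustrative, not any company's real pipeline):
# a small "student" learns to match the softened output distribution of a larger "teacher".
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM_TEACHER, DIM_STUDENT, T = 1000, 512, 64, 2.0  # T = softmax temperature

# Toy stand-ins for a large teacher and a small student language-model head.
teacher = nn.Sequential(nn.Embedding(VOCAB, DIM_TEACHER), nn.Linear(DIM_TEACHER, VOCAB))
student = nn.Sequential(nn.Embedding(VOCAB, DIM_STUDENT), nn.Linear(DIM_STUDENT, VOCAB))
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

tokens = torch.randint(0, VOCAB, (32,))  # a toy batch of token IDs

for step in range(100):
    with torch.no_grad():
        teacher_logits = teacher(tokens)   # soft targets from the frozen teacher
    student_logits = student(tokens)

    # KL divergence between temperature-softened distributions is the
    # standard distillation loss (Hinton et al., 2015).
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The temperature smooths the teacher's distribution so the student also learns the relative likelihoods of "wrong" answers, which is what makes distillation more informative than training on hard labels alone.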

When a company creates more efficient versions of its own models in-house, the knowledge transfer typically takes place by using a dataset generated by the larger model to train the smaller one. For instance, Meta used the Llama 3 AI model to create several coding-focused Llama models.

However, this route is not available to a competitor that wants to distil a model but has no access to the proprietary model's training data. If OpenAI's allegations are true, the distillation could have been carried out by sending a large number of prompts to its APIs and collecting the generated outputs. The resulting prompt-and-response data could then be used to fine-tune a base model.
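For illustration only, here is a hypothetical sketch of what API-based data collection for distillation could look like in general: a script sends prompts to a proprietary model's API, stores the responses, and writes them out as prompt-response pairs that could later be used to fine-tune a base model. The model name, prompts and file path are placeholders, and OpenAI's terms of service forbid using its outputs this way to develop competing models.

```python
# Hypothetical sketch of harvesting prompt-response pairs from an API for
# distillation (illustrative only; not a description of DeepSeek's pipeline).
import json
from openai import OpenAI  # assumes the official openai Python client

client = OpenAI()  # reads OPENAI_API_KEY from the environment
prompts = [
    "Explain quantum entanglement simply.",
    "Write a Python function that reverses a linked list.",
]

with open("distillation_data.jsonl", "w") as f:
    for prompt in prompts:
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder choice of teacher model
            messages=[{"role": "user", "content": prompt}],
        )
        answer = response.choices[0].message.content
        # Each line becomes one supervised fine-tuning example for a base model.
        f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")
```

Harvesting enough data to distil a frontier model would require a very large number of such calls, which is consistent with OpenAI's claim that it detected the activity across several accounts using its API.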

Notably, OpenAI has not issued a formal public statement on the matter. Recently, the company's CEO, Sam Altman, praised DeepSeek for creating such an advanced AI model and for increasing competition in the AI space.


Akash Dutta
Akash Dutta is a Senior Sub Editor at Gadgets 360. He is particularly interested in the social impact of technological developments and loves reading about emerging fields such as AI, metaverse, and fediverse.