DeepSeek’s R1 release has sparked heated discussion about model distillation and how companies can protect against unauthorized distillation. Model distillation has broad IP implications ...

What is AI Distillation?

Distillation, also known as model or knowledge distillation, is a process in which knowledge is transferred from a large, complex ‘teacher’ AI model to a smaller, more efficient ‘student’ model. Doing ...
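The teacher-to-student transfer described above is typically implemented by training the student to match the teacher's temperature-softened output distribution. The sketch below is a minimal, illustrative version in PyTorch; the names (`distillation_loss`, `teacher_logits`) and the hyperparameters (temperature `T`, mixing weight `alpha`) are assumptions chosen for demonstration, not details taken from any of the systems reported on here.

```python
# A minimal sketch of knowledge distillation, assuming a frozen teacher
# whose logits are already available. All names and values are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft loss (match the teacher's temperature-softened
    distribution) with a hard loss (cross-entropy on true labels)."""
    # Soft targets: KL divergence between softened distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard supervised loss on ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: batch of 8 examples, 10 classes.
teacher_logits = torch.randn(8, 10)                       # frozen teacher output
student_logits = torch.randn(8, 10, requires_grad=True)   # student output
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Blending the soft loss with a hard cross-entropy term is a common design choice: the soft targets carry the teacher's information about relative class similarities, while the hard labels keep the student anchored to ground truth.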
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that Chinese company DeepSeek used a technique called "distillation" to build a rival model. OpenAI ...
Model distillation is one of the technology trends that Gartner’s 2025 Hype Cycle for artificial intelligence (AI) places at the “slope of enlightenment” stage of maturity.
When DeepSeek-R1 launched recently, it immediately captured the attention of the global artificial intelligence community, prompting major players such as OpenAI, Microsoft, and Meta to investigate ...
Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than ...
Rapid advances in AI have brought powerful large language models (LLMs) to the forefront. However, most high-performing models are massive, compute-heavy, and require cloud-based inference, ...