Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than ...
Microsoft researchers have developed On-Policy Context Distillation (OPCD), a training method that permanently embeds ...
There’s a new wrinkle in the saga of Chinese company DeepSeek’s recent announcement of a super-capable R1 model that combines high ...
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that Chinese company DeepSeek used a technique called "distillation" to build a rival model. OpenAI ...
Model distillation is one of the technology trends that Gartner’s 2025 Hype Cycle for artificial intelligence (AI) places at the “slope of enlightenment” stage of maturity.
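None of the reports above spell out what "distillation" actually computes. As background, here is a minimal sketch of the classic teacher-student distillation loss (temperature-softened softmax plus KL divergence); the function names, the temperature value, and the use of NumPy are illustrative assumptions, not details from any of the reports or from DeepSeek's or OpenAI's actual training code:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution,
    # exposing more of the teacher's relative preferences between classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

The loss is zero when the student's logits induce the same softened distribution as the teacher's, and positive otherwise; the student is trained to minimize it, typically mixed with an ordinary cross-entropy term on labeled data.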