
What Is AI Distillation? Is DeepSeek Guilty of AI Distillation?

Started by Admin, Feb 08, 2025, 11:49 AM


Admin

In January, a Chinese startup called DeepSeek shocked the tech world. It built a powerful AI model using fewer chips and less money than anyone expected. Silicon Valley was impressed. Wall Street panicked. Tech stocks dropped. Washington worried about falling behind. Meanwhile, in China, people celebrated. One commentator even claimed DeepSeek had shattered the idea that U.S. tech was unbeatable.

But then, OpenAI—the company behind ChatGPT—dropped a bombshell. It started investigating DeepSeek for allegedly training its chatbot using ChatGPT itself. If true, DeepSeek didn't beat Silicon Valley—it copied it.

OpenAI believes DeepSeek may have "distilled" its technology. That means it may have flooded ChatGPT with questions, recorded the answers, and used them to train its own AI. At one point, DeepSeek's chatbot even responded "ChatGPT" when asked what model it was. (It doesn't do that anymore.)
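For readers wondering what that kind of distillation looks like mechanically, here is a minimal sketch in Python: it queries a teacher model through the openai client, records each answer, and writes the prompt/response pairs out as supervised training data for a student model. The prompts, teacher model name, and output path are placeholders, and the source does not show DeepSeek's actual pipeline; note that doing this against ChatGPT is exactly what OpenAI's terms of use forbid.

```python
# Illustrative sketch of output-based "distillation": ask a teacher model
# many questions, record its answers, and keep the pairs as training data
# for a student model. Assumes the openai Python client (>= 1.0); the
# prompts, model name, and file path below are placeholders.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompts = [
    "Explain gradient descent in one paragraph.",
    "Summarize the causes of World War I.",
    # ...a real distillation run would use millions of prompts
]

pairs = []
for prompt in prompts:
    # Ask the teacher model and record its answer verbatim
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder teacher model
        messages=[{"role": "user", "content": prompt}],
    )
    answer = response.choices[0].message.content
    pairs.append({"prompt": prompt, "completion": answer})

# The collected pairs become fine-tuning data for the student model
with open("distillation_data.jsonl", "w") as f:
    for pair in pairs:
        f.write(json.dumps(pair) + "\n")
```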

DeepSeek's AI is impressive. It performs at a high level and was far cheaper to develop. But if it was built on OpenAI's work, that raises serious questions about how much of the breakthrough is really its own. Distillation isn't hacking, but it does break OpenAI's terms of use. And if DeepSeek did it, others will too.

For years, building cutting-edge AI required massive amounts of money and energy. That might be changing. We could be entering the age of AI copycats.