Nvidia says its new GPUs are the fastest for DeepSeek AI

Started by Admin, Feb 01, 2025, 12:51 PM

Admin

Nvidia is praising the performance of DeepSeek's open-source AI models on its new RTX 50-series GPUs, claiming they can run DeepSeek's models faster than anything else on the PC market. But this might not be the most important point.

This week, Nvidia saw the biggest one-day loss in market value ever recorded for a U.S. company, and much of that drop is being linked to DeepSeek. The Chinese company's new R1 model matches the performance of OpenAI's models without needing Nvidia's most powerful hardware, which meant DeepSeek could train it at a much lower cost. That suggests Nvidia's top chips may not be as essential for AI progress as assumed, which could weigh on the company's future.

That said, DeepSeek did use Nvidia GPUs, just not the top-end ones: it trained on the weaker H800 chips, which the U.S. allows Nvidia to export to China. Nvidia's latest blog post highlights how its new RTX 50-series GPUs can run R1 models, claiming they offer maximum performance on PCs.

But the big deal is how DeepSeek trained its models in the first place. (And remember, China is getting a less powerful version of the RTX 5090.)

Other companies are also jumping on the DeepSeek train: R1 is now available on AWS, and Microsoft has added it to its Azure AI Foundry platform and GitHub. Meanwhile, there are reports that Microsoft and OpenAI are investigating whether DeepSeek used OpenAI's data without permission.