Chinese startup DeepSeek hammers US stocks with cheaper open-source AI model

Interesting interview that explains a lot of the reasoning behind how GPUs tie into these problems.

 
Oh, look at the shook Americans.
The same thing happened when Japan overtook the USA in the '70s with its tech.
Naturally, Americans couldn't handle it, so they curbed Japanese exports to the point that they were no longer competitive.

But China is too big for that shit.

While the US has been bickering about trans people in bathrooms, the Chinese have been going full steam ahead with investments in their tech sector.

Chinese cities already look like something out of Blade Runner, while American cities are either burning, filled with homeless people, or have unusable public transport.
 
Maybe, but it's also innovative.


According to these guys, it offers improved performance and a much lower cost. I wouldn't use Chinese tech like that anyway, but if your only goal is cheap and fast, it's apparently a good solution.

Isn't that the goal of most American and multinational corporations anyway? Cheap and fast. Is it just a problem because it's Chinese? My question is: if the government keeps giving concessions and subsidies to all these corporations in America and their heads (i.e. Musk and Zuckerberg), and American companies keep getting their shit pushed in by foreign entities, shouldn't we re-evaluate the concessions given to them?

I mean, China is beating America on EV production, and all America has is the terrible-looking Cybertruck and alternatives that don't look good, such as Rivian. Also, it's not like America has learned the lesson. For example, a lot of people use Microsoft products, but no matter how much money Microsoft gets, it stays behind Linux in certain aspects.
 
I have heard that they are doing it with Nvidia chips that they actually aren't supposed to have. Is that true?

Every company that trains models this large does so with Nvidia hardware. That's why it has a $3 trillion market cap.

There are certain GPUs that Nvidia can no longer export to China, namely the H100. Nvidia has created several weaker GPUs to get around the export ban and legally sell to China. The H100's Chinese equivalent is the H800. Some reports say the Chinese used H100s for this model, and other reports say H800s were used. So it's not clear.


But what's important about this model is that it's far more efficient to train. So if it were trained on H100s, they wouldn't have needed nearly as many as OpenAI has needed for its models (so few that they wouldn't need to mass import them and could obtain them through back channels). And it's also possible that it was trained on H800s.
 
Isn't that the goal of most American and multinational corporations anyway? Cheap and fast. Is it just a problem because it's Chinese? My question is: if the government keeps giving concessions and subsidies to all these corporations in America and their heads (i.e. Musk and Zuckerberg), and American companies keep getting their shit pushed in by foreign entities, shouldn't we re-evaluate the concessions given to them?

I mean, China is beating America on EV production, and all America has is the terrible-looking Cybertruck and alternatives that don't look good, such as Rivian. Also, it's not like America has learned the lesson. For example, a lot of people use Microsoft products, but no matter how much money Microsoft gets, it stays behind Linux in certain aspects.
Re: the bold, there are issues with anti-competitive practices, IP theft, and state control of (or ability to control when deemed necessary) sensitive information about customers.

Are you aware that TikTok collects information about what people are doing with their phones even when they aren't using the app, and that the Chinese government can walk in and take that data pretty much anytime they want? And that's just one example.
 
Re: the bold, there are issues with anti-competitive practices, IP theft, and state control of (or ability to control when deemed necessary) sensitive information about users.

Aren't these all things that companies like Meta and Google already do? Everything you listed above, we already allow companies on American soil to do openly with the right "people". The only difference is that Chinese companies don't give money to American political campaigns (or maybe they do).
 
Isn't that the goal of most American and multinational corporations anyway? Cheap and fast. Is it just a problem because it's Chinese? My question is: if the government keeps giving concessions and subsidies to all these corporations in America and their heads (i.e. Musk and Zuckerberg), and American companies keep getting their shit pushed in by foreign entities, shouldn't we re-evaluate the concessions given to them?

I mean, China is beating America on EV production, and all America has is the terrible-looking Cybertruck and alternatives that don't look good, such as Rivian. Also, it's not like America has learned the lesson. For example, a lot of people use Microsoft products, but no matter how much money Microsoft gets, it stays behind Linux in certain aspects.

The country that dominates AI will dominate the world. The economic ramifications should be obvious. It's a multi-trillion-dollar industry, maybe tens or even hundreds of trillions, and it will have some effect on basically every existing industry. As big as, or even bigger than, the impact of personal computers and the Internet.

But there are also military implications. Not just in autonomous weapons, but in the scale and speed at which new advanced weapons can be developed.
 
Aren't these all things that companies like Meta and Google already do? Everything you listed above, we already allow companies on American soil to do openly with the right "people". The only difference is that Chinese companies don't give money to American political campaigns (or maybe they do).
Even if we take it as a given that "we already allow companies on American soil to do so openly," if you think that's the only reason, then discussing it further is pointless.
 


Someone got DeepSeek's biggest model running on 8 Mac Minis, lol.



This is a big deal as far as token cost goes.




Model                    | Time-To-First-Token (TTFT, s) | Tokens-Per-Second (TPS)
DeepSeek V3 671B (4-bit) | 2.91                          | 5.37
Llama 3.1 405B (4-bit)   | 29.71                         | 0.88
Llama 3.3 70B (4-bit)    | 3.14                          | 3.89
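To see what the table above means in practice, here is a small sketch that turns TTFT and TPS into a rough end-to-end latency for a reply of a given length (the simple model `total = TTFT + tokens/TPS` is my own back-of-the-envelope assumption, not part of the benchmark):

```python
# Rough end-to-end latency for generating n tokens, using the TTFT and
# TPS figures from the 8-Mac-Mini demo table above.
def generation_time(ttft_s: float, tps: float, n_tokens: int) -> float:
    """Seconds to first token plus steady-state decoding time."""
    return ttft_s + n_tokens / tps

models = {
    "DeepSeek V3 671B (4-bit)": (2.91, 5.37),
    "Llama 3.1 405B (4-bit)": (29.71, 0.88),
    "Llama 3.3 70B (4-bit)": (3.14, 3.89),
}

for name, (ttft, tps) in models.items():
    print(f"{name}: ~{generation_time(ttft, tps, 256):.1f}s for a 256-token reply")
```

By this estimate the 671B DeepSeek model, despite being much larger, answers a 256-token prompt several times faster than the 405B Llama on the same hardware, which is why the token-cost comparison is such a big deal.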
 
China is going to win.

People are in denial.

Unlike Japan in the '80s, China has become too powerful for the US to force it to sink its own economy.
 
You mean like Teslas? Because those spontaneously blow up too.
Seven Teslas per day are spontaneously blowing up? Because that is what the reports are saying about the BYD cars.
 
Maybe, but it's also innovative.


According to these guys, it offers improved performance and a much lower cost. I wouldn't use Chinese tech like that anyway, but if your only goal is cheap and fast, it's apparently a good solution.

Damn. You're actually right for a change.

The key here is that it is (A) open source, and thus available to all, so innovations on it will likely come much faster; and (B) it takes less energy, which is a huge deal given today's AI and our electricity shortage amid the expansion of AI, blockchain, and EVs, which are all energy hogs.
 
The AI I really want is one that can live and run on my machine without needing processors elsewhere. There are a few reasons for this.

1. It keeps my queries and the information I want as mine. Even if my local AI reached out to other sources to answer my questions, it could do so behind a VPN to shield me from intrusive technology companies that want to know my whole world.

2. It will require far less energy, and as an energy/technology professional I know what kind of energy crunch is coming. It's a big deal to find an AI that uses less energy.

If this AI can eventually pull those two things off... I'm interested.
 
That's not a Chinese chip you're talking to.
NVDA powered, doggies.
TweakTown is full of sh$t. The CEO said a number of times it was done on older H800 cards because they could not get their hands on newer H100 cards due to import restrictions and costs. This is about the amount of memory they have available to build their models. In what turned out to be a pretty inventive move, they decided not to use 32-bit precision but 8-bit instead. That saved them time and significant amounts of processing.
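To illustrate what dropping from 32-bit to 8-bit buys you, here is a toy sketch of symmetric int8 weight quantization. This is purely illustrative of the memory savings, not DeepSeek's actual FP8 training recipe; the function names and the per-tensor scaling scheme are my own assumptions:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization of FP32 weights to int8.

    Returns the int8 tensor plus the scale needed to recover
    approximate FP32 values. Illustrates why 8-bit storage cuts
    memory 4x versus 32-bit; it is NOT DeepSeek's actual recipe.
    """
    scale = float(np.abs(weights).max()) / 127.0  # map max magnitude to int8 range
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate FP32 weights from the int8 tensor."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(1024, 1024)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"fp32 bytes: {w.nbytes}")   # 4,194,304
print(f"int8 bytes: {q.nbytes}")   # 1,048,576 (4x smaller)
print(f"max abs error: {np.abs(w - w_hat).max():.5f}")
```

The same idea is why lower precision also helps during training: every weight, activation, and gradient held in fewer bits means more model fitting into the same GPU memory and more numbers moving per second through the same bandwidth.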
 