this post was submitted on 17 Apr 2025
58 points (96.8% liked)

LocalLLaMA

2878 readers

Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped about the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.

founded 2 years ago

The Trump administration is considering new restrictions on the Chinese AI lab DeepSeek that would limit it from buying Nvidia’s AI chips and potentially bar Americans from accessing its AI services, The New York Times reported on Wednesday.

[–] [email protected] 4 points 3 days ago (2 children)

I've tried DeepSeek; it's not even that good. ChatGPT, Google, and even Grok are better and offer more features, like image generation and web search, while DeepSeek only has chat (and reasoning, but all the others have that too now).

The only thing DeepSeek has going for it is that they released their models for free so you can run them on your own hardware if you want.

[–] [email protected] 2 points 2 days ago (1 children)

DeepSeek is much better than anything else I've run. It has an inner monologue that lets it work through more complex problems.
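
For anyone curious what that inner monologue looks like in practice: DeepSeek-R1-style models emit their reasoning between `<think>` tags before the final answer. Here's a minimal sketch of separating the two; the tag convention is real, but the sample output text is made up for illustration:

```python
import re

# Made-up sample of R1-style output: reasoning inside <think> tags,
# followed by the final answer the user actually sees.
raw = "<think>The user wants 12 * 13. 12 * 13 = 156.</think>The answer is 156."

# Split the visible reasoning trace from the final answer.
match = re.match(r"<think>(.*?)</think>(.*)", raw, re.DOTALL)
if match:
    reasoning, answer = match.group(1).strip(), match.group(2).strip()
    print("reasoning:", reasoning)
    print("answer:", answer)
```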

[–] [email protected] 1 points 2 days ago (1 children)

ChatGPT and Grok have that too now. It’s called “Reason” (ChatGPT) or “Think” (Grok).

[–] [email protected] 2 points 2 days ago (1 children)

You can't run those locally though, so that doesn't matter much.

[–] [email protected] 1 points 1 day ago (1 children)

I guess I forgot what community I'm in 🤦‍♂️

[–] [email protected] 8 points 3 days ago (1 children)

that's a big fucking benefit though

[–] [email protected] 0 points 2 days ago (1 children)

You mean being able to run them locally? Sure, if you've got the hardware to do it. The full-size model is a whopping 404 GB; good luck running that on consumer hardware.
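
For a rough sense of why 404 GB is out of reach, here's a back-of-the-envelope sketch assuming the commonly cited 671B total parameter count for the full model. It counts weights only, ignoring KV cache and runtime overhead, and real quantized files run somewhat larger than the raw bit math:

```python
# Weights-only memory estimate for a 671B-parameter model at
# different quantization levels (parameter count is an assumption).
params = 671e9

for name, bits in [("fp16", 16), ("q8", 8), ("q4", 4)]:
    gigabytes = params * bits / 8 / 1e9
    print(f"{name}: ~{gigabytes:,.0f} GB of weights")
```

That works out to roughly 1342 GB at fp16, 671 GB at 8-bit, and 336 GB at 4-bit: even aggressive quantization leaves you needing hundreds of gigabytes of RAM or VRAM just for the weights.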

[–] [email protected] 3 points 2 days ago* (last edited 2 days ago) (1 children)

I run the 16b version. It works fine on my laptop on the CPU.
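
For reference, CPU-only inference like this is straightforward with llama-cpp-python and a GGUF file. The model path below is a placeholder for whatever distilled or quantized build you actually downloaded:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./deepseek-distill-q4_k_m.gguf",  # placeholder: point at your GGUF
    n_ctx=4096,    # context window
    n_threads=8,   # CPU threads; tune to your machine
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly explain mixture-of-experts models."}]
)
print(out["choices"][0]["message"]["content"])
```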

[–] [email protected] 0 points 2 days ago

Sure, that’ll work. It’s just not as smart as the full version.