this post was submitted on 31 Jan 2025
376 points (94.5% liked)

Open Source

36988 readers
64 users here now

All about open source! Feel free to ask questions, share news, and post interesting stuff!

Community icon from opensource.org, but we are not affiliated with them.

founded 5 years ago

Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make it clear that the reason people love Deepseek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple weeks are showing some very clear biases coming out.

(page 4) 28 comments
[–] [email protected] 28 points 3 months ago (3 children)
[–] [email protected] 9 points 3 months ago

Well you just made me choke on my laughter. Well done, well done.

[–] [email protected] 5 points 3 months ago* (last edited 3 months ago) (6 children)

I'm not an expert at criticism, but I think it's fair on their part.

I mean, can you remind me what the hardware requirements are to run deepseek locally?
oh, you need a high-end graphics card with at least 8 GB of VRAM for that? and that's just for the highly distilled variants! for the more complete ones you need multiple such graphics cards interconnected! how do you even do that with more than 2 cards on a consumer motherboard??

how many people do you think have access to such a system? I mean even 1 high-end gpu with just 8 GB VRAM, considering that more and more people only have a smartphone nowadays, and that these cards are very expensive even for gamers.
and as you'll read in the 2nd referenced article below, memory size is not the only factor: even the distill that needs only 1 GB of VRAM still requires a high-end gpu to be usable.

https://www.tomshardware.com/tech-industry/artificial-intelligence/amd-released-instructions-for-running-deepseek-on-ryzen-ai-cpus-and-radeon-gpus

https://bizon-tech.com/blog/how-to-run-deepseek-r1-locally-a-free-alternative-to-openais-o1-model-hardware-requirements#a6

https://codingmall.com/knowledge-base/25-global/240733-what-are-the-system-requirements-for-running-deepseek-models-locally
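For a rough sanity check on those numbers, here's a back-of-the-envelope sketch (my own approximation, not taken from the linked articles): quantized weights take roughly params × bits/8 bytes, plus some overhead for the KV cache and activations. Real requirements also depend on context length and runtime.

```python
def estimate_vram_gb(n_params_billion, bits_per_weight=4, overhead=1.2):
    # Weights dominate memory: params * bits/8 bytes, plus ~20% headroom
    # for KV cache and activations. A crude estimate, not a vendor spec.
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

print(estimate_vram_gb(8))   # ~4.8 GB: an 8B distill just squeezes onto an 8 GB card
print(estimate_vram_gb(32))  # ~19.2 GB: already beyond most consumer GPUs
```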

so my point is that when talking about deepseek, you can't ignore how they operate their online service, as most people will only be able to try that.

I understand that recently it's very trendy and cool to shit on Proton, but they have a very strong point here.

[–] [email protected] 14 points 3 months ago (1 children)

Just because the average consumer doesn't have the hardware to use it in a private manner does not mean it's not achievable. The article straight-up pretends self-hosting doesn't exist.

[–] [email protected] 7 points 3 months ago (5 children)

It might be that they're equating the name with the app and company, not the open source model, based on one of the first lines:

AI chat apps like ChatGPT collect user data, filter responses, and make content moderation decisions that are not always transparent.

Emphasis mine. The rest of the article reads the same way.

Most people aren't privacy-conscious enough to care who gets what data and who's building the binaries and web apps, so sounding the alarm is appropriate for people who barely know the difference between AI and AGI.

I get that people are mad at Proton right now (anyone have a link? I'm behind on the recent stuff), but we should ensure we get mad at things that are real, not invent imaginary ones based on contrived contexts.

[–] [email protected] 17 points 3 months ago* (last edited 3 months ago) (1 children)

Here is a general write-up about the CEO showing his MAGA colors.

More happened in the Reddit thread, though, that added some more elements, like the CEO opting for a new username with "88" in it (a common right-wing reference), his unprompted use of the phrase "didn't mean to trigger you," him evasively refusing to clarify what his stance actually was because "that would be more politics," on and on. You can read through that thread here, although Proton corporate are mods, so I have no idea what they may have deleted at this point.

The thread was full of "mask on" behavior that is pretty transparent to anyone experienced with the alt right on the internet.

[–] [email protected] 3 points 3 months ago (2 children)

Thank you so much! That was way beyond what I could have hoped.

I'll read the link you provided in a bit, but that does sound really bad. Must suck to work at a company you think is helping people stay private only to have the CEO come out as pro-fascism.

[–] [email protected] 3 points 3 months ago

it is certainly that. but recently it's become very trendy to hate Proton, so it's just easier to do that instead of thinking. I'm really disappointed in this community.

[–] [email protected] 12 points 3 months ago (1 children)

There are many LLMs you can use offline.

[–] [email protected] 119 points 3 months ago (3 children)

DeepSeek is open source, meaning you can modify code on your own app to create an independent — and more secure — version. This has led some to hope that a more privacy-friendly version of DeepSeek could be developed. However, using DeepSeek in its current form — as it exists today, hosted in China — comes with serious risks for anyone concerned about their most sensitive, private information.

Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws, meaning that the Chinese government can demand access at any time.

What???? Whoever wrote this sounds like he has zero understanding of how it works. There is no "more privacy-friendly version" that could be developed: the models are already out, and you can run the entire model 100% locally. That's as privacy-friendly as it gets.

"Any model trained or operated on DeepSeek's servers are still subject to Chinese data laws"

Operated, yes. Trained, no. The model is MIT-licensed; China has nothing on you when you run it yourself. I expect better from a company whose whole business is built on privacy.
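For what it's worth, here's a minimal sketch of what "running it yourself" looks like against a locally hosted Ollama server (assuming Ollama's default endpoint and a pulled model; nothing here touches any remote service):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_local_request(model, prompt):
    # Everything targets localhost: no prompt or response ever leaves
    # the machine, which is exactly the privacy point being made above.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_local_request("deepseek-r1:8b", "Why is the sky blue?")
# With an Ollama server running locally, urllib.request.urlopen(req)
# returns the completion; the answer is in the "response" field of the JSON.
```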

[–] [email protected] 35 points 3 months ago (1 children)

To be fair, most people can't actually self-host Deepseek, but there already are other providers offering API access to it.

[–] [email protected] 32 points 3 months ago (5 children)

There are plenty of step-by-step guides to run Deepseek locally. Hell, someone even had it running on a Raspberry Pi. It seems to be much more efficient than other current alternatives.

That's about as openly available to self-host as you can get without a 1-button installer.

[–] [email protected] 6 points 3 months ago* (last edited 3 months ago) (10 children)

Those are not DeepSeek R1. They are unrelated models, like Llama 3 from Meta or Qwen from Alibaba, "distilled" by DeepSeek.

This is a common method to smarten a smaller model from a larger one.

Ollama should have never labelled them deepseek:8B/32B. Way too many people misunderstood that.

[–] [email protected] 17 points 3 months ago (6 children)

You can run an imitation of the DeepSeek R1 model, but not the actual one unless you literally buy a dozen of whatever NVIDIA’s top GPU is at the moment.
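The arithmetic backs this up. The full DeepSeek-R1 is a 671B-parameter MoE model; even assuming aggressive 4-bit quantization (my assumption here), the weights alone don't fit on any single consumer card:

```python
params = 671e9       # DeepSeek-R1 total parameter count (mixture-of-experts)
bits_per_weight = 4  # assuming aggressive quantization
weights_gb = params * bits_per_weight / 8 / 1e9
print(weights_gb)  # ~335.5 GB of weights alone, vs ~24 GB on a top consumer GPU
```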

[–] [email protected] 24 points 3 months ago

To be fair it's correct, but it's poor writing to skip the self-hosted component. These articles target the company, not the model.

[–] [email protected] 168 points 3 months ago (2 children)

Pretty rich coming from Proton, who shoved an LLM into their mail client mere months ago.

[–] [email protected] 38 points 3 months ago (1 children)

wait, what? How did I miss that? I use protonmail, and I didn't see anything about an LLM in the mail client. Nor have I noticed it when I check my mail. Where/how do I find and disable that shit?

[–] [email protected] 51 points 3 months ago (1 children)
[–] [email protected] 55 points 3 months ago (1 children)

Thank you. I've saved the link and will be disabling it next time I log in. Can't fucking escape this AI/LLM bullshit anywhere.

[–] [email protected] 72 points 3 months ago (7 children)

The combination of AI, crypto wallet and CEO's pro-MAGA comments (all within six months or so!) is why I quit Proton. They've completely lost the plot. I just want a reliable email service and file storage.

[–] [email protected] 23 points 3 months ago (7 children)

I'm considering leaving Proton too. The two things I really care about are SimpleLogin and the VPN with port forwarding. As far as I understand it, Proton is about the last VPN option you can trust with port forwarding.

[–] [email protected] 17 points 3 months ago (1 children)

Happily using AirVPN for port forwarding.

[–] [email protected] 5 points 3 months ago (4 children)

I'm strongly considering switching to them! How do you like it?

[–] [email protected] 6 points 3 months ago (3 children)

The interface - GUI and website - is straight out of 2008 and documentation could be better, but otherwise it works just fine for torrenting and browsing. No complaints there.
