this post was submitted on 24 Jan 2024
21 points (100.0% liked)


Who would've thought? This isn’t going to fly with the EU.

Article 5.3 of the Digital Markets Act (DMA): "The gatekeeper shall not prevent business users from offering the same products or services to end users through third-party online intermediation services or through their own direct online sales channel at prices or conditions that are different from those offered through the online intermediation services of the gatekeeper."

Friendly reminder that you can sideload apps without jailbreaking or paying for a dev account by using TrollStore, which exploits CoreTrust bugs to bypass/spoof app signature validation, on an iPhone XR or newer running iOS 14.0 through 16.6.1 (any iOS version on iPhone X and older).

Install guide: TrollStore
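
For illustration only, here is a minimal Swift sketch (not from the original post) of the compatibility window described above. The device and iOS ranges come straight from the comment, and `supportsTrollStore` is a hypothetical helper, not part of TrollStore itself; actual eligibility depends on the exact device, build and install method, so follow the install guide.

```swift
import Foundation

/// Hypothetical helper mirroring the window described in the comment:
/// iPhone XR or newer on iOS 14.0 through 16.6.1, and any iOS version
/// on iPhone X and older.
func supportsTrollStore(deviceIsXROrNewer: Bool,
                        iosVersion: OperatingSystemVersion) -> Bool {
    // iPhone X and older: the comment says any iOS version works.
    guard deviceIsXROrNewer else { return true }

    // iPhone XR and newer: only iOS 14.0 up to 16.6.1.
    let lower = OperatingSystemVersion(majorVersion: 14, minorVersion: 0, patchVersion: 0)
    let upper = OperatingSystemVersion(majorVersion: 16, minorVersion: 6, patchVersion: 1)
    return !isLower(iosVersion, than: lower) && !isLower(upper, than: iosVersion)
}

/// Lexicographic comparison of (major, minor, patch) version tuples.
func isLower(_ a: OperatingSystemVersion, than b: OperatingSystemVersion) -> Bool {
    (a.majorVersion, a.minorVersion, a.patchVersion)
        < (b.majorVersion, b.minorVersion, b.patchVersion)
}

// Example: an iPhone 13 on iOS 16.5 falls inside the window.
let version = OperatingSystemVersion(majorVersion: 16, minorVersion: 5, patchVersion: 0)
print(supportsTrollStore(deviceIsXROrNewer: true, iosVersion: version))  // true
```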

[–] [email protected] -2 points 7 months ago* (last edited 7 months ago) (39 children)

iOS is the worst operating system I have ever used

Why do people buy it?

[–] [email protected] 0 points 7 months ago (1 children)

Privacy and security mostly, I would imagine

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

Closed source software can't be audited, so it can't be secure. And if software isn't secure, exploits strip it of any privacy.

See: The bimonthly remote takeover bugs that keep getting found. Like this one: https://citizenlab.ca/2023/09/blastpass-nso-group-iphone-zero-click-zero-day-exploit-captured-in-the-wild/

"Oh whoopsy doopsy, looks like your iPhone, camera, files, GPS and more were accessible to someone who sent you an iMessage.. for the third time this year"

[–] [email protected] 0 points 7 months ago (1 children)

Closed source software can't be audited, so it can't be secure

That’s the biggest load of bullshit I’ve ever heard.

Closed source software is audited all the time.

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

OK, let me rephrase: nobody without a conflict of interest can audit a closed source application. If Microsoft paid for an audit of Windows, that doesn't tell you anything about whether or not Windows is backdoored.

[–] [email protected] 0 points 7 months ago (1 children)

The audit is not for you. Closed source software is audited all the time, but the results of those audits are generally confidential. This is about finding security bugs, not deliberate backdoors.

The key question here is who you trust. Sure, open source can be audited by anyone, but is it? You can't audit all the code you use yourself; even if you have the skills, it's simply too much. So you still need to trust another person or company, which really doesn't change the equation that much.

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

In practice, most common open source software is used and contributed to by hundreds of people. So it naturally does get audited by that process. Closed source software can't be confirmed not to be malicious, so it can't be confirmed to be secure, which brings me back to my original point: it can't be private.

I didn't go into that much detail in my original comment, but that's what I meant when I wrote it. As for "does everyone audit the software they use", the answer is obviously no. But the software I use is mostly FOSS, contributed to by dozens of users, sometimes including myself. So when alarms are rung over even the smallest things, you get a better idea of the attack vectors and privacy implications.

[–] [email protected] 0 points 7 months ago (1 children)

In practice, most common open source software is used and contributed to by hundreds of people. So it naturally does get audited by that process.

Just working on software is not the same as actively looking for exploits. Software security auditing requires a specialised set of skills. Open source also makes it easier for black-hat hackers to find exploits.

Hundreds of people working on something is a double-edged sword: it also makes it easier for someone to sneak in an exploit. A single-character mistake in code can cause an exploitable bug, and if you are intent on deliberately introducing such an issue, it can be very hard to spot, and even if caught it can be explained away as an honest-to-god mistake.
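
To make that concrete, here is a hypothetical Swift sketch (not from the thread) of how a one-character slip becomes a security bug: `copyField` is made up for the example, and the only difference from the intended code is `...` instead of `..<`.

```swift
// Hypothetical example of a single-character mistake with security impact.
// Intended: for i in 0..<length   (reads indices 0 through length-1)
// Actual:   for i in 0...length   (one extra '.', one extra byte)
func copyField(from buffer: UnsafeBufferPointer<UInt8>, length: Int) -> [UInt8] {
    var out: [UInt8] = []
    for i in 0...length {
        // Unsafe buffers do no bounds checking in release builds, so the
        // final iteration can read a byte of adjacent memory: an
        // out-of-bounds read that is easy to miss in review.
        out.append(buffer[i])
    }
    return out
}
```

In a review diff, that extra dot is easy to gloss over or wave off as a typo, which is exactly the point being made above.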

By contrast, lots of software companies screen their employees, especially if they are working on critical code.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

I don't know if you really believe what you're saying, but I'll keep answering anyway. I worked at Manulife, the largest private insurance company in Canada. Setting aside the fact that our security team was mostly focused on pen testing (which, as you know, in contrast to audits, tells you nothing about whether a system is secure), the audits were infrequent and limited in scope. Most corporations don't even do audits (and they hire the cheapest engineers to do the job), and as a consumer, there's no easy way to tell which audits covered the security aspects you care about.

If you want to talk more about the security of open source, beyond what's already mentioned above: not only are Google, Canonical and Red Hat growing their open source security teams (together employing close to 1,000 people whose job is to audit and patch popular open source software), but open source projects can likewise pay for audits themselves (see Mullvad or Monero as examples).

I will concede that it is possible for proprietary software to be secure. But in practice it usually isn't, and it's too hard to tell. It's certainly not secure compared to similar open source offerings.
