this post was submitted on 27 Feb 2025
983 points (96.8% liked)

Technology


Per one tech forum this week: "Google has quietly installed an app on all Android devices called 'Android System SafetyCore'. It claims to be a 'security' application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more, making this application 'spyware' and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to 'Settings' > 'Apps', then delete the application."

[–] [email protected] 40 points 12 hours ago (5 children)
[–] [email protected] -4 points 9 hours ago* (last edited 8 hours ago)

The GrapheneOS folks have a real love for the word misinformation (and FUD, and brigading). That's not you under there👻, Daniel, is it?

After 5 years of his ~~antics~~ hateful bullshit lies, I think I can genuinely say that word triggers me.

[–] [email protected] 8 points 9 hours ago

If the app did what OP is claiming, then the EU would have a field day fining Google.

[–] [email protected] 27 points 10 hours ago (3 children)

To quote the most salient post:

The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

Which is a sorely needed feature to tackle problems like SMS scams
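The on-device pattern described in the quote can be sketched roughly like this (a hypothetical illustration in Python; SafetyCore's actual API is not public, and the function names and keyword list here are invented):

```python
# Hypothetical sketch of the on-device pattern described above: a local
# classifier labels content entirely on the device, so the text itself
# never leaves the phone. Names and markers are illustrative, not
# SafetyCore's real API or model.

SCAM_MARKERS = {"verify your account", "wire transfer", "gift card", "urgent"}

def classify_locally(message: str) -> str:
    """Return a label without making any network call."""
    text = message.lower()
    hits = sum(marker in text for marker in SCAM_MARKERS)
    return "likely_scam" if hits >= 1 else "ok"

print(classify_locally("URGENT: buy a gift card and send the code"))  # likely_scam
print(classify_locally("see you at lunch"))                           # ok
```

The app calling this would only receive the label ("likely_scam") and could show a warning, without the message content being shared with any service.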

[–] [email protected] 0 points 4 hours ago (1 children)

You don't need advanced scanning technology running on every device, with access to every single bit of data you've ever seen, to detect scams. You need telco operators to stop forwarding forged message headers and… that's it. Cheap, efficient, and zero risk of privacy invasion through a piece of software you did not need but that was put there "for your own good".

[–] [email protected] 4 points 2 hours ago

I will perhaps be nitpicking, but... not exactly, not always. People get their shit hacked all the time due to poor practices. And then those hacked accounts can send all the emails, texts, and other spam they want, and none of it will have forged headers, so you still need spam filtering.
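A toy illustration of this point (all names and numbers invented): a message from a compromised but genuine account passes any sender/header check, so only a content check can catch it:

```python
# Illustrative only: a message sent from a hacked but legitimate account
# passes any header/sender verification, so content-based filtering is
# still required. The contact list and message are invented.

TRUSTED_SENDERS = {"+15551234567"}  # hypothetical contact list

def header_check(sender: str) -> bool:
    """Passes if the sender ID is genuine (it is -- the account was hacked)."""
    return sender in TRUSTED_SENDERS

def content_check(body: str) -> bool:
    """Fails on scam-like content regardless of who sent it."""
    return "gift card" not in body.lower()

msg = {"sender": "+15551234567", "body": "I'm stuck abroad, send gift cards!"}
print(header_check(msg["sender"]))  # True: the sender is not forged
print(content_check(msg["body"]))   # False: the content is still a scam
```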

[–] [email protected] 7 points 8 hours ago (1 children)

Why do you need machine learning for detecting scams?

Is someone in 2025 trying to help you out of the goodness of their heart? No. Move on.

[–] [email protected] 4 points 7 hours ago (1 children)

If you want to talk money, then it is in businesses' best interest that their users' money is spent on their products, not scammed away through the use of their products.

Secondly, machine learning algorithms can detect patterns in ways a human can't. In some circles I've read that even the programmers can't decipher from the code how the end result is produced, just that the inputs guide it. And while scammers can circumvent any carefully laid-down antispam, antiscam, or antivirus rules in traditional software, a learning algorithm will be orders of magnitude harder to bypass. Or easier. It depends on the algorithm.
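As a sketch of what "the inputs guide it" means in the simplest case, here is a tiny logistic-regression text classifier in pure Python whose weights come from (invented) training data rather than hand-written rules:

```python
# Minimal learned classifier: logistic regression over word presence,
# trained by gradient descent. The weights are learned from the (invented)
# examples below, not written by a programmer -- the simplest instance of
# "the inputs guide the result".

import math

train = [("win a free prize now", 1), ("claim your free gift card", 1),
         ("lunch at noon tomorrow", 0), ("meeting moved to friday", 0)]

vocab = sorted({w for text, _ in train for w in text.split()})

def predict(text, weights, bias):
    """Probability that `text` is spam, given learned weights."""
    z = bias + sum(weights.get(w, 0.0) for w in text.split())
    return 1 / (1 + math.exp(-z))

weights = {w: 0.0 for w in vocab}
bias = 0.0
for _ in range(200):  # gradient descent on logistic loss
    for text, label in train:
        err = predict(text, weights, bias) - label
        bias -= 0.5 * err
        for w in text.split():
            weights[w] -= 0.5 * err

print(predict("free prize inside", weights, bias) > 0.5)  # True: spam-like words score high
```

A real model is vastly larger, which is exactly why its learned weights are no longer human-readable the way a rule list is.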

[–] [email protected] 1 points 6 hours ago

I don't know the point of the first paragraph...scams are bad? Yes? Does anyone not agree? (I guess scammers)

For the second, we are talking in the wild abstract, so I feel comfortable pointing out that every automated system humanity has come up with so far has absorbed our own biases, and since AI models are trained by us, this should be no different. Also, if the models are fallible, you cannot talk about success without talking about false positives. I don't care if it blocks every scammer out there if it also blocks a message from my doctor. Until we have data on how well these new algorithms agree with the desired outcomes, it's pointless to claim they are better at X.
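The false-positive point can be made concrete with toy numbers (invented purely for illustration): a filter that catches 99% of scams can still block real messages, and both rates matter:

```python
# Toy confusion-matrix numbers, invented for illustration: high recall on
# scams does not by itself mean the filter is acceptable.
blocked_scams = 990   # true positives
missed_scams = 10     # false negatives
blocked_legit = 50    # false positives (e.g. a doctor's message)
passed_legit = 8950   # true negatives

recall = blocked_scams / (blocked_scams + missed_scams)
false_positive_rate = blocked_legit / (blocked_legit + passed_legit)
print(f"recall={recall:.3f}, FPR={false_positive_rate:.4f}")
# recall=0.990, FPR=0.0056  -> 50 legitimate messages never arrived
```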

[–] [email protected] 6 points 9 hours ago (1 children)

If the cellular carriers were forced to verify that caller ID (or its SMS equivalent) was accurate, SMS scams would disappear (or at least be weakened). Google shouldn't have to do the carriers' job, and if they want to implement this anyway, they should let the user choose which service performs the task, similar to how they let the user choose which "Android System WebView" is used.
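For voice calls, carriers already have a framework for this idea: STIR/SHAKEN, which uses PKI certificates to attest that the caller ID was not spoofed. A much-simplified sketch of sender attestation, using an HMAC instead of certificates purely for illustration (the key and numbers are invented):

```python
# Simplified sketch of carrier-side sender attestation. The real system
# for voice calls (STIR/SHAKEN) uses signed PKI tokens; a shared-key HMAC
# is used here only to illustrate the verify-before-forward idea.

import hashlib
import hmac

CARRIER_KEY = b"shared-secret"  # hypothetical key held by carriers

def sign_sender(sender_id: str) -> str:
    """Originating carrier attests the sender ID it verified."""
    return hmac.new(CARRIER_KEY, sender_id.encode(), hashlib.sha256).hexdigest()

def verify_sender(sender_id: str, signature: str) -> bool:
    """Receiving carrier checks the attestation before forwarding."""
    return hmac.compare_digest(sign_sender(sender_id), signature)

sig = sign_sender("+15551234567")
print(verify_sender("+15551234567", sig))  # True: attested ID passes
print(verify_sender("+19998887777", sig))  # False: a spoofed ID fails
```

Messages whose sender ID fails verification would simply never be forwarded, which is the commenter's point: the fix sits in the network, not on the handset.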

[–] [email protected] 4 points 7 hours ago

Carriers don't care. They are selling your data. They don't care how it's used. Google is selling you a phone. Apple held the market for a long time by being the phone with some of the best security. As an Android user, that makes me want to switch phones, not carriers.

[–] [email protected] 4 points 12 hours ago (1 children)

So is this really just a local AI model? Or is it something bigger? My S25 Ultra has the app but it hasn't used any battery or data.

[–] [email protected] 1 points 9 hours ago (1 children)

I mean, the GrapheneOS devs say it is. Are they going to lie?

[–] [email protected] 4 points 8 hours ago

Yes, absolutely, and regularly, and without shame.

But not usually about technical stuff.