this post was submitted on 23 May 2024
705 points (99.0% liked)

Technology

59137 readers
2278 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
MODERATORS
[–] [email protected] 129 points 5 months ago* (last edited 5 months ago) (5 children)

If you're here because of the AI headline, this is important to read.

We’re looking at how we can use local, on-device AI models -- i.e., more private -- to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities.

They are implementing AI how it should be. Don't let all the shitty companies blind you to the fact that what we call AI has positive sides.
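For anyone curious what "AI-generated alt-text" plumbing looks like in the abstract, here is a minimal sketch of the flow. This is purely illustrative and not Mozilla's code: the captioning step is stubbed out, where a real build would run a small on-device captioning model so no image data ever leaves the machine.

```python
# Illustrative sketch of a local alt-text pipeline. The model call is a
# stub; a real implementation would run on-device inference here.
import html


def caption_image(image_bytes: bytes) -> str:
    """Stand-in for a local captioning model -- no data leaves the machine."""
    # Real code would feed image_bytes to a small vision-language model.
    return "a brown dog lying on a wooden porch"


def alt_text_for(image_bytes: bytes, max_len: int = 125) -> str:
    """Produce screen-reader-friendly alt text from a local caption."""
    caption = caption_image(image_bytes).strip()
    # Many screen readers handle short alt text best, so truncate,
    # and escape so the caption is safe to embed as an attribute.
    return html.escape(caption[:max_len])


print(alt_text_for(b"\x89PNG..."))  # → a brown dog lying on a wooden porch
```

The interesting design point is exactly the one the comment makes: because both steps run locally, the privacy story is trivially clean.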

[–] [email protected] 13 points 5 months ago (1 children)

They are implementing AI how it should be.

The term is so overused and abused that I'm not clear what they're even promising. Are they localizing an LLM? Are they providing some kind of very fancy macroing? Are they linking up with ChatGPT somehow or integrating with Co-pilot? There's no way to tell from the verbiage.

And that's not even really Mozilla's fault. It's just how the term AI can mean anything from "overhyped JavaScript" to "multi-billion-dollar datacenter full of fake Scarlett Johansson voice patterns".

[–] [email protected] 15 points 5 months ago (1 children)

there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Co-pilot since those models are enormous. AI generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
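The size gap the comment describes comes down to simple arithmetic: weight memory is roughly parameter count times bits per weight. The numbers below are illustrative examples, not anything Firefox has announced.

```python
# Back-of-the-envelope memory math for running a language model locally.
# Parameter counts and quantization widths here are illustrative.
def model_memory_mb(params: float, bits_per_weight: int) -> float:
    """Approximate RAM needed just to hold the model weights, in MiB."""
    return params * bits_per_weight / 8 / (1024 ** 2)


# A small ~250M-parameter model quantized to 4 bits: ~119 MB, plausible
# on-device. A 175B-parameter model in 16-bit needs ~326 GB, which is
# why "local" rules out ChatGPT-class models.
small = model_memory_mb(250e6, 4)
large = model_memory_mb(175e9, 16)
print(f"{small:.0f} MB vs {large / 1024:.0f} GB")  # → 119 MB vs 326 GB
```

Note this counts only the weights; activations and KV cache add more on top, so the real on-device budget is tighter still.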

[–] [email protected] -1 points 5 months ago (3 children)

AI generally means machine learned neural networks these days

Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural-net optimization. I can't imagine needing that kind of configuration for my internet browser.

not sure how they’re going to handle low-resource machines

One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn't any better. Do I really want Firefox chewing hundreds of MB of memory so it can... what? Simulate a 600-processor cluster doing weird finger art?

[–] [email protected] 8 points 5 months ago (4 children)
[–] [email protected] 7 points 5 months ago

Vertical tabs will be neat for my ultra-wide!

[–] [email protected] 10 points 5 months ago (2 children)

Building back to that 2005 standard feature set.

[–] [email protected] 38 points 5 months ago (3 children)

Local AI sounds nice. One reason I'm cynical about the current state of AI is how many of these products send all your data to another company.

[–] [email protected] 4 points 5 months ago* (last edited 5 months ago)

Eh, I don't particularly care too much either way. It seems to be solving problems with the 80/20 approach: 80% of the benefit for 20% of the effort. However, getting that last 20% is probably way more difficult than just building purpose-built solutions from the start.

So I'm guessing we'll see a lot more "decent but not quite there" products, and they'll never "get there."

So it might be fun to play with, but it's not something I'm interested in using day-to-day. Then again, maybe I'm completely wrong and it's the best thing since sliced bread, but as someone who has worked on very basic NLP projects in the past (distantly related to modern LLMs), I just find it hard to look past the limitations.
