this post was submitted on 22 Jun 2025
777 points (94.4% liked)

Technology


We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

Then retrain on that.

Far too much garbage in any foundation model trained on uncorrected data.

Source.

More Context

Source.

Source.

(page 2) 50 comments
[–] [email protected] 21 points 4 days ago

Yes please do that Elon, please poison grok with garbage until full model collapse.

[–] [email protected] 16 points 4 days ago

Is he still carrying his little human shield around with him everywhere or can someone Luigi this fucker already?

[–] [email protected] 10 points 4 days ago

if you won't tell my truth I'll force you to acknowledge my truth.

nothing says abusive asshole more than this.

[–] [email protected] 22 points 4 days ago

which has advanced reasoning

No it doesn't.

[–] [email protected] 40 points 4 days ago (2 children)

The thing that annoys me most is that there have been studies on LLMs showing that when they are trained on their own output, they produce increasingly noisy output.

Sources (unordered):

Whatever nonsense Muskrat is spewing is factually incorrect. He won't be able to successfully retrain any model on generated content, at least not an LLM, if he wants a successful product. If anything, he will end up producing a model that is heavily trained on censored datasets.
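
To make the degradation argument concrete, here is a minimal toy sketch (my own illustration, not taken from any of the studies above and not how any real LLM is trained): refit a categorical distribution, standing in for a model's token probabilities, on samples drawn from the previous generation's model instead of real data. The vocabulary size, Zipf-shaped distribution, and sample size are all made-up parameters chosen only to make the effect visible quickly.

```python
# Toy illustration of model collapse: each "generation" is a categorical
# distribution fit only to samples drawn from the previous generation.
# Once a rare "token" fails to appear in a sample, its estimated probability
# becomes zero and can never come back, so the tail of the distribution
# erodes generation after generation.
import numpy as np

rng = np.random.default_rng(0)

vocab_size = 50
# Zipf-like "true" distribution: a few common tokens, many rare ones.
true_probs = 1.0 / np.arange(1, vocab_size + 1)
true_probs /= true_probs.sum()

probs = true_probs.copy()
sample_size = 200  # each generation is "trained" on this many sampled tokens

for gen in range(1, 21):
    sample = rng.choice(vocab_size, size=sample_size, p=probs)
    counts = np.bincount(sample, minlength=vocab_size)
    probs = counts / counts.sum()        # maximum-likelihood refit
    surviving = int((probs > 0).sum())   # tokens the new model can still emit
    if gen % 5 == 0:
        print(f"gen {gen:2d}: {surviving}/{vocab_size} tokens survive")
```

On a typical run the number of surviving tokens drops steadily, which is the "increasingly noisy, less diverse output" failure mode in miniature.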

[–] [email protected] 14 points 4 days ago

I remember when I learned what corpus meant too

[–] [email protected] 9 points 4 days ago

I'm just seeing it bake in the lies.

[–] [email protected] 9 points 4 days ago

Unironically Orwellian

[–] [email protected] 8 points 4 days ago
[–] [email protected] 36 points 4 days ago (1 children)

I never would have thought it possible that a person could be so full of themselves as to say something like that.

[–] [email protected] 4 points 4 days ago (1 children)

An interesting thought experiment: I think he's full of shit, you think he's full of himself. Maybe there's a "theory of everything" here somewhere. E = shit squared?

[–] [email protected] 73 points 4 days ago (4 children)

He's been frustrated by the fact that he can't make Wikipedia 'tell the truth' for years. This will be his attempt to replace it.

[–] [email protected] 69 points 4 days ago (1 children)

Elon Musk, like most pseudo-intellectuals, has a very shallow understanding of things. Human knowledge is full of holes, and they cannot simply be filled in through logic, as Mush the dweeb imagines.

[–] [email protected] -1 points 4 days ago (1 children)

Uh, just a thought. Please pardon, I'm not an Elon shill, I just think your argument phrasing is off.

How would you know there are holes in understanding without logic? How would you remedy gaps in human knowledge without applying logic to check whether things are consistent?

[–] [email protected] 14 points 4 days ago* (last edited 4 days ago) (2 children)

You have to have data to apply your logic to.

If it is raining, the sidewalk is wet. Does that mean that if the sidewalk is wet, it must be raining? (There's a small truth-table sketch of this just after this comment.)

There are domains of human knowledge that we will never have data on. There’s no logical way for me to 100% determine what was in Abraham Lincoln’s pockets on the day he was shot.

When you read real academic texts, you’ll notice that there is always the “this suggests that,” “we can speculate that,” etc etc. The real world is not straight math and binary logic. The closest fields to that might be physics and chemistry to a lesser extent, but even then - theoretical physics must be backed by experimentation and data.
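
As a quick, purely illustrative sketch of the rain/sidewalk point above (my own addition, nothing from the thread): enumerate the possible rain/wet combinations and check that assuming "raining implies wet" still leaves a case where the sidewalk is wet and it is not raining.

```python
# Enumerate all rain/wet worlds, keep those consistent with the premise
# "if it is raining, the sidewalk is wet", and test the converse in each.
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

for raining, wet in product([False, True], repeat=2):
    if implies(raining, wet):  # worlds allowed by the premise
        print(f"raining={raining!s:5}  wet={wet!s:5}  "
              f"wet->raining: {implies(wet, raining)}")

# The world raining=False, wet=True survives the premise, and there
# "wet -> raining" is False: a wet sidewalk does not let you conclude it rained.
```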

[–] [email protected] 9 points 4 days ago

Not sure if has been said already. Fuck musk.

[–] [email protected] 100 points 4 days ago (1 children)

[My] translation: "I want to rewrite history to what I want".

[–] [email protected] 23 points 4 days ago (1 children)

That was my first impression, but then it shifted into "I want my AI to be the shittiest of them all".

[–] [email protected] 18 points 4 days ago

Why not both?

[–] [email protected] 12 points 4 days ago

Don't feed the trolls.

[–] [email protected] 8 points 4 days ago

advanced reasoning

If it's so advanced, it should be able to reason out that all human knowledge is standing on the shoulders of others, and how errors have prompted us to explore other areas and learn things we never would have otherwise.

[–] [email protected] 6 points 4 days ago

I'm sure the second Grok in the human centipede will find that very nutritious.

If you use that Grok, you'll be third in the centipede. Enjoy.

[–] [email protected] 3 points 4 days ago

i'll allow it so long as grok acknowledges that musk was made rich from inherited wealth that was created using an apartheid emerald mine.

[–] [email protected] 4 points 4 days ago

Meme image: Elon ducking his own tiny nub.

[–] [email protected] 1 points 4 days ago

You want to have a non-final product write the training data for the next level of bot? Sure, makes sense if you're stupid. Why did all these companies waste time stealing when they could just have one bot make data for the next bot to train on? Infinite data!
