this post was submitted on 13 May 2025
1 point (100.0% liked)

TechTakes

1870 readers
31 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
you are viewing a single comment's thread
[–] [email protected] 0 points 1 week ago (1 children)

yeah nah, it's bad then too actually

[–] [email protected] 0 points 1 week ago (5 children)

How? It's just like googling stuff but less annoying

[–] [email protected] 0 points 1 week ago

Were you too young to use a computer back when Google was good?

[–] [email protected] 0 points 1 week ago

it is not just like googling stuff if it actively fucks up already existing parts of the code

[–] [email protected] 0 points 1 week ago (1 children)

also, fucking ew:

Needs to be put in it’s place like a misbehaving dog, lol

why do AI guys always have weird power fantasies about how they interact with their slop machines

[–] [email protected] 0 points 1 week ago

It’s almost as if they have problematic conceptions (or lack thereof) of exploitation and power dynamics!

[–] [email protected] 0 points 1 week ago

given your posts in this thread, I don’t think I trust your judgement on what less annoying looks like

[–] [email protected] 0 points 1 week ago (3 children)

Google used to return helpful results that answered questions without needing to be corrected, before it started returning AI slop. So maybe that's true now, but only because the search results are the same AI slop as the AI.

For example, results on Stack Overflow generally include some discussion about why a solution addressed the issue, which provides extra context for why you might use it or do something else instead. AI slop just returns a result that may or may not be correct, but it will be presented as a solution without any context.

[–] [email protected] 0 points 1 week ago (1 children)

Google became shit not because of AI but because of SEO.

The enshittification was going on long before OpenAI was even a thing. Remember when we had to add the "reddit" keyword just to make sure we got actual results instead of some badly written bloated text?

[–] [email protected] 0 points 1 week ago (1 children)

Google search became shit when they made the guy in charge of ads also in charge of search.

[–] [email protected] 0 points 1 week ago (1 children)

this is actually the case - it is both well documented (Prabhakar Raghavan, look him up), and the exact mechanics of how they did it were detailed in documents surfaced in one of the lawsuits that Google recently lost (the ones that found them to be a monopoly)

[–] [email protected] 0 points 1 week ago (1 children)

Ackshually, Google became shit when they started posturing as a for-profit entity. Gather round, comrades, let us sing the internationale

[–] [email protected] 0 points 1 week ago (1 children)

ah yes, the borg deep cuts, iykyk

[–] [email protected] 0 points 1 week ago

I bring a sort of biological and technological uniqueness to the collective that the federation doesn’t really like

[–] [email protected] 0 points 1 week ago (1 children)

The funny thing about Stack Overflow is that the vocal detractors have a kernel of truth to their complaints about elitism, but if you interact with them enough you realize they're often the reason the gatekeeping is necessary to keep the quality high.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (1 children)

I used to answer new questions on SO daily a few years back and 50% of all questions are basically unanswerable.

You'd also have the nice September Effect when a semester started and every other question would be someone just copy pasting their homework verbatim and being very surprised we closed it in like a minute.

The thing about that is that literally anyone can answer SO questions. Like try and do that. Pick a language or a tech you're most familiar with, filter that tag and sort by new. Click on every new question. After an hour you'll understand just why most questions have to be closed immediately to keep the site sane.

Whenever I see criticism of SO that's like "oh they'll just close your question for no reason" I can't help but think okay, there's overwhelming chance you're just one of Those and not an innocent casualty of an overeager closer.

[–] [email protected] 0 points 1 week ago

I remember in my OS course we were advised to practice good “netiquette” if we were going to go bother the fine folks on stack overflow. Times have changed

[–] [email protected] 0 points 1 week ago (5 children)

Stack Overflow tended to give you highly specialised examples that wouldn't suit your application. It's easier to just ask an AI to write a simple loop for you whenever you forget a bit of syntax.

[–] [email protected] 0 points 1 week ago

Fun fact, SO is not a place to go to ask for trivial syntax and it's expressly off-topic, because guess what, people answering questions on SO are not your personal fucking google searchers

[–] [email protected] 0 points 1 week ago

wow imagine needing to understand the code you’re dealing with and not just copypasting a bunch of shit around

reading documentation and source code must be an excruciating amount of exercise for your poor brain - it has to even do something! poor thing

[–] [email protected] 0 points 1 week ago

You've inadvertently pointed out the exact problem: LLM approaches can (unreliably) manage boilerplate and basic stuff but fail at anything more advanced, and by handling the basic stuff they give people false confidence that leads to them submitting slop (that gets rejected) to open source projects. LLMs, as the linked pivot-to-ai post explains, aren't even at the level of occasionally making decent open source contributions.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (2 children)

Man I remember Eclipse doing code completion for for loops and other common snippets in like 2005. LLM riders don't even seem to know what tools have been in use for decades, and think using an LLM for these things is somehow revolutionary.

[–] [email protected] 0 points 1 week ago

Forever in my mind: the guy who said on another post that he uses an LLM to convert strings to uppercase, when that's literally a builtin command in VSCode. Give people cannons and they'll start shooting mosquitoes with them every fucking time.

[–] [email protected] 0 points 1 week ago (1 children)

the promptfondlers that make their way into our threads sometimes try to brag about how the LLM is the only way to do basic editor tasks, like wrapping symbols in brackets or diffing logs. it’s incredible every time

[–] [email protected] 0 points 1 week ago (1 children)

yep, I came up with promptfans (as a term for all the weirdos who do free PR and hype work for this shit), and then @skillsissuer came up with promptfondlers for describing those that do this kind of bullshit

(and promptfuckers has become the collective noun I think of for all of them)

[–] [email protected] 0 points 1 week ago (1 children)

I like promptfarmers for the LLM companies and developers. It reflects their attitude of passively hoping that letting their model grow in scale will bring in some future harvest of money.

[–] [email protected] 0 points 1 week ago (1 children)

hmm, I like that!

and then I guess "promptfarmowner" would be saltman etc?

[–] [email protected] 0 points 1 week ago (1 children)

Grain futures salesmen on farms full of plant life (99.5% of which is weeds). ...I don't have a snappy label yet.

[–] [email protected] 0 points 1 week ago

artisanal legumist

[–] [email protected] 0 points 1 week ago

Air so polluted it makes people sick, but it's all worth it because you can't be arsed to remember the syntax of a for loop.