this post was submitted on 12 Oct 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] [email protected] 0 points 2 months ago (4 children)

"Comment whose upvotes all come from programming dot justworks dot dev dot infosec dot works" sure has become a genre of comment.

[–] [email protected] 0 points 2 months ago (1 children)

Federating is a vector of disease

[–] [email protected] 0 points 2 months ago (1 children)

And honey, I'm patient zero 😎

[–] [email protected] 0 points 2 months ago (1 children)

fucking weird comment to post.

[–] [email protected] 0 points 2 months ago (1 children)

Comments coming from .dev should default to comic sans.

[–] [email protected] 0 points 2 months ago

holy shit this sounds like an amazing plugin
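
If anyone did want to build that plugin, a browser userscript would probably be enough. Below is a minimal TypeScript sketch, assuming (hypothetically) that each comment sits in a ".comment" container with an author link whose href carries the poster's instance domain; both selectors are guesses for illustration, not Lemmy's actual markup.

```typescript
// Userscript sketch: render comments from ".dev" instances in Comic Sans.
// NOTE: these selectors are hypothetical placeholders, not Lemmy's real DOM.
const COMMENT_SELECTOR = ".comment";          // assumed container for one comment
const AUTHOR_LINK_SELECTOR = "a.author-link"; // assumed link pointing at the author's instance

function comicSansTheDevs(): void {
  document.querySelectorAll<HTMLElement>(COMMENT_SELECTOR).forEach((comment) => {
    const author = comment.querySelector<HTMLAnchorElement>(AUTHOR_LINK_SELECTOR);
    if (!author) return;
    try {
      // Check which instance the author's profile link points at.
      const host = new URL(author.href).hostname;
      if (host.endsWith(".dev")) {
        comment.style.fontFamily = '"Comic Sans MS", "Comic Neue", cursive';
      }
    } catch {
      // Ignore malformed hrefs rather than breaking the page.
    }
  });
}

comicSansTheDevs();
```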

[–] [email protected] 0 points 2 months ago (1 children)

How can you tell who upvotes a comment?

[–] [email protected] 0 points 2 months ago

instance admins have a button for it

[–] [email protected] 0 points 2 months ago

"I can safely bet that by 'all upvotes come from programming dot justworks dot dev dot infosec dot works' you actually mean 'a vast majority of upvotes come from these tech instances' even before reading your comment."

"Or in other words I correctly interpreted what you meant but apparently the way you said it is a problem because I prefer to blame users rather than peddlers."

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

As a fellow Interesting Wedding Haver, I have to give all the credit in the world to the author for handling this with grace instead of, say, becoming a terrorist. I would have been proud to own the "Tracy did nothing wrong" tshirt.

[–] [email protected] 0 points 2 months ago (1 children)

Credit to her for making the best of a bad situation. "We almost couldn't get legally married, so we had to bring in Elvis to officiate the paperwork after the ceremony" is going to be a top-tier wedding story for every party going forward.

[–] [email protected] 0 points 2 months ago (1 children)

good thing Elvis is everywhere

[–] [email protected] 0 points 2 months ago

As long as music is alive, so is The King. A-thank you. Thank you very much.

[–] [email protected] 0 points 2 months ago (3 children)

I can make a safe assumption before reading the article that ChatGPT didn't ruin the wedding, but rather somebody that was using ChatGPT ruined the wedding.

[–] [email protected] 0 points 2 months ago (1 children)

"ChatGPT is good, but only if no one in a position of authority uses it"

Cool.

[–] [email protected] 0 points 2 months ago (1 children)

"This hammer can't plan a wedding. Hammers are useless."

[–] [email protected] 0 points 2 months ago (1 children)

almost all of your posts are exactly this worthless and exhausting and that’s fucking incredible

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

I get the feeling you're exactly the kind of person who shouldn't have a proompt, much less a hammer

[–] [email protected] 0 points 2 months ago (1 children)

no absolutely, I shouldn’t ever “have a proompt”, whatever the fuck that means

the promptfondlers really aren’t alright now that public opinion’s against the horseshit tech they love

[–] [email protected] 0 points 2 months ago (2 children)

istg these people seem to roll "b-b-b-but <saltman|musk|sundar|....> gifted this technology to me personally, how could I possibly look this gift horse in the mouth" on the inside of their heads

[–] [email protected] 0 points 2 months ago

The worst kind of golem

[–] [email protected] 0 points 2 months ago

(nevermind them doing the equivalent of rolling into someone else's lounge then stripping down and getting comfortable on the couch, without asking)

[–] [email protected] 0 points 2 months ago (1 children)

why would you say something so inane my god

[–] [email protected] 0 points 2 months ago (1 children)

do you think they ever got round to reading the article, or were they spent after coming up with “hmmmm I bet chatgpt didn’t somehow prompt itself” as if that were a mystery that needed solving

[–] [email protected] 0 points 2 months ago (1 children)

I had to take a nap after my profound thoughts.

[–] [email protected] 0 points 2 months ago

wankery will do that to a man

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

"blame the person, not the tools" doesn't work when the tool's marketing team is explicitly touting said tool as a panacea for all problems. on the micro scale, sure, the wedding planner is at fault, but if you zoom out even a tiny bit it's pretty obvious what enabled them to fuck up for as long as they did

[–] [email protected] 0 points 2 months ago (1 children)

what if the person in question is also a tool

[–] [email protected] 0 points 2 months ago

oh gods they're multiplying

[–] [email protected] 0 points 2 months ago (2 children)

Yea yea words.

Trust but verify.

[–] [email protected] 0 points 2 months ago (2 children)

Here's a better idea - treat anything from ChatGPT as a lie, even if it offers sources

[–] [email protected] 0 points 2 months ago (1 children)

I think we should require professionals to disclose whether or not they use AI.

Imagine you're an author and you pay an editor $3000 and all they do is run your manuscript through ChatGPT. One, they didn't provide any value because you could have done the same thing for free; and two, if they didn't disclose the use of AI, you wouldn't even know your novel had been fed into one and might be used by the AI for training.

[–] [email protected] 0 points 2 months ago

I think we should require professionals not to use the thing currently termed AI.

Or if you think it's unreasonable to ask them not to contribute to a frivolous and destructive fad, or don't think the environmental or social impacts are bad enough to implement a ban like this, at least maybe we should require professionals not to use LLMs for technical information.

[–] [email protected] 0 points 2 months ago (1 children)

But the article author wasn’t interfacing with chatgpt; she was interfacing with a human paid to help with the things she did not know. The wedding planner was a supposed expert in this interaction, but instead simply sent back regurgitated chatgpt slop.

Is this the fault of the wedding planner? Yes. Is it the fault of chatgpt? Also yes.

[–] [email protected] 0 points 2 months ago

Scams are LLMs' best use case.

They're not capable of actual intelligence or providing anything that would remotely mislead a subject matter expert. You're not going to convince a skilled software developer that your LLM slop is competent code.

But they're damn good at looking the part to convince people who don't know the subject that they're real.

[–] [email protected] 0 points 2 months ago

what does this have to do with the article

[–] [email protected] 0 points 2 months ago (2 children)

ChatGPT didn't nearly destroy her wedding, her lousy wedding planner did. Also whats she got against capital letters?

[–] [email protected] 0 points 2 months ago

*What's

For the sin of making a grammar error in a post criticizing grammar you must now do ten push-ups.

[–] [email protected] 0 points 2 months ago (2 children)

Yea yea guns don't kill people, bullet impacts kill people. Dishonesty and incompetence are nothing new, but you may note that the wedding planner's unfounded confidence in ChatGPT exacerbated the problem in a novel way. Why did the planner trust the bogus information about Vegas wedding officiants? Is someone maybe presenting these LLM bots as an appropriate tool for looking up such information?

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

Bullet impacts don't kill people, tissue deorganization and fluid loss kill people!

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

It's clearly a joke taking the first phrase they said a bit further, to an absurdly literal point.

[–] [email protected] 0 points 2 months ago

you couldn’t fucking help yourself could you

[–] [email protected] 0 points 2 months ago

Yes, even some influential people at my employer have started to peddle the idea that only “old-fashioned” people are still using Google, while all the forward-thinking people are prompting an AI. For this reason alone, I think that negative examples like this one deserve a lot more attention.