"Comment whose upvotes all come from programming dot justworks dot dev dot infosec dot works" sure has become a genre of comment.
Federating is a vector of disease
And honey, I'm patient zero 😎
fucking weird comment to post.
😎
Comments coming from .dev should default to comic sans.
holy shit this sounds like an amazing plugin
How can you tell who upvotes a comment?
instance admins have a button for it
"I can safely bet that by 'all upvotes come from programming dot justworks dot dev dot infosec dot works' you actually mean 'a vast majority of upvotes come from these tech instances' even before reading your comment."
"Or in other words I correctly interpreted what you meant but apparently the way you said it is a problem because I prefer to blame users rather than peddlers."
As a fellow Interesting Wedding Haver, I have to give all the credit in the world to the author for handling this with grace instead of, say, becoming a terrorist. I would have been proud to own the "Tracy did nothing wrong" t-shirt.
Credit to her for making the best of a bad situation. "We almost couldn't get legally married, so we had to bring in Elvis to officiate the paperwork after the ceremony" is going to be a top-tier wedding story for every party going forward.
good thing Elvis is everywhere
As long as music is alive, so is The King. A-thank you. Thank you very much.
I can make a safe assumption before reading the article that ChatGPT didn't ruin the wedding, but rather somebody that was using ChatGPT ruined the wedding.
"ChatGPT is good, but only if no one in a position of authority uses it"
Cool.
"This hammer can't plan a wedding. Hammers are useless."
almost all of your posts are exactly this worthless and exhausting and that’s fucking incredible
I get the feeling you're exactly the kind of person who shouldn't have a proompt, much less a hammer
no absolutely, I shouldn’t ever “have a proompt”, whatever the fuck that means
the promptfondlers really aren’t alright now that public opinion’s against the horseshit tech they love
istg these people seem to roll "b-b-b-but <saltman|musk|sundar|....> gifted this technology to me personally, how could I possibly look this gift horse in the mouth" on the inside of their heads
The worst kind of golem
(nevermind them doing the equivalent of rolling into someone else's lounge then stripping down and getting comfortable on the couch, without asking)
why would you say something so inane my god
do you think they ever got round to reading the article, or were they spent after coming up with “hmmmm I bet chatgpt didn’t somehow prompt itself” as if that were a mystery that needed solving
I had to take a nap after my profound thoughts.
wankery will do that to a man
"blame the person, not the tools" doesn't work when the tool's marketing team is explicitly touting said tool as a panacea for all problems. on the micro scale, sure, the wedding planner is at fault, but if you zoom out even a tiny bit it's pretty obvious what enabled them to fuck up for as long as they did
what if the person in question is also a tool
oh gods they're multiplying
Yea yea words.
Trust but verify.
Here's a better idea - treat anything from ChatGPT as a lie, even if it offers sources
I think we should require professionals to disclose whether or not they use AI.
Imagine you're an author and you pay an editor $3000 and all they do is run your manuscript through ChatGPT. One, they didn't provide any value because you could have done the same thing for free; and two, if they didn't disclose the use of AI, you wouldn't even know your novel had been fed into one and might be used by the AI for training.
I think we should require professionals not to use the thing currently termed AI.
Or if you think it's unreasonable to ask them not to contribute to a frivolous and destructive fad, or don't think the environmental or social impacts are bad enough to implement a ban like this, then at least maybe we should require professionals not to use LLMs for technical information.
But the article author wasn’t interfacing with chatgpt, she was interfacing with a human paid to help with the things the article author did not know. The wedding planner was a supposed expert in this interaction, but instead simply sent back regurgitated chatgpt slop.
Is this the fault of the wedding planner? Yes. Is it the fault of chatgpt? Also yes.
Scams are LLMs' best use case.
They're not capable of actual intelligence or providing anything that would remotely mislead a subject matter expert. You're not going to convince a skilled software developer that your LLM slop is competent code.
But they're damn good at looking the part to convince people who don't know the subject that they're real.
what does this have to do with the article
ChatGPT didn't nearly destroy her wedding, her lousy wedding planner did. Also whats she got against capital letters?
*What's
For the sin of making a grammar error in a post criticizing grammar you must now do ten push-ups.
Yea yea guns don't kill people, bullet impacts kill people. Dishonesty and incompetence are nothing new, but you may note that the wedding planner's unfounded confidence in ChatGPT exacerbated the problem in a novel way. Why did the planner trust the bogus information about Vegas wedding officiants? Is someone maybe presenting these LLM bots as an appropriate tool for looking up such information?
Bullet impacts don't kill people, tissue deorganization and fluid loss kill people!
let’s not
It's clearly a joke walking the first phrase they said a bit further, to an absurdly literal point.
yes, let's not
Hilarious
you couldn’t fucking help yourself could you
fuckoff
Yes, even some influential people at my employer have started to peddle the idea that only “old-fashioned” people are still using Google, while all the forward-thinking people are prompting an AI. For this reason alone, I think that negative examples like this one deserve a lot more attention.