This post was submitted on 29 Jan 2024
439 points (85.0% liked)

Ask Lemmy

A Fediverse community for open-ended, thought-provoking questions

Am I the only one getting agitated by the word AI (Artificial Intelligence)?

Real AI does not exist yet. At the moment we only have LLMs (large language models), which do not think on their own but can pass Turing tests, i.e. fool humans into thinking that they can think.

Imo, AI is just a marketing buzzword created by rich capitalist a-holes who already invested in LLM stocks and are now looking for a profit.

(page 2) 50 comments
[–] [email protected] -1 points 7 months ago (1 children)

Humans possess an esoteric ability to create new ideas out of nowhere, never before thought of. Humans are also capable of inspiration, which may appear similar to the way that AIs remix old inputs into "new" outputs, but the rules of creativity aren't bound by any set parameters the way an LLM is. I'm going to risk making a comment that ages like milk and just spitball: true artificial intelligence that matches a human is impossible.

[–] [email protected] 7 points 7 months ago* (last edited 7 months ago) (1 children)

If I stuck you in a black box, removed every single one of your senses, and took away your ability to memorize things, I don't really think you'd generate new ideas either. Human creativity relies heavily on input from the outside world. LLMs are not human-like intelligence, but they do exhibit pretty amazing emergent behavior. LLMs are more sophisticated than you think. Human-like AI has to be possible unless there is something intrinsically different about the human brain that breaks our current understanding of the world. Barring a "soul", the human brain has to be nothing but calculations taking place in a chemical medium, meaning that human-like AI, or even better, must be achievable.

[–] [email protected] 14 points 7 months ago

Yes, but I'm more annoyed by the posts and conversations about it that are like this one. People on Lemmy swear they hate how uninformed and stupid the average person is when it comes to AI, they hate the clickbait articles, etc. etc. And then there are at least five different posts about it on the front page every. single. day., with all the comments saying exactly the same things they said the day before, which are:

"Users are idiots for trusting a tech company; it's not Google's responsibility to keep your private data safe." "No one understands what 'AI' actually means except me." "Every middle-America dad, grandma and 10-year-old should have their very own self-hosted xyz whatever LLM, and they're morons if they don't and deserve to have their data leaked." And we can't forget the ubiquitous arguments about what "copyright infringement" means, where all the comments are actually in agreement but still just keep repeating themselves over and over.

[–] [email protected] 6 points 7 months ago (1 children)

Just wait for "quantum AI"

[–] [email protected] 2 points 7 months ago

You are not alone; it annoys me to no end, and I keep correcting and explaining things to people who have no clue about how computers and LLMs work.

[–] [email protected] 6 points 7 months ago* (last edited 7 months ago) (1 children)

Don't worry, the hype will die sooner rather than later, just like with cryptocurrencies. What will remain are the power- and resource-hungry statistical models doing nice work in some specific domains, some long faces, and some people having made a bunch of money from it. But yeah, the term also makes me angry; that's why I started referring to them as statistical models.

Am I the only one seeing a parallel between the spectrum planned <-> "free"-market economy and the spectrum classical algorithm <-> statistical model/ML? It seems that some people prefer to have some magic invisible hand handle their problems instead of doing the tough work. I'm not saying that there is no space for both, but we seem to be leaning on the magic side a bit too much lately.

[–] [email protected] 9 points 7 months ago

"real ai" isn't a coherent concept.

Turing test isn't a literal test. It's a rhetorical concept that turing used to underline his logical positivist approach to things like intelligence, consciousness etc.

[–] [email protected] 20 points 7 months ago (1 children)

People: build an algorithm to generate text that sounds like a person wrote it, by finding patterns in text written by people.

Algorithm: outputs text that sounds like a person wrote it.

Holy fuck, it's self-aware, guys.
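
As a toy illustration of that "find patterns in text, then output text" idea, here is a minimal bigram sketch in Python. The corpus is made up for illustration, and real LLMs are transformer networks trained on vastly more data; only the generate-from-observed-patterns principle carries over.

    import random
    from collections import defaultdict

    # Hypothetical toy corpus; any text written by people would do.
    corpus = "the model writes text that sounds like a person wrote the text".split()

    # "Find patterns": record which word tends to follow which.
    follows = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev].append(nxt)

    # "Output text that sounds like a person wrote it": sample plausible next words.
    word, output = "the", ["the"]
    for _ in range(8):
        options = follows.get(word)
        if not options:
            break
        word = random.choice(options)
        output.append(word)

    print(" ".join(output))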

[–] [email protected] 0 points 7 months ago

It's annoying because either all of it should be called AI, or none of it should.

[–] [email protected] 14 points 7 months ago (2 children)

I work in AI, and the fatigue is real.

What I've found most painful is how people with no fucking clue about AI or ML chime in with their expert advice, when in reality they're as much an expert on AI as a calculator salesman is an expert in linear algebra. Having worked closely with scientists who hold PhDs, publish papers regularly, and work on experiments for years, it makes me hate the hustle culture that's built up around AI. It's mostly crypto cunts looking for their next scheme, or businesses looking to abuse buzzwords to make themselves sound smart.

Purely my two cents: LLMs have surprised a lot of people with their high-quality output. With that being said, they are known to heavily hallucinate, cost fuckloads, and there is a growing group of people who wonder whether the great advances we've seen are due to a lot of hand-holding, or to the use of a LOT of PII or stolen data. I don't think we'll see an improvement from what we've already seen, just many other companies having their own similar AI tools that help a little with very well-defined menial tasks.

I think the hype will die out eventually, and companies that decided to bin actual workers in favour of AI will likely not be around 12-24 months later. Hopefully most people and businesses will see through the bullshit, and see that the CEO of a small ad agency that has positioned himself as an AI expert is actually a lying simpleton.

As for whether it's "real AI" or "real ML", who gives a fuck? If researchers are happy with the definition, who are we to be pedantic? Besides, there are a lot of systems behind the scenes running compositional models, handling entity resolution, or building metrics for success/failure criteria to feed back into improving models.

[–] [email protected] 0 points 7 months ago

they are known to heavily hallucinate, cost fuckloads, and there is a growing group of people that wonder whether the great advances we’ve seen are either due to a lot of hand-holding,

Same can be said for certain humans.

[–] [email protected] 2 points 7 months ago

I get agitated by the word AI when it's obviously not using machine learning, or when it's used to shove ChatGPT into something without any reason to use it over just using ChatGPT.

[–] [email protected] 7 points 7 months ago

AI experts in interviews will tell you that something like 99% of the phrasing around AI that people use is fundamentally incorrect, and that corporate management is the worst about it.

[–] [email protected] 2 points 7 months ago

Not at all, I'm pretty excited for AI and AGI. It's the future. Feels like the smartphone era all over again.

[–] [email protected] 17 points 7 months ago (1 children)

Yes, but if they say "AI" then people give them money.

[–] [email protected] 3 points 7 months ago (1 children)

Don't forget machine learning.

[–] [email protected] 6 points 7 months ago (1 children)

Our patented machine learning Blackjack artificial intelligence knows exactly when to stick and when to draw!

The algorithm:

while hand.value < 17 do DrawCard();
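
For the curious, here is a minimal runnable sketch of that "patented AI" in Python. The card values and the simplified ace handling are assumptions for illustration; only the hit-below-17 rule comes from the one-liner above.

    import random

    def draw_card():
        # Hypothetical simplification: 10, J, Q and K are all worth 10,
        # and the ace is always counted as 11 here.
        return random.choice([2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11])

    # The entire "machine learning" algorithm: draw until the hand reaches 17.
    hand_value = draw_card() + draw_card()
    while hand_value < 17:
        hand_value += draw_card()

    print("bust" if hand_value > 21 else "stand", "at", hand_value)
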
[–] [email protected] 1 points 7 months ago

Hotdog. Not hotdog.

[–] [email protected] 11 points 7 months ago* (last edited 7 months ago)

To be fair, it's still AI. If I remember correctly what I learned at uni, LLMs fall into the category of what we call expert systems. We could call them that, but then again LLMs did not exist back then, and most of the public does not know all these techno mumbo-jumbo words. So here we are: AI it is.

[–] [email protected] 18 points 7 months ago

I'm pissed that large corps are working hard on propaganda saying that LLMs and copyright theft are good as long as they're the ones doing it.
